Oct  2 06:46:48 np0005466012 kernel: Linux version 5.14.0-620.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025
Oct  2 06:46:48 np0005466012 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct  2 06:46:48 np0005466012 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  2 06:46:48 np0005466012 kernel: BIOS-provided physical RAM map:
Oct  2 06:46:48 np0005466012 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct  2 06:46:48 np0005466012 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct  2 06:46:48 np0005466012 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct  2 06:46:48 np0005466012 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct  2 06:46:48 np0005466012 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct  2 06:46:48 np0005466012 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct  2 06:46:48 np0005466012 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct  2 06:46:48 np0005466012 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct  2 06:46:48 np0005466012 kernel: NX (Execute Disable) protection: active
Oct  2 06:46:48 np0005466012 kernel: APIC: Static calls initialized
Oct  2 06:46:48 np0005466012 kernel: SMBIOS 2.8 present.
Oct  2 06:46:48 np0005466012 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct  2 06:46:48 np0005466012 kernel: Hypervisor detected: KVM
Oct  2 06:46:48 np0005466012 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct  2 06:46:48 np0005466012 kernel: kvm-clock: using sched offset of 6421266442 cycles
Oct  2 06:46:48 np0005466012 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct  2 06:46:48 np0005466012 kernel: tsc: Detected 2799.998 MHz processor
Oct  2 06:46:48 np0005466012 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct  2 06:46:48 np0005466012 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct  2 06:46:48 np0005466012 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct  2 06:46:48 np0005466012 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct  2 06:46:48 np0005466012 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct  2 06:46:48 np0005466012 kernel: Using GB pages for direct mapping
Oct  2 06:46:48 np0005466012 kernel: RAMDISK: [mem 0x2d7c4000-0x32bd9fff]
Oct  2 06:46:48 np0005466012 kernel: ACPI: Early table checksum verification disabled
Oct  2 06:46:48 np0005466012 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct  2 06:46:48 np0005466012 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:46:48 np0005466012 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:46:48 np0005466012 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:46:48 np0005466012 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct  2 06:46:48 np0005466012 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:46:48 np0005466012 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  2 06:46:48 np0005466012 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct  2 06:46:48 np0005466012 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct  2 06:46:48 np0005466012 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct  2 06:46:48 np0005466012 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct  2 06:46:48 np0005466012 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct  2 06:46:48 np0005466012 kernel: No NUMA configuration found
Oct  2 06:46:48 np0005466012 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct  2 06:46:48 np0005466012 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Oct  2 06:46:48 np0005466012 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct  2 06:46:48 np0005466012 kernel: Zone ranges:
Oct  2 06:46:48 np0005466012 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct  2 06:46:48 np0005466012 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct  2 06:46:48 np0005466012 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct  2 06:46:48 np0005466012 kernel:  Device   empty
Oct  2 06:46:48 np0005466012 kernel: Movable zone start for each node
Oct  2 06:46:48 np0005466012 kernel: Early memory node ranges
Oct  2 06:46:48 np0005466012 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct  2 06:46:48 np0005466012 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct  2 06:46:48 np0005466012 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct  2 06:46:48 np0005466012 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct  2 06:46:48 np0005466012 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct  2 06:46:48 np0005466012 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct  2 06:46:48 np0005466012 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct  2 06:46:48 np0005466012 kernel: ACPI: PM-Timer IO Port: 0x608
Oct  2 06:46:48 np0005466012 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct  2 06:46:48 np0005466012 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct  2 06:46:48 np0005466012 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct  2 06:46:48 np0005466012 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct  2 06:46:48 np0005466012 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct  2 06:46:48 np0005466012 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct  2 06:46:48 np0005466012 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct  2 06:46:48 np0005466012 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct  2 06:46:48 np0005466012 kernel: TSC deadline timer available
Oct  2 06:46:48 np0005466012 kernel: CPU topo: Max. logical packages:   8
Oct  2 06:46:48 np0005466012 kernel: CPU topo: Max. logical dies:       8
Oct  2 06:46:48 np0005466012 kernel: CPU topo: Max. dies per package:   1
Oct  2 06:46:48 np0005466012 kernel: CPU topo: Max. threads per core:   1
Oct  2 06:46:48 np0005466012 kernel: CPU topo: Num. cores per package:     1
Oct  2 06:46:48 np0005466012 kernel: CPU topo: Num. threads per package:   1
Oct  2 06:46:48 np0005466012 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct  2 06:46:48 np0005466012 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct  2 06:46:48 np0005466012 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct  2 06:46:48 np0005466012 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct  2 06:46:48 np0005466012 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct  2 06:46:48 np0005466012 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct  2 06:46:48 np0005466012 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct  2 06:46:48 np0005466012 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct  2 06:46:48 np0005466012 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct  2 06:46:48 np0005466012 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct  2 06:46:48 np0005466012 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct  2 06:46:48 np0005466012 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct  2 06:46:48 np0005466012 kernel: Booting paravirtualized kernel on KVM
Oct  2 06:46:48 np0005466012 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct  2 06:46:48 np0005466012 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct  2 06:46:48 np0005466012 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct  2 06:46:48 np0005466012 kernel: kvm-guest: PV spinlocks disabled, no host support
Oct  2 06:46:48 np0005466012 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  2 06:46:48 np0005466012 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64", will be passed to user space.
Oct  2 06:46:48 np0005466012 kernel: random: crng init done
Oct  2 06:46:48 np0005466012 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct  2 06:46:48 np0005466012 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct  2 06:46:48 np0005466012 kernel: Fallback order for Node 0: 0 
Oct  2 06:46:48 np0005466012 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct  2 06:46:48 np0005466012 kernel: Policy zone: Normal
Oct  2 06:46:48 np0005466012 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct  2 06:46:48 np0005466012 kernel: software IO TLB: area num 8.
Oct  2 06:46:48 np0005466012 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct  2 06:46:48 np0005466012 kernel: ftrace: allocating 49370 entries in 193 pages
Oct  2 06:46:48 np0005466012 kernel: ftrace: allocated 193 pages with 3 groups
Oct  2 06:46:48 np0005466012 kernel: Dynamic Preempt: voluntary
Oct  2 06:46:48 np0005466012 kernel: rcu: Preemptible hierarchical RCU implementation.
Oct  2 06:46:48 np0005466012 kernel: rcu: 	RCU event tracing is enabled.
Oct  2 06:46:48 np0005466012 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct  2 06:46:48 np0005466012 kernel: 	Trampoline variant of Tasks RCU enabled.
Oct  2 06:46:48 np0005466012 kernel: 	Rude variant of Tasks RCU enabled.
Oct  2 06:46:48 np0005466012 kernel: 	Tracing variant of Tasks RCU enabled.
Oct  2 06:46:48 np0005466012 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct  2 06:46:48 np0005466012 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct  2 06:46:48 np0005466012 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  2 06:46:48 np0005466012 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  2 06:46:48 np0005466012 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  2 06:46:48 np0005466012 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct  2 06:46:48 np0005466012 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct  2 06:46:48 np0005466012 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct  2 06:46:48 np0005466012 kernel: Console: colour VGA+ 80x25
Oct  2 06:46:48 np0005466012 kernel: printk: console [ttyS0] enabled
Oct  2 06:46:48 np0005466012 kernel: ACPI: Core revision 20230331
Oct  2 06:46:48 np0005466012 kernel: APIC: Switch to symmetric I/O mode setup
Oct  2 06:46:48 np0005466012 kernel: x2apic enabled
Oct  2 06:46:48 np0005466012 kernel: APIC: Switched APIC routing to: physical x2apic
Oct  2 06:46:48 np0005466012 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct  2 06:46:48 np0005466012 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Oct  2 06:46:48 np0005466012 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct  2 06:46:48 np0005466012 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct  2 06:46:48 np0005466012 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct  2 06:46:48 np0005466012 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct  2 06:46:48 np0005466012 kernel: Spectre V2 : Mitigation: Retpolines
Oct  2 06:46:48 np0005466012 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct  2 06:46:48 np0005466012 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct  2 06:46:48 np0005466012 kernel: RETBleed: Mitigation: untrained return thunk
Oct  2 06:46:48 np0005466012 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct  2 06:46:48 np0005466012 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct  2 06:46:48 np0005466012 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct  2 06:46:48 np0005466012 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct  2 06:46:48 np0005466012 kernel: x86/bugs: return thunk changed
Oct  2 06:46:48 np0005466012 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct  2 06:46:48 np0005466012 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct  2 06:46:48 np0005466012 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct  2 06:46:48 np0005466012 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct  2 06:46:48 np0005466012 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct  2 06:46:48 np0005466012 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct  2 06:46:48 np0005466012 kernel: Freeing SMP alternatives memory: 40K
Oct  2 06:46:48 np0005466012 kernel: pid_max: default: 32768 minimum: 301
Oct  2 06:46:48 np0005466012 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct  2 06:46:48 np0005466012 kernel: landlock: Up and running.
Oct  2 06:46:48 np0005466012 kernel: Yama: becoming mindful.
Oct  2 06:46:48 np0005466012 kernel: SELinux:  Initializing.
Oct  2 06:46:48 np0005466012 kernel: LSM support for eBPF active
Oct  2 06:46:48 np0005466012 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  2 06:46:48 np0005466012 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  2 06:46:48 np0005466012 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct  2 06:46:48 np0005466012 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct  2 06:46:48 np0005466012 kernel: ... version:                0
Oct  2 06:46:48 np0005466012 kernel: ... bit width:              48
Oct  2 06:46:48 np0005466012 kernel: ... generic registers:      6
Oct  2 06:46:48 np0005466012 kernel: ... value mask:             0000ffffffffffff
Oct  2 06:46:48 np0005466012 kernel: ... max period:             00007fffffffffff
Oct  2 06:46:48 np0005466012 kernel: ... fixed-purpose events:   0
Oct  2 06:46:48 np0005466012 kernel: ... event mask:             000000000000003f
Oct  2 06:46:48 np0005466012 kernel: signal: max sigframe size: 1776
Oct  2 06:46:48 np0005466012 kernel: rcu: Hierarchical SRCU implementation.
Oct  2 06:46:48 np0005466012 kernel: rcu: 	Max phase no-delay instances is 400.
Oct  2 06:46:48 np0005466012 kernel: smp: Bringing up secondary CPUs ...
Oct  2 06:46:48 np0005466012 kernel: smpboot: x86: Booting SMP configuration:
Oct  2 06:46:48 np0005466012 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct  2 06:46:48 np0005466012 kernel: smp: Brought up 1 node, 8 CPUs
Oct  2 06:46:48 np0005466012 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Oct  2 06:46:48 np0005466012 kernel: node 0 deferred pages initialised in 21ms
Oct  2 06:46:48 np0005466012 kernel: Memory: 7765416K/8388068K available (16384K kernel code, 5784K rwdata, 13996K rodata, 4068K init, 7304K bss, 616504K reserved, 0K cma-reserved)
Oct  2 06:46:48 np0005466012 kernel: devtmpfs: initialized
Oct  2 06:46:48 np0005466012 kernel: x86/mm: Memory block size: 128MB
Oct  2 06:46:48 np0005466012 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct  2 06:46:48 np0005466012 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct  2 06:46:48 np0005466012 kernel: pinctrl core: initialized pinctrl subsystem
Oct  2 06:46:48 np0005466012 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct  2 06:46:48 np0005466012 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct  2 06:46:48 np0005466012 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct  2 06:46:48 np0005466012 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct  2 06:46:48 np0005466012 kernel: audit: initializing netlink subsys (disabled)
Oct  2 06:46:48 np0005466012 kernel: audit: type=2000 audit(1759402006.494:1): state=initialized audit_enabled=0 res=1
Oct  2 06:46:48 np0005466012 kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct  2 06:46:48 np0005466012 kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct  2 06:46:48 np0005466012 kernel: thermal_sys: Registered thermal governor 'user_space'
Oct  2 06:46:48 np0005466012 kernel: cpuidle: using governor menu
Oct  2 06:46:48 np0005466012 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct  2 06:46:48 np0005466012 kernel: PCI: Using configuration type 1 for base access
Oct  2 06:46:48 np0005466012 kernel: PCI: Using configuration type 1 for extended access
Oct  2 06:46:48 np0005466012 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct  2 06:46:48 np0005466012 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct  2 06:46:48 np0005466012 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct  2 06:46:48 np0005466012 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct  2 06:46:48 np0005466012 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct  2 06:46:48 np0005466012 kernel: Demotion targets for Node 0: null
Oct  2 06:46:48 np0005466012 kernel: cryptd: max_cpu_qlen set to 1000
Oct  2 06:46:48 np0005466012 kernel: ACPI: Added _OSI(Module Device)
Oct  2 06:46:48 np0005466012 kernel: ACPI: Added _OSI(Processor Device)
Oct  2 06:46:48 np0005466012 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct  2 06:46:48 np0005466012 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct  2 06:46:48 np0005466012 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct  2 06:46:48 np0005466012 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct  2 06:46:48 np0005466012 kernel: ACPI: Interpreter enabled
Oct  2 06:46:48 np0005466012 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct  2 06:46:48 np0005466012 kernel: ACPI: Using IOAPIC for interrupt routing
Oct  2 06:46:48 np0005466012 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct  2 06:46:48 np0005466012 kernel: PCI: Using E820 reservations for host bridge windows
Oct  2 06:46:48 np0005466012 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct  2 06:46:48 np0005466012 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct  2 06:46:48 np0005466012 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [3] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [4] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [5] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [6] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [7] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [8] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [9] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [10] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [11] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [12] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [13] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [14] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [15] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [16] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [17] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [18] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [19] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [20] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [21] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [22] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [23] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [24] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [25] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [26] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [27] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [28] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [29] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [30] registered
Oct  2 06:46:48 np0005466012 kernel: acpiphp: Slot [31] registered
Oct  2 06:46:48 np0005466012 kernel: PCI host bridge to bus 0000:00
Oct  2 06:46:48 np0005466012 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct  2 06:46:48 np0005466012 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct  2 06:46:48 np0005466012 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct  2 06:46:48 np0005466012 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct  2 06:46:48 np0005466012 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct  2 06:46:48 np0005466012 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct  2 06:46:48 np0005466012 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct  2 06:46:48 np0005466012 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct  2 06:46:48 np0005466012 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct  2 06:46:48 np0005466012 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct  2 06:46:48 np0005466012 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct  2 06:46:48 np0005466012 kernel: iommu: Default domain type: Translated
Oct  2 06:46:48 np0005466012 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct  2 06:46:48 np0005466012 kernel: SCSI subsystem initialized
Oct  2 06:46:48 np0005466012 kernel: ACPI: bus type USB registered
Oct  2 06:46:48 np0005466012 kernel: usbcore: registered new interface driver usbfs
Oct  2 06:46:48 np0005466012 kernel: usbcore: registered new interface driver hub
Oct  2 06:46:48 np0005466012 kernel: usbcore: registered new device driver usb
Oct  2 06:46:48 np0005466012 kernel: pps_core: LinuxPPS API ver. 1 registered
Oct  2 06:46:48 np0005466012 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct  2 06:46:48 np0005466012 kernel: PTP clock support registered
Oct  2 06:46:48 np0005466012 kernel: EDAC MC: Ver: 3.0.0
Oct  2 06:46:48 np0005466012 kernel: NetLabel: Initializing
Oct  2 06:46:48 np0005466012 kernel: NetLabel:  domain hash size = 128
Oct  2 06:46:48 np0005466012 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct  2 06:46:48 np0005466012 kernel: NetLabel:  unlabeled traffic allowed by default
Oct  2 06:46:48 np0005466012 kernel: PCI: Using ACPI for IRQ routing
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct  2 06:46:48 np0005466012 kernel: vgaarb: loaded
Oct  2 06:46:48 np0005466012 kernel: clocksource: Switched to clocksource kvm-clock
Oct  2 06:46:48 np0005466012 kernel: VFS: Disk quotas dquot_6.6.0
Oct  2 06:46:48 np0005466012 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct  2 06:46:48 np0005466012 kernel: pnp: PnP ACPI init
Oct  2 06:46:48 np0005466012 kernel: pnp: PnP ACPI: found 5 devices
Oct  2 06:46:48 np0005466012 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct  2 06:46:48 np0005466012 kernel: NET: Registered PF_INET protocol family
Oct  2 06:46:48 np0005466012 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct  2 06:46:48 np0005466012 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct  2 06:46:48 np0005466012 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct  2 06:46:48 np0005466012 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct  2 06:46:48 np0005466012 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct  2 06:46:48 np0005466012 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct  2 06:46:48 np0005466012 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct  2 06:46:48 np0005466012 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  2 06:46:48 np0005466012 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  2 06:46:48 np0005466012 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct  2 06:46:48 np0005466012 kernel: NET: Registered PF_XDP protocol family
Oct  2 06:46:48 np0005466012 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct  2 06:46:48 np0005466012 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct  2 06:46:48 np0005466012 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct  2 06:46:48 np0005466012 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct  2 06:46:48 np0005466012 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct  2 06:46:48 np0005466012 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct  2 06:46:48 np0005466012 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 84925 usecs
Oct  2 06:46:48 np0005466012 kernel: PCI: CLS 0 bytes, default 64
Oct  2 06:46:48 np0005466012 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct  2 06:46:48 np0005466012 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct  2 06:46:48 np0005466012 kernel: ACPI: bus type thunderbolt registered
Oct  2 06:46:48 np0005466012 kernel: Trying to unpack rootfs image as initramfs...
Oct  2 06:46:48 np0005466012 kernel: Initialise system trusted keyrings
Oct  2 06:46:48 np0005466012 kernel: Key type blacklist registered
Oct  2 06:46:48 np0005466012 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct  2 06:46:48 np0005466012 kernel: zbud: loaded
Oct  2 06:46:48 np0005466012 kernel: integrity: Platform Keyring initialized
Oct  2 06:46:48 np0005466012 kernel: integrity: Machine keyring initialized
Oct  2 06:46:48 np0005466012 kernel: Freeing initrd memory: 86104K
Oct  2 06:46:48 np0005466012 kernel: NET: Registered PF_ALG protocol family
Oct  2 06:46:48 np0005466012 kernel: xor: automatically using best checksumming function   avx       
Oct  2 06:46:48 np0005466012 kernel: Key type asymmetric registered
Oct  2 06:46:48 np0005466012 kernel: Asymmetric key parser 'x509' registered
Oct  2 06:46:48 np0005466012 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct  2 06:46:48 np0005466012 kernel: io scheduler mq-deadline registered
Oct  2 06:46:48 np0005466012 kernel: io scheduler kyber registered
Oct  2 06:46:48 np0005466012 kernel: io scheduler bfq registered
Oct  2 06:46:48 np0005466012 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct  2 06:46:48 np0005466012 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct  2 06:46:48 np0005466012 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct  2 06:46:48 np0005466012 kernel: ACPI: button: Power Button [PWRF]
Oct  2 06:46:48 np0005466012 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct  2 06:46:48 np0005466012 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct  2 06:46:48 np0005466012 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct  2 06:46:48 np0005466012 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct  2 06:46:48 np0005466012 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct  2 06:46:48 np0005466012 kernel: Non-volatile memory driver v1.3
Oct  2 06:46:48 np0005466012 kernel: rdac: device handler registered
Oct  2 06:46:48 np0005466012 kernel: hp_sw: device handler registered
Oct  2 06:46:48 np0005466012 kernel: emc: device handler registered
Oct  2 06:46:48 np0005466012 kernel: alua: device handler registered
Oct  2 06:46:48 np0005466012 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct  2 06:46:48 np0005466012 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct  2 06:46:48 np0005466012 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct  2 06:46:48 np0005466012 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct  2 06:46:48 np0005466012 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct  2 06:46:48 np0005466012 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct  2 06:46:48 np0005466012 kernel: usb usb1: Product: UHCI Host Controller
Oct  2 06:46:48 np0005466012 kernel: usb usb1: Manufacturer: Linux 5.14.0-620.el9.x86_64 uhci_hcd
Oct  2 06:46:48 np0005466012 kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct  2 06:46:48 np0005466012 kernel: hub 1-0:1.0: USB hub found
Oct  2 06:46:48 np0005466012 kernel: hub 1-0:1.0: 2 ports detected
Oct  2 06:46:48 np0005466012 kernel: usbcore: registered new interface driver usbserial_generic
Oct  2 06:46:48 np0005466012 kernel: usbserial: USB Serial support registered for generic
Oct  2 06:46:48 np0005466012 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct  2 06:46:48 np0005466012 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct  2 06:46:48 np0005466012 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct  2 06:46:48 np0005466012 kernel: mousedev: PS/2 mouse device common for all mice
Oct  2 06:46:48 np0005466012 kernel: rtc_cmos 00:04: RTC can wake from S4
Oct  2 06:46:48 np0005466012 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct  2 06:46:48 np0005466012 kernel: rtc_cmos 00:04: registered as rtc0
Oct  2 06:46:48 np0005466012 kernel: rtc_cmos 00:04: setting system clock to 2025-10-02T10:46:47 UTC (1759402007)
Oct  2 06:46:48 np0005466012 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct  2 06:46:48 np0005466012 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct  2 06:46:48 np0005466012 kernel: hid: raw HID events driver (C) Jiri Kosina
Oct  2 06:46:48 np0005466012 kernel: usbcore: registered new interface driver usbhid
Oct  2 06:46:48 np0005466012 kernel: usbhid: USB HID core driver
Oct  2 06:46:48 np0005466012 kernel: drop_monitor: Initializing network drop monitor service
Oct  2 06:46:48 np0005466012 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct  2 06:46:48 np0005466012 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct  2 06:46:48 np0005466012 kernel: Initializing XFRM netlink socket
Oct  2 06:46:48 np0005466012 kernel: NET: Registered PF_INET6 protocol family
Oct  2 06:46:48 np0005466012 kernel: Segment Routing with IPv6
Oct  2 06:46:48 np0005466012 kernel: NET: Registered PF_PACKET protocol family
Oct  2 06:46:48 np0005466012 kernel: mpls_gso: MPLS GSO support
Oct  2 06:46:48 np0005466012 kernel: IPI shorthand broadcast: enabled
Oct  2 06:46:48 np0005466012 kernel: AVX2 version of gcm_enc/dec engaged.
Oct  2 06:46:48 np0005466012 kernel: AES CTR mode by8 optimization enabled
Oct  2 06:46:48 np0005466012 kernel: sched_clock: Marking stable (1213007473, 154222375)->(1456898410, -89668562)
Oct  2 06:46:48 np0005466012 kernel: registered taskstats version 1
Oct  2 06:46:48 np0005466012 kernel: Loading compiled-in X.509 certificates
Oct  2 06:46:48 np0005466012 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  2 06:46:48 np0005466012 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct  2 06:46:48 np0005466012 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct  2 06:46:48 np0005466012 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct  2 06:46:48 np0005466012 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct  2 06:46:48 np0005466012 kernel: Demotion targets for Node 0: null
Oct  2 06:46:48 np0005466012 kernel: page_owner is disabled
Oct  2 06:46:48 np0005466012 kernel: Key type .fscrypt registered
Oct  2 06:46:48 np0005466012 kernel: Key type fscrypt-provisioning registered
Oct  2 06:46:48 np0005466012 kernel: Key type big_key registered
Oct  2 06:46:48 np0005466012 kernel: Key type encrypted registered
Oct  2 06:46:48 np0005466012 kernel: ima: No TPM chip found, activating TPM-bypass!
Oct  2 06:46:48 np0005466012 kernel: Loading compiled-in module X.509 certificates
Oct  2 06:46:48 np0005466012 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  2 06:46:48 np0005466012 kernel: ima: Allocated hash algorithm: sha256
Oct  2 06:46:48 np0005466012 kernel: ima: No architecture policies found
Oct  2 06:46:48 np0005466012 kernel: evm: Initialising EVM extended attributes:
Oct  2 06:46:48 np0005466012 kernel: evm: security.selinux
Oct  2 06:46:48 np0005466012 kernel: evm: security.SMACK64 (disabled)
Oct  2 06:46:48 np0005466012 kernel: evm: security.SMACK64EXEC (disabled)
Oct  2 06:46:48 np0005466012 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct  2 06:46:48 np0005466012 kernel: evm: security.SMACK64MMAP (disabled)
Oct  2 06:46:48 np0005466012 kernel: evm: security.apparmor (disabled)
Oct  2 06:46:48 np0005466012 kernel: evm: security.ima
Oct  2 06:46:48 np0005466012 kernel: evm: security.capability
Oct  2 06:46:48 np0005466012 kernel: evm: HMAC attrs: 0x1
Oct  2 06:46:48 np0005466012 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct  2 06:46:48 np0005466012 kernel: Running certificate verification RSA selftest
Oct  2 06:46:48 np0005466012 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct  2 06:46:48 np0005466012 kernel: Running certificate verification ECDSA selftest
Oct  2 06:46:48 np0005466012 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct  2 06:46:48 np0005466012 kernel: clk: Disabling unused clocks
Oct  2 06:46:48 np0005466012 kernel: Freeing unused decrypted memory: 2028K
Oct  2 06:46:48 np0005466012 kernel: Freeing unused kernel image (initmem) memory: 4068K
Oct  2 06:46:48 np0005466012 kernel: Write protecting the kernel read-only data: 30720k
Oct  2 06:46:48 np0005466012 kernel: Freeing unused kernel image (rodata/data gap) memory: 340K
Oct  2 06:46:48 np0005466012 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct  2 06:46:48 np0005466012 kernel: Run /init as init process
Oct  2 06:46:48 np0005466012 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  2 06:46:48 np0005466012 systemd: Detected virtualization kvm.
Oct  2 06:46:48 np0005466012 systemd: Detected architecture x86-64.
Oct  2 06:46:48 np0005466012 systemd: Running in initrd.
Oct  2 06:46:48 np0005466012 systemd: No hostname configured, using default hostname.
Oct  2 06:46:48 np0005466012 systemd: Hostname set to <localhost>.
Oct  2 06:46:48 np0005466012 systemd: Initializing machine ID from VM UUID.
Oct  2 06:46:48 np0005466012 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct  2 06:46:48 np0005466012 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct  2 06:46:48 np0005466012 kernel: usb 1-1: Product: QEMU USB Tablet
Oct  2 06:46:48 np0005466012 kernel: usb 1-1: Manufacturer: QEMU
Oct  2 06:46:48 np0005466012 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct  2 06:46:48 np0005466012 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct  2 06:46:48 np0005466012 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct  2 06:46:48 np0005466012 systemd: Queued start job for default target Initrd Default Target.
Oct  2 06:46:48 np0005466012 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  2 06:46:48 np0005466012 systemd: Reached target Local Encrypted Volumes.
Oct  2 06:46:48 np0005466012 systemd: Reached target Initrd /usr File System.
Oct  2 06:46:48 np0005466012 systemd: Reached target Local File Systems.
Oct  2 06:46:48 np0005466012 systemd: Reached target Path Units.
Oct  2 06:46:48 np0005466012 systemd: Reached target Slice Units.
Oct  2 06:46:48 np0005466012 systemd: Reached target Swaps.
Oct  2 06:46:48 np0005466012 systemd: Reached target Timer Units.
Oct  2 06:46:48 np0005466012 systemd: Listening on D-Bus System Message Bus Socket.
Oct  2 06:46:48 np0005466012 systemd: Listening on Journal Socket (/dev/log).
Oct  2 06:46:48 np0005466012 systemd: Listening on Journal Socket.
Oct  2 06:46:48 np0005466012 systemd: Listening on udev Control Socket.
Oct  2 06:46:48 np0005466012 systemd: Listening on udev Kernel Socket.
Oct  2 06:46:48 np0005466012 systemd: Reached target Socket Units.
Oct  2 06:46:48 np0005466012 systemd: Starting Create List of Static Device Nodes...
Oct  2 06:46:48 np0005466012 systemd: Starting Journal Service...
Oct  2 06:46:48 np0005466012 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  2 06:46:48 np0005466012 systemd: Starting Apply Kernel Variables...
Oct  2 06:46:48 np0005466012 systemd: Starting Create System Users...
Oct  2 06:46:48 np0005466012 systemd: Starting Setup Virtual Console...
Oct  2 06:46:48 np0005466012 systemd: Finished Create List of Static Device Nodes.
Oct  2 06:46:48 np0005466012 systemd: Finished Apply Kernel Variables.
Oct  2 06:46:48 np0005466012 systemd: Finished Create System Users.
Oct  2 06:46:48 np0005466012 systemd-journald[308]: Journal started
Oct  2 06:46:48 np0005466012 systemd-journald[308]: Runtime Journal (/run/log/journal/ebf0c2d850454169abeccc6f6092e35d) is 8.0M, max 153.5M, 145.5M free.
Oct  2 06:46:48 np0005466012 systemd-sysusers[313]: Creating group 'users' with GID 100.
Oct  2 06:46:48 np0005466012 systemd-sysusers[313]: Creating group 'dbus' with GID 81.
Oct  2 06:46:48 np0005466012 systemd-sysusers[313]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct  2 06:46:48 np0005466012 systemd: Started Journal Service.
Oct  2 06:46:48 np0005466012 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  2 06:46:48 np0005466012 systemd[1]: Starting Create Volatile Files and Directories...
Oct  2 06:46:48 np0005466012 systemd[1]: Finished Create Volatile Files and Directories.
Oct  2 06:46:48 np0005466012 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  2 06:46:48 np0005466012 systemd[1]: Finished Setup Virtual Console.
Oct  2 06:46:48 np0005466012 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct  2 06:46:48 np0005466012 systemd[1]: Starting dracut cmdline hook...
Oct  2 06:46:48 np0005466012 dracut-cmdline[328]: dracut-9 dracut-057-102.git20250818.el9
Oct  2 06:46:48 np0005466012 dracut-cmdline[328]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  2 06:46:48 np0005466012 systemd[1]: Finished dracut cmdline hook.
Oct  2 06:46:48 np0005466012 systemd[1]: Starting dracut pre-udev hook...
Oct  2 06:46:48 np0005466012 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct  2 06:46:48 np0005466012 kernel: device-mapper: uevent: version 1.0.3
Oct  2 06:46:48 np0005466012 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct  2 06:46:48 np0005466012 kernel: RPC: Registered named UNIX socket transport module.
Oct  2 06:46:48 np0005466012 kernel: RPC: Registered udp transport module.
Oct  2 06:46:48 np0005466012 kernel: RPC: Registered tcp transport module.
Oct  2 06:46:48 np0005466012 kernel: RPC: Registered tcp-with-tls transport module.
Oct  2 06:46:48 np0005466012 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct  2 06:46:48 np0005466012 rpc.statd[445]: Version 2.5.4 starting
Oct  2 06:46:48 np0005466012 rpc.statd[445]: Initializing NSM state
Oct  2 06:46:48 np0005466012 rpc.idmapd[450]: Setting log level to 0
Oct  2 06:46:48 np0005466012 systemd[1]: Finished dracut pre-udev hook.
Oct  2 06:46:48 np0005466012 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  2 06:46:48 np0005466012 systemd-udevd[463]: Using default interface naming scheme 'rhel-9.0'.
Oct  2 06:46:48 np0005466012 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  2 06:46:48 np0005466012 systemd[1]: Starting dracut pre-trigger hook...
Oct  2 06:46:48 np0005466012 systemd[1]: Finished dracut pre-trigger hook.
Oct  2 06:46:48 np0005466012 systemd[1]: Starting Coldplug All udev Devices...
Oct  2 06:46:48 np0005466012 systemd[1]: Created slice Slice /system/modprobe.
Oct  2 06:46:48 np0005466012 systemd[1]: Starting Load Kernel Module configfs...
Oct  2 06:46:48 np0005466012 systemd[1]: Finished Coldplug All udev Devices.
Oct  2 06:46:48 np0005466012 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  2 06:46:48 np0005466012 systemd[1]: Reached target Network.
Oct  2 06:46:48 np0005466012 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  2 06:46:48 np0005466012 systemd[1]: Starting dracut initqueue hook...
Oct  2 06:46:48 np0005466012 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  2 06:46:48 np0005466012 systemd[1]: Finished Load Kernel Module configfs.
Oct  2 06:46:48 np0005466012 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct  2 06:46:48 np0005466012 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct  2 06:46:48 np0005466012 kernel: vda: vda1
Oct  2 06:46:49 np0005466012 kernel: scsi host0: ata_piix
Oct  2 06:46:49 np0005466012 kernel: scsi host1: ata_piix
Oct  2 06:46:49 np0005466012 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Oct  2 06:46:49 np0005466012 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Oct  2 06:46:49 np0005466012 systemd-udevd[479]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 06:46:49 np0005466012 systemd[1]: Found device /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  2 06:46:49 np0005466012 systemd[1]: Reached target Initrd Root Device.
Oct  2 06:46:49 np0005466012 systemd[1]: Mounting Kernel Configuration File System...
Oct  2 06:46:49 np0005466012 kernel: ata1: found unknown device (class 0)
Oct  2 06:46:49 np0005466012 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct  2 06:46:49 np0005466012 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct  2 06:46:49 np0005466012 systemd[1]: Mounted Kernel Configuration File System.
Oct  2 06:46:49 np0005466012 systemd[1]: Reached target System Initialization.
Oct  2 06:46:49 np0005466012 systemd[1]: Reached target Basic System.
Oct  2 06:46:49 np0005466012 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct  2 06:46:49 np0005466012 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct  2 06:46:49 np0005466012 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct  2 06:46:49 np0005466012 systemd[1]: Finished dracut initqueue hook.
Oct  2 06:46:49 np0005466012 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  2 06:46:49 np0005466012 systemd[1]: Reached target Remote Encrypted Volumes.
Oct  2 06:46:49 np0005466012 systemd[1]: Reached target Remote File Systems.
Oct  2 06:46:49 np0005466012 systemd[1]: Starting dracut pre-mount hook...
Oct  2 06:46:49 np0005466012 systemd[1]: Finished dracut pre-mount hook.
Oct  2 06:46:49 np0005466012 systemd[1]: Starting File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458...
Oct  2 06:46:49 np0005466012 systemd-fsck[559]: /usr/sbin/fsck.xfs: XFS file system.
Oct  2 06:46:49 np0005466012 systemd[1]: Finished File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  2 06:46:49 np0005466012 systemd[1]: Mounting /sysroot...
Oct  2 06:46:49 np0005466012 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct  2 06:46:49 np0005466012 kernel: XFS (vda1): Mounting V5 Filesystem 1631a6ad-43b8-436d-ae76-16fa14b94458
Oct  2 06:46:50 np0005466012 kernel: XFS (vda1): Ending clean mount
Oct  2 06:46:50 np0005466012 systemd[1]: Mounted /sysroot.
Oct  2 06:46:50 np0005466012 systemd[1]: Reached target Initrd Root File System.
Oct  2 06:46:50 np0005466012 systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct  2 06:46:50 np0005466012 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct  2 06:46:50 np0005466012 systemd[1]: Reached target Initrd File Systems.
Oct  2 06:46:50 np0005466012 systemd[1]: Reached target Initrd Default Target.
Oct  2 06:46:50 np0005466012 systemd[1]: Starting dracut mount hook...
Oct  2 06:46:50 np0005466012 systemd[1]: Finished dracut mount hook.
Oct  2 06:46:50 np0005466012 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct  2 06:46:50 np0005466012 rpc.idmapd[450]: exiting on signal 15
Oct  2 06:46:50 np0005466012 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct  2 06:46:50 np0005466012 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Network.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Remote Encrypted Volumes.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Timer Units.
Oct  2 06:46:50 np0005466012 systemd[1]: dbus.socket: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Closed D-Bus System Message Bus Socket.
Oct  2 06:46:50 np0005466012 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Initrd Default Target.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Basic System.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Initrd Root Device.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Initrd /usr File System.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Path Units.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Remote File Systems.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Preparation for Remote File Systems.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Slice Units.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Socket Units.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target System Initialization.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Local File Systems.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Swaps.
Oct  2 06:46:50 np0005466012 systemd[1]: dracut-mount.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped dracut mount hook.
Oct  2 06:46:50 np0005466012 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped dracut pre-mount hook.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped target Local Encrypted Volumes.
Oct  2 06:46:50 np0005466012 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct  2 06:46:50 np0005466012 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped dracut initqueue hook.
Oct  2 06:46:50 np0005466012 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped Apply Kernel Variables.
Oct  2 06:46:50 np0005466012 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped Create Volatile Files and Directories.
Oct  2 06:46:50 np0005466012 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped Coldplug All udev Devices.
Oct  2 06:46:50 np0005466012 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped dracut pre-trigger hook.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct  2 06:46:50 np0005466012 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped Setup Virtual Console.
Oct  2 06:46:50 np0005466012 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct  2 06:46:50 np0005466012 systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct  2 06:46:50 np0005466012 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Closed udev Control Socket.
Oct  2 06:46:50 np0005466012 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Closed udev Kernel Socket.
Oct  2 06:46:50 np0005466012 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped dracut pre-udev hook.
Oct  2 06:46:50 np0005466012 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped dracut cmdline hook.
Oct  2 06:46:50 np0005466012 systemd[1]: Starting Cleanup udev Database...
Oct  2 06:46:50 np0005466012 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct  2 06:46:50 np0005466012 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped Create List of Static Device Nodes.
Oct  2 06:46:50 np0005466012 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Stopped Create System Users.
Oct  2 06:46:50 np0005466012 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct  2 06:46:50 np0005466012 systemd[1]: Finished Cleanup udev Database.
Oct  2 06:46:50 np0005466012 systemd[1]: Reached target Switch Root.
Oct  2 06:46:50 np0005466012 systemd[1]: Starting Switch Root...
Oct  2 06:46:51 np0005466012 systemd[1]: Switching root.
Oct  2 06:46:51 np0005466012 systemd-journald[308]: Journal stopped
Oct  2 06:46:53 np0005466012 systemd-journald: Received SIGTERM from PID 1 (systemd).
Oct  2 06:46:53 np0005466012 kernel: audit: type=1404 audit(1759402011.369:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct  2 06:46:53 np0005466012 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 06:46:53 np0005466012 kernel: SELinux:  policy capability open_perms=1
Oct  2 06:46:53 np0005466012 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 06:46:53 np0005466012 kernel: SELinux:  policy capability always_check_network=0
Oct  2 06:46:53 np0005466012 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 06:46:53 np0005466012 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 06:46:53 np0005466012 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 06:46:53 np0005466012 kernel: audit: type=1403 audit(1759402011.582:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct  2 06:46:53 np0005466012 systemd: Successfully loaded SELinux policy in 217.856ms.
Oct  2 06:46:53 np0005466012 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 84.970ms.
Oct  2 06:46:53 np0005466012 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  2 06:46:53 np0005466012 systemd: Detected virtualization kvm.
Oct  2 06:46:53 np0005466012 systemd: Detected architecture x86-64.
Oct  2 06:46:53 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 06:46:53 np0005466012 systemd: initrd-switch-root.service: Deactivated successfully.
Oct  2 06:46:53 np0005466012 systemd: Stopped Switch Root.
Oct  2 06:46:53 np0005466012 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct  2 06:46:53 np0005466012 systemd: Created slice Slice /system/getty.
Oct  2 06:46:53 np0005466012 systemd: Created slice Slice /system/serial-getty.
Oct  2 06:46:53 np0005466012 systemd: Created slice Slice /system/sshd-keygen.
Oct  2 06:46:53 np0005466012 systemd: Created slice User and Session Slice.
Oct  2 06:46:53 np0005466012 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  2 06:46:53 np0005466012 systemd: Started Forward Password Requests to Wall Directory Watch.
Oct  2 06:46:53 np0005466012 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct  2 06:46:53 np0005466012 systemd: Reached target Local Encrypted Volumes.
Oct  2 06:46:53 np0005466012 systemd: Stopped target Switch Root.
Oct  2 06:46:53 np0005466012 systemd: Stopped target Initrd File Systems.
Oct  2 06:46:53 np0005466012 systemd: Stopped target Initrd Root File System.
Oct  2 06:46:53 np0005466012 systemd: Reached target Local Integrity Protected Volumes.
Oct  2 06:46:53 np0005466012 systemd: Reached target Path Units.
Oct  2 06:46:53 np0005466012 systemd: Reached target rpc_pipefs.target.
Oct  2 06:46:53 np0005466012 systemd: Reached target Slice Units.
Oct  2 06:46:53 np0005466012 systemd: Reached target Swaps.
Oct  2 06:46:53 np0005466012 systemd: Reached target Local Verity Protected Volumes.
Oct  2 06:46:53 np0005466012 systemd: Listening on RPCbind Server Activation Socket.
Oct  2 06:46:53 np0005466012 systemd: Reached target RPC Port Mapper.
Oct  2 06:46:53 np0005466012 systemd: Listening on Process Core Dump Socket.
Oct  2 06:46:53 np0005466012 systemd: Listening on initctl Compatibility Named Pipe.
Oct  2 06:46:53 np0005466012 systemd: Listening on udev Control Socket.
Oct  2 06:46:53 np0005466012 systemd: Listening on udev Kernel Socket.
Oct  2 06:46:53 np0005466012 systemd: Mounting Huge Pages File System...
Oct  2 06:46:53 np0005466012 systemd: Mounting POSIX Message Queue File System...
Oct  2 06:46:53 np0005466012 systemd: Mounting Kernel Debug File System...
Oct  2 06:46:53 np0005466012 systemd: Mounting Kernel Trace File System...
Oct  2 06:46:53 np0005466012 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  2 06:46:53 np0005466012 systemd: Starting Create List of Static Device Nodes...
Oct  2 06:46:53 np0005466012 systemd: Starting Load Kernel Module configfs...
Oct  2 06:46:53 np0005466012 systemd: Starting Load Kernel Module drm...
Oct  2 06:46:53 np0005466012 systemd: Starting Load Kernel Module efi_pstore...
Oct  2 06:46:53 np0005466012 systemd: Starting Load Kernel Module fuse...
Oct  2 06:46:53 np0005466012 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct  2 06:46:53 np0005466012 systemd: systemd-fsck-root.service: Deactivated successfully.
Oct  2 06:46:53 np0005466012 systemd: Stopped File System Check on Root Device.
Oct  2 06:46:53 np0005466012 systemd: Stopped Journal Service.
Oct  2 06:46:53 np0005466012 kernel: fuse: init (API version 7.37)
Oct  2 06:46:53 np0005466012 systemd: Starting Journal Service...
Oct  2 06:46:53 np0005466012 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  2 06:46:53 np0005466012 systemd: Starting Generate network units from Kernel command line...
Oct  2 06:46:53 np0005466012 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  2 06:46:53 np0005466012 systemd: Starting Remount Root and Kernel File Systems...
Oct  2 06:46:53 np0005466012 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct  2 06:46:53 np0005466012 systemd: Starting Apply Kernel Variables...
Oct  2 06:46:53 np0005466012 systemd: Starting Coldplug All udev Devices...
Oct  2 06:46:53 np0005466012 systemd: Mounted Huge Pages File System.
Oct  2 06:46:53 np0005466012 systemd: Mounted POSIX Message Queue File System.
Oct  2 06:46:53 np0005466012 systemd-journald[683]: Journal started
Oct  2 06:46:53 np0005466012 systemd-journald[683]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  2 06:46:53 np0005466012 systemd[1]: Queued start job for default target Multi-User System.
Oct  2 06:46:53 np0005466012 systemd[1]: systemd-journald.service: Deactivated successfully.
Oct  2 06:46:53 np0005466012 systemd: Started Journal Service.
Oct  2 06:46:53 np0005466012 systemd[1]: Mounted Kernel Debug File System.
Oct  2 06:46:53 np0005466012 kernel: ACPI: bus type drm_connector registered
Oct  2 06:46:53 np0005466012 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct  2 06:46:53 np0005466012 systemd[1]: Mounted Kernel Trace File System.
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Create List of Static Device Nodes.
Oct  2 06:46:53 np0005466012 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Load Kernel Module configfs.
Oct  2 06:46:53 np0005466012 systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Load Kernel Module drm.
Oct  2 06:46:53 np0005466012 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Load Kernel Module efi_pstore.
Oct  2 06:46:53 np0005466012 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Load Kernel Module fuse.
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Generate network units from Kernel command line.
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Remount Root and Kernel File Systems.
Oct  2 06:46:53 np0005466012 systemd[1]: Mounting FUSE Control File System...
Oct  2 06:46:53 np0005466012 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  2 06:46:53 np0005466012 systemd[1]: Starting Rebuild Hardware Database...
Oct  2 06:46:53 np0005466012 systemd[1]: Starting Flush Journal to Persistent Storage...
Oct  2 06:46:53 np0005466012 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct  2 06:46:53 np0005466012 systemd[1]: Starting Load/Save OS Random Seed...
Oct  2 06:46:53 np0005466012 systemd[1]: Starting Create System Users...
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Apply Kernel Variables.
Oct  2 06:46:53 np0005466012 systemd[1]: Mounted FUSE Control File System.
Oct  2 06:46:53 np0005466012 systemd-journald[683]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  2 06:46:53 np0005466012 systemd-journald[683]: Received client request to flush runtime journal.
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Flush Journal to Persistent Storage.
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Coldplug All udev Devices.
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Create System Users.
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Load/Save OS Random Seed.
Oct  2 06:46:53 np0005466012 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  2 06:46:53 np0005466012 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  2 06:46:53 np0005466012 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  2 06:46:53 np0005466012 systemd[1]: Reached target Preparation for Local File Systems.
Oct  2 06:46:53 np0005466012 systemd[1]: Reached target Local File Systems.
Oct  2 06:46:53 np0005466012 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct  2 06:46:53 np0005466012 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct  2 06:46:53 np0005466012 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct  2 06:46:53 np0005466012 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct  2 06:46:54 np0005466012 systemd[1]: Starting Automatic Boot Loader Update...
Oct  2 06:46:54 np0005466012 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct  2 06:46:54 np0005466012 systemd[1]: Starting Create Volatile Files and Directories...
Oct  2 06:46:54 np0005466012 bootctl[701]: Couldn't find EFI system partition, skipping.
Oct  2 06:46:54 np0005466012 systemd[1]: Finished Automatic Boot Loader Update.
Oct  2 06:46:54 np0005466012 systemd[1]: Finished Create Volatile Files and Directories.
Oct  2 06:46:54 np0005466012 systemd[1]: Starting Security Auditing Service...
Oct  2 06:46:54 np0005466012 systemd[1]: Starting RPC Bind...
Oct  2 06:46:54 np0005466012 systemd[1]: Starting Rebuild Journal Catalog...
Oct  2 06:46:54 np0005466012 systemd[1]: Finished Rebuild Journal Catalog.
Oct  2 06:46:54 np0005466012 auditd[707]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct  2 06:46:54 np0005466012 auditd[707]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct  2 06:46:54 np0005466012 systemd[1]: Started RPC Bind.
Oct  2 06:46:54 np0005466012 augenrules[712]: /sbin/augenrules: No change
Oct  2 06:46:54 np0005466012 augenrules[727]: No rules
Oct  2 06:46:54 np0005466012 augenrules[727]: enabled 1
Oct  2 06:46:54 np0005466012 augenrules[727]: failure 1
Oct  2 06:46:54 np0005466012 augenrules[727]: pid 707
Oct  2 06:46:54 np0005466012 augenrules[727]: rate_limit 0
Oct  2 06:46:54 np0005466012 augenrules[727]: backlog_limit 8192
Oct  2 06:46:54 np0005466012 augenrules[727]: lost 0
Oct  2 06:46:54 np0005466012 augenrules[727]: backlog 3
Oct  2 06:46:54 np0005466012 augenrules[727]: backlog_wait_time 60000
Oct  2 06:46:54 np0005466012 augenrules[727]: backlog_wait_time_actual 0
Oct  2 06:46:54 np0005466012 augenrules[727]: enabled 1
Oct  2 06:46:54 np0005466012 augenrules[727]: failure 1
Oct  2 06:46:54 np0005466012 augenrules[727]: pid 707
Oct  2 06:46:54 np0005466012 augenrules[727]: rate_limit 0
Oct  2 06:46:54 np0005466012 augenrules[727]: backlog_limit 8192
Oct  2 06:46:54 np0005466012 augenrules[727]: lost 0
Oct  2 06:46:54 np0005466012 augenrules[727]: backlog 0
Oct  2 06:46:54 np0005466012 augenrules[727]: backlog_wait_time 60000
Oct  2 06:46:54 np0005466012 augenrules[727]: backlog_wait_time_actual 0
Oct  2 06:46:54 np0005466012 augenrules[727]: enabled 1
Oct  2 06:46:54 np0005466012 augenrules[727]: failure 1
Oct  2 06:46:54 np0005466012 augenrules[727]: pid 707
Oct  2 06:46:54 np0005466012 augenrules[727]: rate_limit 0
Oct  2 06:46:54 np0005466012 augenrules[727]: backlog_limit 8192
Oct  2 06:46:54 np0005466012 augenrules[727]: lost 0
Oct  2 06:46:54 np0005466012 augenrules[727]: backlog 3
Oct  2 06:46:54 np0005466012 augenrules[727]: backlog_wait_time 60000
Oct  2 06:46:54 np0005466012 augenrules[727]: backlog_wait_time_actual 0
Oct  2 06:46:54 np0005466012 systemd[1]: Started Security Auditing Service.
Oct  2 06:46:54 np0005466012 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct  2 06:46:54 np0005466012 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct  2 06:46:54 np0005466012 systemd[1]: Finished Rebuild Hardware Database.
Oct  2 06:46:54 np0005466012 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  2 06:46:54 np0005466012 systemd-udevd[735]: Using default interface naming scheme 'rhel-9.0'.
Oct  2 06:46:54 np0005466012 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  2 06:46:54 np0005466012 systemd[1]: Starting Load Kernel Module configfs...
Oct  2 06:46:54 np0005466012 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  2 06:46:54 np0005466012 systemd[1]: Finished Load Kernel Module configfs.
Oct  2 06:46:54 np0005466012 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct  2 06:46:54 np0005466012 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct  2 06:46:54 np0005466012 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct  2 06:46:54 np0005466012 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct  2 06:46:55 np0005466012 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct  2 06:46:55 np0005466012 systemd-udevd[750]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 06:46:55 np0005466012 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct  2 06:46:55 np0005466012 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct  2 06:46:55 np0005466012 kernel: Console: switching to colour dummy device 80x25
Oct  2 06:46:55 np0005466012 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct  2 06:46:55 np0005466012 kernel: [drm] features: -context_init
Oct  2 06:46:55 np0005466012 kernel: [drm] number of scanouts: 1
Oct  2 06:46:55 np0005466012 kernel: [drm] number of cap sets: 0
Oct  2 06:46:55 np0005466012 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct  2 06:46:55 np0005466012 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct  2 06:46:55 np0005466012 kernel: Console: switching to colour frame buffer device 128x48
Oct  2 06:46:55 np0005466012 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct  2 06:46:55 np0005466012 kernel: kvm_amd: TSC scaling supported
Oct  2 06:46:55 np0005466012 kernel: kvm_amd: Nested Virtualization enabled
Oct  2 06:46:55 np0005466012 kernel: kvm_amd: Nested Paging enabled
Oct  2 06:46:55 np0005466012 kernel: kvm_amd: LBR virtualization supported
Oct  2 06:46:55 np0005466012 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct  2 06:46:55 np0005466012 systemd[1]: Starting Update is Completed...
Oct  2 06:46:55 np0005466012 systemd[1]: Finished Update is Completed.
Oct  2 06:46:55 np0005466012 systemd[1]: Reached target System Initialization.
Oct  2 06:46:55 np0005466012 systemd[1]: Started dnf makecache --timer.
Oct  2 06:46:55 np0005466012 systemd[1]: Started Daily rotation of log files.
Oct  2 06:46:55 np0005466012 systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct  2 06:46:55 np0005466012 systemd[1]: Reached target Timer Units.
Oct  2 06:46:55 np0005466012 systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct  2 06:46:55 np0005466012 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct  2 06:46:55 np0005466012 systemd[1]: Reached target Socket Units.
Oct  2 06:46:55 np0005466012 systemd[1]: Starting D-Bus System Message Bus...
Oct  2 06:46:55 np0005466012 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  2 06:46:55 np0005466012 systemd[1]: Started D-Bus System Message Bus.
Oct  2 06:46:55 np0005466012 systemd[1]: Reached target Basic System.
Oct  2 06:46:55 np0005466012 dbus-broker-lau[817]: Ready
Oct  2 06:46:55 np0005466012 systemd[1]: Starting NTP client/server...
Oct  2 06:46:55 np0005466012 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct  2 06:46:55 np0005466012 systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct  2 06:46:55 np0005466012 systemd[1]: Starting IPv4 firewall with iptables...
Oct  2 06:46:55 np0005466012 systemd[1]: Started irqbalance daemon.
Oct  2 06:46:55 np0005466012 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct  2 06:46:55 np0005466012 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 06:46:55 np0005466012 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 06:46:55 np0005466012 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 06:46:55 np0005466012 systemd[1]: Reached target sshd-keygen.target.
Oct  2 06:46:55 np0005466012 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct  2 06:46:55 np0005466012 systemd[1]: Reached target User and Group Name Lookups.
Oct  2 06:46:55 np0005466012 systemd[1]: Starting User Login Management...
Oct  2 06:46:55 np0005466012 systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct  2 06:46:56 np0005466012 chronyd[836]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  2 06:46:56 np0005466012 systemd-logind[827]: New seat seat0.
Oct  2 06:46:56 np0005466012 chronyd[836]: Loaded 0 symmetric keys
Oct  2 06:46:56 np0005466012 systemd-logind[827]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  2 06:46:56 np0005466012 systemd-logind[827]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  2 06:46:56 np0005466012 systemd[1]: Started User Login Management.
Oct  2 06:46:56 np0005466012 chronyd[836]: Using right/UTC timezone to obtain leap second data
Oct  2 06:46:56 np0005466012 chronyd[836]: Loaded seccomp filter (level 2)
Oct  2 06:46:56 np0005466012 systemd[1]: Started NTP client/server.
Oct  2 06:46:56 np0005466012 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct  2 06:46:56 np0005466012 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct  2 06:46:56 np0005466012 iptables.init[822]: iptables: Applying firewall rules: [  OK  ]
Oct  2 06:46:56 np0005466012 systemd[1]: Finished IPv4 firewall with iptables.
Oct  2 06:46:57 np0005466012 cloud-init[845]: Cloud-init v. 24.4-7.el9 running 'init-local' at Thu, 02 Oct 2025 10:46:57 +0000. Up 11.35 seconds.
Oct  2 06:46:58 np0005466012 systemd[1]: run-cloud\x2dinit-tmp-tmpanebkful.mount: Deactivated successfully.
Oct  2 06:46:58 np0005466012 systemd[1]: Starting Hostname Service...
Oct  2 06:46:58 np0005466012 systemd[1]: Started Hostname Service.
Oct  2 06:46:58 np0005466012 systemd-hostnamed[861]: Hostname set to <np0005466012.novalocal> (static)
Oct  2 06:46:58 np0005466012 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct  2 06:46:58 np0005466012 systemd[1]: Reached target Preparation for Network.
Oct  2 06:46:58 np0005466012 systemd[1]: Starting Network Manager...
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.4864] NetworkManager (version 1.54.1-1.el9) is starting... (boot:4de76784-23e0-4a0b-a470-2209f5c6de9a)
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.4871] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5078] manager[0x55d46dfb6080]: monitoring kernel firmware directory '/lib/firmware'.
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5149] hostname: hostname: using hostnamed
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5150] hostname: static hostname changed from (none) to "np0005466012.novalocal"
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5158] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5337] manager[0x55d46dfb6080]: rfkill: Wi-Fi hardware radio set enabled
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5338] manager[0x55d46dfb6080]: rfkill: WWAN hardware radio set enabled
Oct  2 06:46:58 np0005466012 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5471] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5471] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5473] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5474] manager: Networking is enabled by state file
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5478] settings: Loaded settings plugin: keyfile (internal)
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5522] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5560] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5593] dhcp: init: Using DHCP client 'internal'
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5598] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5622] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5649] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5662] device (lo): Activation: starting connection 'lo' (a2f56833-ce60-4e44-b803-7556fb2bfbcb)
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5678] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5684] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 06:46:58 np0005466012 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 06:46:58 np0005466012 systemd[1]: Started Network Manager.
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5779] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5786] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  2 06:46:58 np0005466012 systemd[1]: Reached target Network.
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5789] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5814] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5820] device (eth0): carrier: link connected
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5826] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5838] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  2 06:46:58 np0005466012 systemd[1]: Starting Network Manager Wait Online...
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5871] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5880] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5882] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5886] manager: NetworkManager state is now CONNECTING
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5890] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 06:46:58 np0005466012 systemd[1]: Starting GSSAPI Proxy Daemon...
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5924] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 06:46:58 np0005466012 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5930] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5992] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.5995] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.6008] device (lo): Activation: successful, device activated.
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.6055] dhcp4 (eth0): state changed new lease, address=38.102.83.181
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.6071] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.6153] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.6201] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.6204] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.6208] manager: NetworkManager state is now CONNECTED_SITE
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.6213] device (eth0): Activation: successful, device activated.
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.6223] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  2 06:46:58 np0005466012 systemd[1]: Started GSSAPI Proxy Daemon.
Oct  2 06:46:58 np0005466012 NetworkManager[865]: <info>  [1759402018.6241] manager: startup complete
Oct  2 06:46:58 np0005466012 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  2 06:46:58 np0005466012 systemd[1]: Reached target NFS client services.
Oct  2 06:46:58 np0005466012 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  2 06:46:58 np0005466012 systemd[1]: Reached target Remote File Systems.
Oct  2 06:46:58 np0005466012 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  2 06:46:58 np0005466012 systemd[1]: Finished Network Manager Wait Online.
Oct  2 06:46:58 np0005466012 systemd[1]: Starting Cloud-init: Network Stage...
Oct  2 06:46:58 np0005466012 cloud-init[929]: Cloud-init v. 24.4-7.el9 running 'init' at Thu, 02 Oct 2025 10:46:58 +0000. Up 12.62 seconds.
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: |  eth0  | True |        38.102.83.181         | 255.255.255.0 | global | fa:16:3e:01:ba:27 |
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: |  eth0  | True | fe80::f816:3eff:fe01:ba27/64 |       .       |  link  | fa:16:3e:01:ba:27 |
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Oct  2 06:46:59 np0005466012 cloud-init[929]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  2 06:47:00 np0005466012 cloud-init[929]: Generating public/private rsa key pair.
Oct  2 06:47:00 np0005466012 cloud-init[929]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct  2 06:47:00 np0005466012 cloud-init[929]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct  2 06:47:00 np0005466012 cloud-init[929]: The key fingerprint is:
Oct  2 06:47:00 np0005466012 cloud-init[929]: SHA256:xmR5Qlz2Q1cQhMit//8PX6uVNaMgVkBG09GdpmkiVkY root@np0005466012.novalocal
Oct  2 06:47:00 np0005466012 cloud-init[929]: The key's randomart image is:
Oct  2 06:47:00 np0005466012 cloud-init[929]: +---[RSA 3072]----+
Oct  2 06:47:00 np0005466012 cloud-init[929]: |       .+O=E+o*+o|
Oct  2 06:47:00 np0005466012 cloud-init[929]: |       .oo=*oo + |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |        = +oo +  |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |       + =o. =   |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |        So.oo  o.|
Oct  2 06:47:00 np0005466012 cloud-init[929]: |       .. . o . =|
Oct  2 06:47:00 np0005466012 cloud-init[929]: |             o.o.|
Oct  2 06:47:00 np0005466012 cloud-init[929]: |              ooo|
Oct  2 06:47:00 np0005466012 cloud-init[929]: |             ..oB|
Oct  2 06:47:00 np0005466012 cloud-init[929]: +----[SHA256]-----+
Oct  2 06:47:00 np0005466012 cloud-init[929]: Generating public/private ecdsa key pair.
Oct  2 06:47:00 np0005466012 cloud-init[929]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct  2 06:47:00 np0005466012 cloud-init[929]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct  2 06:47:00 np0005466012 cloud-init[929]: The key fingerprint is:
Oct  2 06:47:00 np0005466012 cloud-init[929]: SHA256:gKk8qOXb5Ny8KLGFhQY27lwyoMA5Jp1fmChSDeIsOZM root@np0005466012.novalocal
Oct  2 06:47:00 np0005466012 cloud-init[929]: The key's randomart image is:
Oct  2 06:47:00 np0005466012 cloud-init[929]: +---[ECDSA 256]---+
Oct  2 06:47:00 np0005466012 cloud-init[929]: |oo+= o           |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |X@+ +o.          |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |E=ooo..          |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |oOooo  .         |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |+.Oo    S        |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |.=o..            |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |. .+.            |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |  o* +           |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |  ..= +.         |
Oct  2 06:47:00 np0005466012 cloud-init[929]: +----[SHA256]-----+
Oct  2 06:47:00 np0005466012 cloud-init[929]: Generating public/private ed25519 key pair.
Oct  2 06:47:00 np0005466012 cloud-init[929]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct  2 06:47:00 np0005466012 cloud-init[929]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct  2 06:47:00 np0005466012 cloud-init[929]: The key fingerprint is:
Oct  2 06:47:00 np0005466012 cloud-init[929]: SHA256:t/oT7/KygfM3yk4iYCRzNWT//avOKwiFM7pP3o8vz0Y root@np0005466012.novalocal
Oct  2 06:47:00 np0005466012 cloud-init[929]: The key's randomart image is:
Oct  2 06:47:00 np0005466012 cloud-init[929]: +--[ED25519 256]--+
Oct  2 06:47:00 np0005466012 cloud-init[929]: |     .=          |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |     o o         |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |  o o  ..        |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |   =  + .. .     |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |    o. +S o .    |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |   ....  oE. .   |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |     .o.+o+o  .  |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |    .o ooX*o+  . |
Oct  2 06:47:00 np0005466012 cloud-init[929]: |     .o +O%%B+.  |
Oct  2 06:47:00 np0005466012 cloud-init[929]: +----[SHA256]-----+
Oct  2 06:47:00 np0005466012 systemd[1]: Finished Cloud-init: Network Stage.
Oct  2 06:47:00 np0005466012 systemd[1]: Reached target Cloud-config availability.
Oct  2 06:47:00 np0005466012 systemd[1]: Reached target Network is Online.
Oct  2 06:47:00 np0005466012 systemd[1]: Starting Cloud-init: Config Stage...
Oct  2 06:47:00 np0005466012 systemd[1]: Starting Notify NFS peers of a restart...
Oct  2 06:47:00 np0005466012 systemd[1]: Starting System Logging Service...
Oct  2 06:47:00 np0005466012 systemd[1]: Starting OpenSSH server daemon...
Oct  2 06:47:00 np0005466012 sm-notify[1010]: Version 2.5.4 starting
Oct  2 06:47:00 np0005466012 systemd[1]: Starting Permit User Sessions...
Oct  2 06:47:00 np0005466012 systemd[1]: Started Notify NFS peers of a restart.
Oct  2 06:47:00 np0005466012 systemd[1]: Started OpenSSH server daemon.
Oct  2 06:47:00 np0005466012 systemd[1]: Finished Permit User Sessions.
Oct  2 06:47:00 np0005466012 systemd[1]: Started Command Scheduler.
Oct  2 06:47:00 np0005466012 systemd[1]: Started Getty on tty1.
Oct  2 06:47:00 np0005466012 systemd[1]: Started Serial Getty on ttyS0.
Oct  2 06:47:00 np0005466012 systemd[1]: Reached target Login Prompts.
Oct  2 06:47:01 np0005466012 rsyslogd[1011]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1011" x-info="https://www.rsyslog.com"] start
Oct  2 06:47:01 np0005466012 rsyslogd[1011]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Oct  2 06:47:01 np0005466012 systemd[1]: Started System Logging Service.
Oct  2 06:47:01 np0005466012 systemd[1]: Reached target Multi-User System.
Oct  2 06:47:01 np0005466012 systemd[1]: Starting Record Runlevel Change in UTMP...
Oct  2 06:47:01 np0005466012 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct  2 06:47:01 np0005466012 systemd[1]: Finished Record Runlevel Change in UTMP.
Oct  2 06:47:01 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 06:47:01 np0005466012 cloud-init[1023]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Thu, 02 Oct 2025 10:47:01 +0000. Up 14.86 seconds.
Oct  2 06:47:01 np0005466012 systemd[1]: Finished Cloud-init: Config Stage.
Oct  2 06:47:01 np0005466012 systemd[1]: Starting Cloud-init: Final Stage...
Oct  2 06:47:01 np0005466012 cloud-init[1027]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Thu, 02 Oct 2025 10:47:01 +0000. Up 15.27 seconds.
Oct  2 06:47:01 np0005466012 cloud-init[1029]: #############################################################
Oct  2 06:47:01 np0005466012 cloud-init[1030]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct  2 06:47:01 np0005466012 cloud-init[1032]: 256 SHA256:gKk8qOXb5Ny8KLGFhQY27lwyoMA5Jp1fmChSDeIsOZM root@np0005466012.novalocal (ECDSA)
Oct  2 06:47:01 np0005466012 cloud-init[1034]: 256 SHA256:t/oT7/KygfM3yk4iYCRzNWT//avOKwiFM7pP3o8vz0Y root@np0005466012.novalocal (ED25519)
Oct  2 06:47:01 np0005466012 cloud-init[1036]: 3072 SHA256:xmR5Qlz2Q1cQhMit//8PX6uVNaMgVkBG09GdpmkiVkY root@np0005466012.novalocal (RSA)
Oct  2 06:47:01 np0005466012 cloud-init[1037]: -----END SSH HOST KEY FINGERPRINTS-----
Oct  2 06:47:01 np0005466012 cloud-init[1038]: #############################################################
Oct  2 06:47:01 np0005466012 cloud-init[1027]: Cloud-init v. 24.4-7.el9 finished at Thu, 02 Oct 2025 10:47:01 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 15.51 seconds
Oct  2 06:47:01 np0005466012 systemd[1]: Finished Cloud-init: Final Stage.
Oct  2 06:47:01 np0005466012 systemd[1]: Reached target Cloud-init target.
Oct  2 06:47:01 np0005466012 systemd[1]: Startup finished in 1.540s (kernel) + 3.484s (initrd) + 10.558s (userspace) = 15.583s.
Oct  2 06:47:04 np0005466012 chronyd[836]: Selected source 54.39.23.64 (2.centos.pool.ntp.org)
Oct  2 06:47:04 np0005466012 chronyd[836]: System clock TAI offset set to 37 seconds
Oct  2 06:47:06 np0005466012 irqbalance[823]: Cannot change IRQ 35 affinity: Operation not permitted
Oct  2 06:47:06 np0005466012 irqbalance[823]: IRQ 35 affinity is now unmanaged
Oct  2 06:47:06 np0005466012 irqbalance[823]: Cannot change IRQ 33 affinity: Operation not permitted
Oct  2 06:47:06 np0005466012 irqbalance[823]: IRQ 33 affinity is now unmanaged
Oct  2 06:47:06 np0005466012 irqbalance[823]: Cannot change IRQ 31 affinity: Operation not permitted
Oct  2 06:47:06 np0005466012 irqbalance[823]: IRQ 31 affinity is now unmanaged
Oct  2 06:47:06 np0005466012 irqbalance[823]: Cannot change IRQ 28 affinity: Operation not permitted
Oct  2 06:47:06 np0005466012 irqbalance[823]: IRQ 28 affinity is now unmanaged
Oct  2 06:47:06 np0005466012 irqbalance[823]: Cannot change IRQ 34 affinity: Operation not permitted
Oct  2 06:47:06 np0005466012 irqbalance[823]: IRQ 34 affinity is now unmanaged
Oct  2 06:47:06 np0005466012 irqbalance[823]: Cannot change IRQ 32 affinity: Operation not permitted
Oct  2 06:47:06 np0005466012 irqbalance[823]: IRQ 32 affinity is now unmanaged
Oct  2 06:47:06 np0005466012 irqbalance[823]: Cannot change IRQ 30 affinity: Operation not permitted
Oct  2 06:47:06 np0005466012 irqbalance[823]: IRQ 30 affinity is now unmanaged
Oct  2 06:47:06 np0005466012 irqbalance[823]: Cannot change IRQ 29 affinity: Operation not permitted
Oct  2 06:47:06 np0005466012 irqbalance[823]: IRQ 29 affinity is now unmanaged
Oct  2 06:47:08 np0005466012 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 06:47:28 np0005466012 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 06:51:24 np0005466012 chronyd[836]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Oct  2 07:01:41 np0005466012 systemd[1]: Created slice User Slice of UID 1000.
Oct  2 07:01:41 np0005466012 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct  2 07:01:41 np0005466012 systemd-logind[827]: New session 1 of user zuul.
Oct  2 07:01:41 np0005466012 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct  2 07:01:41 np0005466012 systemd[1]: Starting User Manager for UID 1000...
Oct  2 07:01:41 np0005466012 systemd[1095]: Queued start job for default target Main User Target.
Oct  2 07:01:41 np0005466012 systemd[1095]: Created slice User Application Slice.
Oct  2 07:01:41 np0005466012 systemd[1095]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 07:01:41 np0005466012 systemd[1095]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 07:01:41 np0005466012 systemd[1095]: Reached target Paths.
Oct  2 07:01:41 np0005466012 systemd[1095]: Reached target Timers.
Oct  2 07:01:41 np0005466012 systemd[1095]: Starting D-Bus User Message Bus Socket...
Oct  2 07:01:41 np0005466012 systemd[1095]: Starting Create User's Volatile Files and Directories...
Oct  2 07:01:41 np0005466012 systemd[1095]: Listening on D-Bus User Message Bus Socket.
Oct  2 07:01:41 np0005466012 systemd[1095]: Reached target Sockets.
Oct  2 07:01:41 np0005466012 systemd[1095]: Finished Create User's Volatile Files and Directories.
Oct  2 07:01:41 np0005466012 systemd[1095]: Reached target Basic System.
Oct  2 07:01:41 np0005466012 systemd[1095]: Reached target Main User Target.
Oct  2 07:01:41 np0005466012 systemd[1095]: Startup finished in 197ms.
Oct  2 07:01:41 np0005466012 systemd[1]: Started User Manager for UID 1000.
Oct  2 07:01:42 np0005466012 systemd[1]: Started Session 1 of User zuul.
Oct  2 07:01:42 np0005466012 python3[1179]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:01:46 np0005466012 python3[1207]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:01:46 np0005466012 systemd[1]: Starting Cleanup of Temporary Directories...
Oct  2 07:01:46 np0005466012 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct  2 07:01:46 np0005466012 systemd[1]: Finished Cleanup of Temporary Directories.
Oct  2 07:01:46 np0005466012 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct  2 07:01:54 np0005466012 python3[1267]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:01:55 np0005466012 python3[1307]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct  2 07:01:57 np0005466012 python3[1333]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDQCZ3vFv33zuU9QR4Erz5ZRISFa/oPvha0xwrBdyzVa18ydYUaCm/1GZP9yUXeHFz7iqX2LFQYNsjkqZJz1Uu67Idku6xgJC7Fx6g9BMv0MT1Zlak1CqYHg2DEyLxPerFs9LKBlOaZV+zN8b4kdG8Ww5E2kG2A7Ui3Cuzht/VP01bi+s4UjtwKH6CZ6X56ylQhY7z0Z+hPDBDFz1Oy2SYkyvdrztTs4eWaoebh/cWCdWX0V2djhSx6cc/r+wVBz3Aibc6gZzEn+Gpq8ffdM/6w/oD9Iqy6ijpCtmVA92FGjAJvr33J1xKd5XxDh4pvKaqFm7hjEeL+KJ1Z1ABjWrwV0uQNNHxit/J8k2+UdRsH+ZYoO3rrg4X8rEHQr981ffbmUPm16g5UJE1TZx20ZMh8oTkA5hXg5ydzjiktL9jGvgn+fSI1iCi1fdR/jUZ3xfQN6Q23wnG7lApoHjP4JXM75nxGNc0elGo9oGrDGWSVEwTOqp4qQPIuFtq+hNm1uTU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:01:57 np0005466012 python3[1357]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:01:58 np0005466012 python3[1456]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:01:58 np0005466012 python3[1527]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759402918.0564215-252-23026100574154/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=e98c7d2ea8ba4729beeb0aae1d087b01_id_rsa follow=False checksum=84e221810e16da2c918261cb937e6458833c76e7 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:01:59 np0005466012 python3[1650]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:01:59 np0005466012 python3[1721]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759402919.0526588-307-14698884006487/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=e98c7d2ea8ba4729beeb0aae1d087b01_id_rsa.pub follow=False checksum=4dfd599c92ceccedc02682f946e69efec3324503 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:01 np0005466012 python3[1769]: ansible-ping Invoked with data=pong
Oct  2 07:02:02 np0005466012 python3[1793]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:02:04 np0005466012 python3[1851]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct  2 07:02:05 np0005466012 python3[1883]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:05 np0005466012 python3[1907]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:06 np0005466012 python3[1931]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:06 np0005466012 python3[1955]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:06 np0005466012 python3[1979]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:06 np0005466012 python3[2003]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:08 np0005466012 python3[2029]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:09 np0005466012 python3[2107]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:02:09 np0005466012 python3[2180]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759402929.075876-33-17346912711741/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:10 np0005466012 python3[2228]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:10 np0005466012 python3[2252]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:11 np0005466012 python3[2276]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:11 np0005466012 python3[2300]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:11 np0005466012 python3[2324]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:12 np0005466012 python3[2348]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:12 np0005466012 python3[2372]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:12 np0005466012 python3[2396]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:12 np0005466012 python3[2422]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:13 np0005466012 python3[2446]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:13 np0005466012 python3[2470]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:13 np0005466012 python3[2494]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:13 np0005466012 python3[2518]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:14 np0005466012 python3[2542]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:14 np0005466012 python3[2566]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:14 np0005466012 python3[2590]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:15 np0005466012 python3[2614]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:15 np0005466012 python3[2638]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:15 np0005466012 python3[2662]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:15 np0005466012 python3[2686]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:16 np0005466012 python3[2710]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:16 np0005466012 python3[2734]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:16 np0005466012 python3[2758]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:17 np0005466012 python3[2782]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:17 np0005466012 python3[2806]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:17 np0005466012 python3[2830]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:02:20 np0005466012 python3[2856]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  2 07:02:20 np0005466012 systemd[1]: Starting Time & Date Service...
Oct  2 07:02:20 np0005466012 systemd[1]: Started Time & Date Service.
Oct  2 07:02:20 np0005466012 systemd-timedated[2858]: Changed time zone to 'UTC' (UTC).
Oct  2 07:02:22 np0005466012 python3[2891]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:22 np0005466012 python3[2967]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:02:23 np0005466012 python3[3038]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759402942.4999545-252-211925508154275/source _original_basename=tmpzueq_qv_ follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:23 np0005466012 python3[3138]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:02:23 np0005466012 python3[3209]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759402943.3926494-302-188173713497170/source _original_basename=tmp8ngtgl54 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:24 np0005466012 python3[3311]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:02:25 np0005466012 python3[3384]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759402944.5851865-382-183305868029055/source _original_basename=tmpg1vwgqfu follow=False checksum=342f501e01c1098669fc1f1874ec75e7ad7dd27a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:25 np0005466012 python3[3432]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:02:26 np0005466012 python3[3458]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:02:26 np0005466012 python3[3538]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:02:27 np0005466012 python3[3611]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759402946.3136725-452-203445698372036/source _original_basename=tmprv9oi2ld follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:27 np0005466012 python3[3662]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-893c-5f89-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:02:28 np0005466012 python3[3690]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-893c-5f89-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct  2 07:02:29 np0005466012 python3[3719]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:47 np0005466012 python3[3745]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:02:50 np0005466012 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  2 07:03:47 np0005466012 systemd-logind[827]: Session 1 logged out. Waiting for processes to exit.
Oct  2 07:03:52 np0005466012 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  2 07:03:52 np0005466012 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Oct  2 07:03:52 np0005466012 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Oct  2 07:03:52 np0005466012 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Oct  2 07:03:52 np0005466012 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Oct  2 07:03:52 np0005466012 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Oct  2 07:03:52 np0005466012 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Oct  2 07:03:52 np0005466012 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Oct  2 07:03:52 np0005466012 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Oct  2 07:03:52 np0005466012 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct  2 07:03:52 np0005466012 NetworkManager[865]: <info>  [1759403032.9221] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  2 07:03:52 np0005466012 systemd-udevd[3751]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:03:52 np0005466012 NetworkManager[865]: <info>  [1759403032.9436] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:03:52 np0005466012 NetworkManager[865]: <info>  [1759403032.9470] settings: (eth1): created default wired connection 'Wired connection 1'
Oct  2 07:03:52 np0005466012 NetworkManager[865]: <info>  [1759403032.9476] device (eth1): carrier: link connected
Oct  2 07:03:52 np0005466012 NetworkManager[865]: <info>  [1759403032.9478] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  2 07:03:52 np0005466012 NetworkManager[865]: <info>  [1759403032.9484] policy: auto-activating connection 'Wired connection 1' (04885c6e-a4ca-3823-855d-920bcf370929)
Oct  2 07:03:52 np0005466012 NetworkManager[865]: <info>  [1759403032.9488] device (eth1): Activation: starting connection 'Wired connection 1' (04885c6e-a4ca-3823-855d-920bcf370929)
Oct  2 07:03:52 np0005466012 NetworkManager[865]: <info>  [1759403032.9489] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:03:52 np0005466012 NetworkManager[865]: <info>  [1759403032.9492] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:03:52 np0005466012 NetworkManager[865]: <info>  [1759403032.9496] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:03:52 np0005466012 NetworkManager[865]: <info>  [1759403032.9500] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:03:52 np0005466012 systemd[1095]: Starting Mark boot as successful...
Oct  2 07:03:52 np0005466012 systemd[1095]: Finished Mark boot as successful.
Oct  2 07:03:53 np0005466012 systemd-logind[827]: New session 3 of user zuul.
Oct  2 07:03:53 np0005466012 systemd[1]: Started Session 3 of User zuul.
Oct  2 07:03:53 np0005466012 python3[3782]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-d220-2459-000000000189-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:04:00 np0005466012 python3[3862]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:04:01 np0005466012 python3[3935]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759403040.5951717-155-182477537066044/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=b469e741a4654bc6fb3209635858efec7c863186 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:04:01 np0005466012 python3[3985]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:04:01 np0005466012 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  2 07:04:01 np0005466012 systemd[1]: Stopped Network Manager Wait Online.
Oct  2 07:04:01 np0005466012 systemd[1]: Stopping Network Manager Wait Online...
Oct  2 07:04:01 np0005466012 systemd[1]: Stopping Network Manager...
Oct  2 07:04:01 np0005466012 NetworkManager[865]: <info>  [1759403041.8884] caught SIGTERM, shutting down normally.
Oct  2 07:04:01 np0005466012 NetworkManager[865]: <info>  [1759403041.8896] dhcp4 (eth0): canceled DHCP transaction
Oct  2 07:04:01 np0005466012 NetworkManager[865]: <info>  [1759403041.8896] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:04:01 np0005466012 NetworkManager[865]: <info>  [1759403041.8896] dhcp4 (eth0): state changed no lease
Oct  2 07:04:01 np0005466012 NetworkManager[865]: <info>  [1759403041.8898] manager: NetworkManager state is now CONNECTING
Oct  2 07:04:01 np0005466012 NetworkManager[865]: <info>  [1759403041.8998] dhcp4 (eth1): canceled DHCP transaction
Oct  2 07:04:01 np0005466012 NetworkManager[865]: <info>  [1759403041.8998] dhcp4 (eth1): state changed no lease
Oct  2 07:04:01 np0005466012 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:04:01 np0005466012 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:04:01 np0005466012 NetworkManager[865]: <info>  [1759403041.9566] exiting (success)
Oct  2 07:04:01 np0005466012 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  2 07:04:01 np0005466012 systemd[1]: Stopped Network Manager.
Oct  2 07:04:01 np0005466012 systemd[1]: NetworkManager.service: Consumed 5.656s CPU time, 10.1M memory peak.
Oct  2 07:04:01 np0005466012 systemd[1]: Starting Network Manager...
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.0119] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:4de76784-23e0-4a0b-a470-2209f5c6de9a)
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.0120] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.0176] manager[0x558ddc475070]: monitoring kernel firmware directory '/lib/firmware'.
Oct  2 07:04:02 np0005466012 systemd[1]: Starting Hostname Service...
Oct  2 07:04:02 np0005466012 systemd[1]: Started Hostname Service.
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.0962] hostname: hostname: using hostnamed
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.0963] hostname: static hostname changed from (none) to "np0005466012.novalocal"
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.0968] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.0973] manager[0x558ddc475070]: rfkill: Wi-Fi hardware radio set enabled
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.0973] manager[0x558ddc475070]: rfkill: WWAN hardware radio set enabled
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1000] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1000] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1001] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1001] manager: Networking is enabled by state file
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1004] settings: Loaded settings plugin: keyfile (internal)
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1008] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1030] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1041] dhcp: init: Using DHCP client 'internal'
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1043] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1046] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1051] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1057] device (lo): Activation: starting connection 'lo' (a2f56833-ce60-4e44-b803-7556fb2bfbcb)
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1062] device (eth0): carrier: link connected
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1065] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1069] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1070] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1075] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1080] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1084] device (eth1): carrier: link connected
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1087] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1091] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (04885c6e-a4ca-3823-855d-920bcf370929) (indicated)
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1092] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1096] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1100] device (eth1): Activation: starting connection 'Wired connection 1' (04885c6e-a4ca-3823-855d-920bcf370929)
Oct  2 07:04:02 np0005466012 systemd[1]: Started Network Manager.
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1105] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1109] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1110] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1112] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1116] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1121] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1129] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1134] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1143] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1149] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1156] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1163] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1167] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1181] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1188] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1194] device (lo): Activation: successful, device activated.
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1214] dhcp4 (eth0): state changed new lease, address=38.102.83.181
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1225] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  2 07:04:02 np0005466012 systemd[1]: Starting Network Manager Wait Online...
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.1866] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.2501] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.2503] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.2510] manager: NetworkManager state is now CONNECTED_SITE
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.2516] device (eth0): Activation: successful, device activated.
Oct  2 07:04:02 np0005466012 NetworkManager[4002]: <info>  [1759403042.2521] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  2 07:04:02 np0005466012 python3[4070]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-d220-2459-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:04:12 np0005466012 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:04:32 np0005466012 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3248] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  2 07:04:47 np0005466012 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:04:47 np0005466012 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3544] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3546] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3551] device (eth1): Activation: successful, device activated.
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3557] manager: startup complete
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3560] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <warn>  [1759403087.3564] device (eth1): Activation: failed for connection 'Wired connection 1'
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3569] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Oct  2 07:04:47 np0005466012 systemd[1]: Finished Network Manager Wait Online.
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3657] dhcp4 (eth1): canceled DHCP transaction
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3658] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3658] dhcp4 (eth1): state changed no lease
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3673] policy: auto-activating connection 'ci-private-network' (7858c0cd-f79e-526d-9711-eb64af0637f1)
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3678] device (eth1): Activation: starting connection 'ci-private-network' (7858c0cd-f79e-526d-9711-eb64af0637f1)
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3678] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3681] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3689] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3697] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3745] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3747] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:04:47 np0005466012 NetworkManager[4002]: <info>  [1759403087.3753] device (eth1): Activation: successful, device activated.
Oct  2 07:04:57 np0005466012 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:05:02 np0005466012 systemd[1]: session-3.scope: Deactivated successfully.
Oct  2 07:05:02 np0005466012 systemd[1]: session-3.scope: Consumed 1.632s CPU time.
Oct  2 07:05:02 np0005466012 systemd-logind[827]: Session 3 logged out. Waiting for processes to exit.
Oct  2 07:05:02 np0005466012 systemd-logind[827]: Removed session 3.
Oct  2 07:05:34 np0005466012 systemd-logind[827]: New session 4 of user zuul.
Oct  2 07:05:34 np0005466012 systemd[1]: Started Session 4 of User zuul.
Oct  2 07:05:34 np0005466012 python3[4180]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:05:34 np0005466012 python3[4253]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759403134.1994634-365-200981118861885/source _original_basename=tmpjujbn3n3 follow=False checksum=7ead7cbef44571b5903e56d225b6c0c65e6bdcb6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:05:38 np0005466012 systemd[1]: session-4.scope: Deactivated successfully.
Oct  2 07:05:38 np0005466012 systemd-logind[827]: Session 4 logged out. Waiting for processes to exit.
Oct  2 07:05:38 np0005466012 systemd-logind[827]: Removed session 4.
Oct  2 07:07:34 np0005466012 systemd[1095]: Created slice User Background Tasks Slice.
Oct  2 07:07:34 np0005466012 systemd[1095]: Starting Cleanup of User's Temporary Files and Directories...
Oct  2 07:07:34 np0005466012 systemd[1095]: Finished Cleanup of User's Temporary Files and Directories.
Oct  2 07:11:24 np0005466012 systemd-logind[827]: New session 5 of user zuul.
Oct  2 07:11:24 np0005466012 systemd[1]: Started Session 5 of User zuul.
Oct  2 07:11:24 np0005466012 python3[4315]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-67bc-3be9-000000000ca4-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:11:25 np0005466012 python3[4344]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:11:25 np0005466012 python3[4370]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:11:25 np0005466012 python3[4396]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:11:26 np0005466012 python3[4422]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:11:26 np0005466012 python3[4448]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:11:26 np0005466012 python3[4448]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct  2 07:11:27 np0005466012 python3[4474]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:11:27 np0005466012 systemd[1]: Reloading.
Oct  2 07:11:27 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:11:27 np0005466012 systemd[1]: Starting dnf makecache...
Oct  2 07:11:28 np0005466012 dnf[4505]: Failed determining last makecache time.
Oct  2 07:11:28 np0005466012 dnf[4505]: CentOS Stream 9 - BaseOS                         42 kB/s | 6.7 kB     00:00
Oct  2 07:11:28 np0005466012 dnf[4505]: CentOS Stream 9 - AppStream                      55 kB/s | 6.8 kB     00:00
Oct  2 07:11:29 np0005466012 python3[4536]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct  2 07:11:29 np0005466012 dnf[4505]: CentOS Stream 9 - CRB                            64 kB/s | 6.6 kB     00:00
Oct  2 07:11:29 np0005466012 python3[4564]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:11:30 np0005466012 dnf[4505]: CentOS Stream 9 - Extras packages                51 kB/s | 8.0 kB     00:00
Oct  2 07:11:30 np0005466012 python3[4593]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:11:30 np0005466012 dnf[4505]: Metadata cache created.
Oct  2 07:11:30 np0005466012 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct  2 07:11:30 np0005466012 systemd[1]: Finished dnf makecache.
Oct  2 07:11:30 np0005466012 python3[4621]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:11:30 np0005466012 python3[4650]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:11:31 np0005466012 python3[4677]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-67bc-3be9-000000000caa-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:11:31 np0005466012 python3[4707]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:11:34 np0005466012 systemd[1]: session-5.scope: Deactivated successfully.
Oct  2 07:11:34 np0005466012 systemd[1]: session-5.scope: Consumed 3.483s CPU time.
Oct  2 07:11:34 np0005466012 systemd-logind[827]: Session 5 logged out. Waiting for processes to exit.
Oct  2 07:11:34 np0005466012 systemd-logind[827]: Removed session 5.
Oct  2 07:11:36 np0005466012 systemd-logind[827]: New session 6 of user zuul.
Oct  2 07:11:36 np0005466012 systemd[1]: Started Session 6 of User zuul.
Oct  2 07:11:36 np0005466012 python3[4741]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct  2 07:12:01 np0005466012 kernel: SELinux:  Converting 366 SID table entries...
Oct  2 07:12:01 np0005466012 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:12:01 np0005466012 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:12:01 np0005466012 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:12:01 np0005466012 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:12:01 np0005466012 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:12:01 np0005466012 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:12:01 np0005466012 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:12:11 np0005466012 kernel: SELinux:  Converting 366 SID table entries...
Oct  2 07:12:11 np0005466012 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:12:11 np0005466012 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:12:11 np0005466012 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:12:11 np0005466012 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:12:11 np0005466012 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:12:11 np0005466012 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:12:11 np0005466012 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:12:22 np0005466012 kernel: SELinux:  Converting 366 SID table entries...
Oct  2 07:12:22 np0005466012 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:12:22 np0005466012 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:12:22 np0005466012 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:12:22 np0005466012 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:12:22 np0005466012 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:12:22 np0005466012 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:12:22 np0005466012 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:12:23 np0005466012 setsebool[4801]: The virt_use_nfs policy boolean was changed to 1 by root
Oct  2 07:12:23 np0005466012 setsebool[4801]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct  2 07:12:36 np0005466012 kernel: SELinux:  Converting 369 SID table entries...
Oct  2 07:12:36 np0005466012 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:12:36 np0005466012 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:12:36 np0005466012 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:12:36 np0005466012 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:12:36 np0005466012 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:12:36 np0005466012 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:12:36 np0005466012 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:12:55 np0005466012 dbus-broker-launch[818]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  2 07:12:55 np0005466012 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:12:55 np0005466012 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:12:55 np0005466012 systemd[1]: Reloading.
Oct  2 07:12:55 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:12:55 np0005466012 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:12:56 np0005466012 systemd[1]: Starting PackageKit Daemon...
Oct  2 07:12:56 np0005466012 systemd[1]: Starting Authorization Manager...
Oct  2 07:12:56 np0005466012 polkitd[6301]: Started polkitd version 0.117
Oct  2 07:12:56 np0005466012 systemd[1]: Started Authorization Manager.
Oct  2 07:12:56 np0005466012 systemd[1]: Started PackageKit Daemon.
Oct  2 07:13:14 np0005466012 python3[15585]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-ae2a-8cff-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:13:15 np0005466012 kernel: evm: overlay not supported
Oct  2 07:13:15 np0005466012 systemd[1095]: Starting D-Bus User Message Bus...
Oct  2 07:13:15 np0005466012 dbus-broker-launch[16087]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct  2 07:13:15 np0005466012 dbus-broker-launch[16087]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct  2 07:13:15 np0005466012 systemd[1095]: Started D-Bus User Message Bus.
Oct  2 07:13:15 np0005466012 dbus-broker-lau[16087]: Ready
Oct  2 07:13:15 np0005466012 systemd[1095]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  2 07:13:15 np0005466012 systemd[1095]: Created slice Slice /user.
Oct  2 07:13:15 np0005466012 systemd[1095]: podman-16015.scope: unit configures an IP firewall, but not running as root.
Oct  2 07:13:15 np0005466012 systemd[1095]: (This warning is only shown for the first unit using IP firewalling.)
Oct  2 07:13:15 np0005466012 systemd[1095]: Started podman-16015.scope.
Oct  2 07:13:15 np0005466012 systemd[1095]: Started podman-pause-9d21e03b.scope.
Oct  2 07:13:18 np0005466012 python3[17322]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.80:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.80:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:13:18 np0005466012 systemd[1]: session-6.scope: Deactivated successfully.
Oct  2 07:13:18 np0005466012 systemd[1]: session-6.scope: Consumed 1min 7.799s CPU time.
Oct  2 07:13:18 np0005466012 systemd-logind[827]: Session 6 logged out. Waiting for processes to exit.
Oct  2 07:13:18 np0005466012 systemd-logind[827]: Removed session 6.
Oct  2 07:13:47 np0005466012 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:13:47 np0005466012 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:13:47 np0005466012 systemd[1]: man-db-cache-update.service: Consumed 52.798s CPU time.
Oct  2 07:13:47 np0005466012 systemd[1]: run-r7c8bf46c2f864ed5a2655314e3d9ea63.service: Deactivated successfully.
Oct  2 07:13:50 np0005466012 systemd-logind[827]: New session 7 of user zuul.
Oct  2 07:13:50 np0005466012 systemd[1]: Started Session 7 of User zuul.
Oct  2 07:13:51 np0005466012 python3[26316]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5wXEu1JOQA5KJoTkupC8GEbQNIbg6S2Q6Mp50kFLAjQIUiHO0Vf9azsWL1hcnqZwbQOjTwG/mdjPHjLP6jQ28= zuul@np0005466010.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:13:51 np0005466012 python3[26342]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5wXEu1JOQA5KJoTkupC8GEbQNIbg6S2Q6Mp50kFLAjQIUiHO0Vf9azsWL1hcnqZwbQOjTwG/mdjPHjLP6jQ28= zuul@np0005466010.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:13:52 np0005466012 python3[26368]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005466012.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct  2 07:13:53 np0005466012 python3[26402]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5wXEu1JOQA5KJoTkupC8GEbQNIbg6S2Q6Mp50kFLAjQIUiHO0Vf9azsWL1hcnqZwbQOjTwG/mdjPHjLP6jQ28= zuul@np0005466010.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  2 07:13:53 np0005466012 python3[26480]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:13:54 np0005466012 python3[26553]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759403633.549719-168-114928684193119/source _original_basename=tmp9dgzlaih follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:13:55 np0005466012 python3[26603]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Oct  2 07:13:55 np0005466012 systemd[1]: Starting Hostname Service...
Oct  2 07:13:55 np0005466012 systemd[1]: Started Hostname Service.
Oct  2 07:13:55 np0005466012 systemd-hostnamed[26607]: Changed pretty hostname to 'compute-1'
Oct  2 07:13:55 np0005466012 systemd-hostnamed[26607]: Hostname set to <compute-1> (static)
Oct  2 07:13:55 np0005466012 NetworkManager[4002]: <info>  [1759403635.3598] hostname: static hostname changed from "np0005466012.novalocal" to "compute-1"
Oct  2 07:13:55 np0005466012 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:13:55 np0005466012 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:13:56 np0005466012 systemd[1]: session-7.scope: Deactivated successfully.
Oct  2 07:13:56 np0005466012 systemd[1]: session-7.scope: Consumed 2.205s CPU time.
Oct  2 07:13:56 np0005466012 systemd-logind[827]: Session 7 logged out. Waiting for processes to exit.
Oct  2 07:13:56 np0005466012 systemd-logind[827]: Removed session 7.
Oct  2 07:14:05 np0005466012 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:14:25 np0005466012 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 07:18:02 np0005466012 systemd[1]: packagekit.service: Deactivated successfully.
Oct  2 07:18:19 np0005466012 systemd-logind[827]: New session 8 of user zuul.
Oct  2 07:18:19 np0005466012 systemd[1]: Started Session 8 of User zuul.
Oct  2 07:18:20 np0005466012 python3[26710]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:18:21 np0005466012 python3[26826]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:22 np0005466012 python3[26899]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6365113-30694-41441532958915/source mode=0755 _original_basename=delorean.repo follow=False checksum=bb4c2ff9dad546f135d54d9729ea11b84117755d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:22 np0005466012 python3[26925]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:23 np0005466012 python3[26998]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6365113-30694-41441532958915/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:23 np0005466012 python3[27024]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:23 np0005466012 python3[27097]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6365113-30694-41441532958915/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:23 np0005466012 python3[27123]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:24 np0005466012 python3[27196]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6365113-30694-41441532958915/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:24 np0005466012 python3[27222]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:24 np0005466012 python3[27295]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6365113-30694-41441532958915/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:25 np0005466012 python3[27321]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:25 np0005466012 python3[27394]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6365113-30694-41441532958915/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:25 np0005466012 python3[27420]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  2 07:18:25 np0005466012 python3[27493]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759403901.6365113-30694-41441532958915/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=d911291791b114a72daf18f370e91cb1ae300933 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:18:35 np0005466012 python3[27541]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:23:34 np0005466012 systemd[1]: session-8.scope: Deactivated successfully.
Oct  2 07:23:34 np0005466012 systemd[1]: session-8.scope: Consumed 4.642s CPU time.
Oct  2 07:23:34 np0005466012 systemd-logind[827]: Session 8 logged out. Waiting for processes to exit.
Oct  2 07:23:34 np0005466012 systemd-logind[827]: Removed session 8.
Oct  2 07:32:51 np0005466012 systemd-logind[827]: New session 9 of user zuul.
Oct  2 07:32:51 np0005466012 systemd[1]: Started Session 9 of User zuul.
Oct  2 07:32:53 np0005466012 python3.9[27716]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:32:54 np0005466012 python3.9[27897]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:33:03 np0005466012 systemd[1]: session-9.scope: Deactivated successfully.
Oct  2 07:33:03 np0005466012 systemd[1]: session-9.scope: Consumed 7.699s CPU time.
Oct  2 07:33:03 np0005466012 systemd-logind[827]: Session 9 logged out. Waiting for processes to exit.
Oct  2 07:33:03 np0005466012 systemd-logind[827]: Removed session 9.
Oct  2 07:33:19 np0005466012 systemd-logind[827]: New session 10 of user zuul.
Oct  2 07:33:19 np0005466012 systemd[1]: Started Session 10 of User zuul.
Oct  2 07:33:20 np0005466012 python3.9[28108]: ansible-ansible.legacy.ping Invoked with data=pong
Oct  2 07:33:21 np0005466012 python3.9[28282]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:33:22 np0005466012 python3.9[28434]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:33:23 np0005466012 python3.9[28587]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:33:24 np0005466012 python3.9[28739]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:33:25 np0005466012 python3.9[28891]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:33:25 np0005466012 python3.9[29014]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759404804.783065-183-95333549476118/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:33:26 np0005466012 python3.9[29166]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:33:27 np0005466012 python3.9[29322]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:33:28 np0005466012 python3.9[29472]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:33:33 np0005466012 python3.9[29727]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:33:34 np0005466012 python3.9[29877]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:33:35 np0005466012 python3.9[30031]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:33:36 np0005466012 python3.9[30189]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:33:37 np0005466012 python3.9[30273]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:34:26 np0005466012 systemd[1]: Reloading.
Oct  2 07:34:26 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:26 np0005466012 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct  2 07:34:27 np0005466012 systemd[1]: Reloading.
Oct  2 07:34:27 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:27 np0005466012 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct  2 07:34:27 np0005466012 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct  2 07:34:27 np0005466012 systemd[1]: Reloading.
Oct  2 07:34:27 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:34:27 np0005466012 systemd[1]: Listening on LVM2 poll daemon socket.
Oct  2 07:34:28 np0005466012 dbus-broker-launch[817]: Noticed file-system modification, trigger reload.
Oct  2 07:34:28 np0005466012 dbus-broker-launch[817]: Noticed file-system modification, trigger reload.
Oct  2 07:34:28 np0005466012 dbus-broker-launch[817]: Noticed file-system modification, trigger reload.
Oct  2 07:35:32 np0005466012 kernel: SELinux:  Converting 2714 SID table entries...
Oct  2 07:35:32 np0005466012 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:35:32 np0005466012 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:35:32 np0005466012 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:35:32 np0005466012 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:35:32 np0005466012 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:35:32 np0005466012 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:35:32 np0005466012 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:35:32 np0005466012 dbus-broker-launch[818]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct  2 07:35:32 np0005466012 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:35:32 np0005466012 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:35:32 np0005466012 systemd[1]: Reloading.
Oct  2 07:35:32 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:35:32 np0005466012 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:35:33 np0005466012 systemd[1]: Starting PackageKit Daemon...
Oct  2 07:35:33 np0005466012 systemd[1]: Started PackageKit Daemon.
Oct  2 07:35:33 np0005466012 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:35:33 np0005466012 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:35:33 np0005466012 systemd[1]: man-db-cache-update.service: Consumed 1.115s CPU time.
Oct  2 07:35:33 np0005466012 systemd[1]: run-r01de384875714799b75067c3f53c9be9.service: Deactivated successfully.
Oct  2 07:35:40 np0005466012 python3.9[31800]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:35:43 np0005466012 python3.9[32081]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct  2 07:35:44 np0005466012 python3.9[32233]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct  2 07:35:47 np0005466012 python3.9[32386]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:35:55 np0005466012 python3.9[32538]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct  2 07:35:56 np0005466012 python3.9[32690]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:35:57 np0005466012 python3.9[32842]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:35:58 np0005466012 python3.9[32965]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759404957.270561-646-5094862796966/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:36:00 np0005466012 python3.9[33117]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct  2 07:36:00 np0005466012 python3.9[33270]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:36:00 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:36:02 np0005466012 python3.9[33429]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:36:03 np0005466012 python3.9[33589]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct  2 07:36:04 np0005466012 python3.9[33742]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:36:05 np0005466012 python3.9[33900]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct  2 07:36:06 np0005466012 python3.9[34052]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:36:08 np0005466012 python3.9[34205]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:09 np0005466012 python3.9[34357]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:36:09 np0005466012 python3.9[34480]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759404968.6210725-930-66981305206963/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:10 np0005466012 python3.9[34632]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:36:11 np0005466012 systemd[1]: Starting Load Kernel Modules...
Oct  2 07:36:11 np0005466012 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct  2 07:36:11 np0005466012 kernel: Bridge firewalling registered
Oct  2 07:36:11 np0005466012 systemd-modules-load[34636]: Inserted module 'br_netfilter'
Oct  2 07:36:11 np0005466012 systemd[1]: Finished Load Kernel Modules.
Oct  2 07:36:11 np0005466012 python3.9[34793]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:36:12 np0005466012 python3.9[34916]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759404971.2428744-999-143800429796002/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:13 np0005466012 python3.9[35068]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:36:16 np0005466012 dbus-broker-launch[817]: Noticed file-system modification, trigger reload.
Oct  2 07:36:16 np0005466012 dbus-broker-launch[817]: Noticed file-system modification, trigger reload.
Oct  2 07:36:16 np0005466012 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:36:16 np0005466012 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:36:16 np0005466012 systemd[1]: Reloading.
Oct  2 07:36:16 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:36:16 np0005466012 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:36:19 np0005466012 python3.9[37389]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:36:19 np0005466012 python3.9[38405]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct  2 07:36:20 np0005466012 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:36:20 np0005466012 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:36:20 np0005466012 systemd[1]: man-db-cache-update.service: Consumed 4.782s CPU time.
Oct  2 07:36:20 np0005466012 systemd[1]: run-rd94b8cb5b40341028677e66a941d421e.service: Deactivated successfully.
Oct  2 07:36:20 np0005466012 python3.9[39080]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:36:21 np0005466012 python3.9[39233]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:21 np0005466012 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  2 07:36:21 np0005466012 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  2 07:36:22 np0005466012 python3.9[39606]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:36:22 np0005466012 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct  2 07:36:22 np0005466012 systemd[1]: tuned.service: Deactivated successfully.
Oct  2 07:36:22 np0005466012 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct  2 07:36:22 np0005466012 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  2 07:36:23 np0005466012 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  2 07:36:23 np0005466012 python3.9[39768]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct  2 07:36:27 np0005466012 python3.9[39920]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:36:27 np0005466012 systemd[1]: Reloading.
Oct  2 07:36:27 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:36:28 np0005466012 python3.9[40109]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:36:28 np0005466012 systemd[1]: Reloading.
Oct  2 07:36:28 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:36:29 np0005466012 python3.9[40298]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:29 np0005466012 python3.9[40451]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:29 np0005466012 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct  2 07:36:30 np0005466012 python3.9[40604]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:33 np0005466012 python3.9[40766]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:34 np0005466012 python3.9[40919]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:36:34 np0005466012 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  2 07:36:34 np0005466012 systemd[1]: Stopped Apply Kernel Variables.
Oct  2 07:36:34 np0005466012 systemd[1]: Stopping Apply Kernel Variables...
Oct  2 07:36:34 np0005466012 systemd[1]: Starting Apply Kernel Variables...
Oct  2 07:36:34 np0005466012 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  2 07:36:34 np0005466012 systemd[1]: Finished Apply Kernel Variables.
Oct  2 07:36:34 np0005466012 systemd[1]: session-10.scope: Deactivated successfully.
Oct  2 07:36:34 np0005466012 systemd[1]: session-10.scope: Consumed 2min 9.344s CPU time.
Oct  2 07:36:34 np0005466012 systemd-logind[827]: Session 10 logged out. Waiting for processes to exit.
Oct  2 07:36:34 np0005466012 systemd-logind[827]: Removed session 10.
Oct  2 07:36:40 np0005466012 systemd-logind[827]: New session 11 of user zuul.
Oct  2 07:36:40 np0005466012 systemd[1]: Started Session 11 of User zuul.
Oct  2 07:36:41 np0005466012 python3.9[41103]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:36:42 np0005466012 python3.9[41257]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:36:44 np0005466012 python3.9[41413]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:45 np0005466012 python3.9[41564]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:36:46 np0005466012 python3.9[41720]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:36:47 np0005466012 python3.9[41804]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:36:48 np0005466012 python3.9[41957]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:36:50 np0005466012 python3.9[42128]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:36:51 np0005466012 python3.9[42280]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:36:51 np0005466012 systemd[1]: var-lib-containers-storage-overlay-compat4179719418-merged.mount: Deactivated successfully.
Oct  2 07:36:51 np0005466012 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1291872952-merged.mount: Deactivated successfully.
Oct  2 07:36:51 np0005466012 podman[42281]: 2025-10-02 11:36:51.078529415 +0000 UTC m=+0.045521386 system refresh
Oct  2 07:36:52 np0005466012 python3.9[42443]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:36:52 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:36:52 np0005466012 python3.9[42566]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405011.3670309-293-43596927571135/.source.json follow=False _original_basename=podman_network_config.j2 checksum=4274be2eb40a6364296acf120acb1cbf6bdf7919 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:36:53 np0005466012 python3.9[42718]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:36:54 np0005466012 python3.9[42841]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405012.9164681-338-59019838057112/.source.conf follow=False _original_basename=registries.conf.j2 checksum=a4fd3ca7d18166099562a65af8d6da655db34efc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:54 np0005466012 python3.9[42993]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:55 np0005466012 python3.9[43145]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:56 np0005466012 python3.9[43297]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:56 np0005466012 python3.9[43449]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:36:57 np0005466012 python3.9[43599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:36:58 np0005466012 python3.9[43753]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:00 np0005466012 python3.9[43906]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:03 np0005466012 python3.9[44066]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:05 np0005466012 python3.9[44219]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:07 np0005466012 python3.9[44372]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:10 np0005466012 python3.9[44528]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:14 np0005466012 python3.9[44696]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:16 np0005466012 python3.9[44849]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:37:29 np0005466012 python3.9[45186]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:37:30 np0005466012 python3.9[45361]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:37:30 np0005466012 python3.9[45484]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759405049.7929006-755-259254558516322/.source.json _original_basename=.7dhlsbrq follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:37:31 np0005466012 python3.9[45636]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  2 07:37:31 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:34 np0005466012 systemd[1]: var-lib-containers-storage-overlay-compat4167764342-lower\x2dmapped.mount: Deactivated successfully.
Oct  2 07:37:38 np0005466012 podman[45648]: 2025-10-02 11:37:38.768146058 +0000 UTC m=+6.879802905 image pull 1b3fd7f2436e5c6f2e28c01b83721476c7b295789c77b3d63e30f49404389ea1 quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  2 07:37:38 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:38 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:38 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:39 np0005466012 python3.9[45944]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  2 07:37:39 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:41 np0005466012 podman[45956]: 2025-10-02 11:37:41.988357205 +0000 UTC m=+2.186180684 image pull ae232aa720979600656d94fc26ba957f1cdf5bca825fe9b57990f60c6534611f quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  2 07:37:41 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:42 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:42 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:43 np0005466012 python3.9[46211]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  2 07:37:43 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:44 np0005466012 podman[46222]: 2025-10-02 11:37:44.182549238 +0000 UTC m=+1.043536306 image pull d8d739f82a6fecf9df690e49539b589e74665b54e36448657b874630717d5bd1 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  2 07:37:44 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:44 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:44 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:37:45 np0005466012 python3.9[46458]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  2 07:38:01 np0005466012 podman[46468]: 2025-10-02 11:38:01.722342798 +0000 UTC m=+16.582637616 image pull e36f31143f26011980def9337d375f895bea59b742a3a2b372b996aa8ad58eba quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  2 07:38:01 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:01 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:01 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:02 np0005466012 python3.9[46776]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  2 07:38:02 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:05 np0005466012 podman[46789]: 2025-10-02 11:38:05.369077854 +0000 UTC m=+2.572950467 image pull 5f0622bc7c13827171d93b3baf72157e23d24d44579ad79fe3a89ad88180a4bb quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Oct  2 07:38:05 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:05 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:05 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:06 np0005466012 python3.9[47042]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  2 07:38:07 np0005466012 podman[47055]: 2025-10-02 11:38:07.417268413 +0000 UTC m=+1.236147756 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct  2 07:38:07 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:07 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:07 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:38:09 np0005466012 systemd[1]: session-11.scope: Deactivated successfully.
Oct  2 07:38:09 np0005466012 systemd[1]: session-11.scope: Consumed 1min 28.873s CPU time.
Oct  2 07:38:09 np0005466012 systemd-logind[827]: Session 11 logged out. Waiting for processes to exit.
Oct  2 07:38:09 np0005466012 systemd-logind[827]: Removed session 11.
Oct  2 07:38:14 np0005466012 systemd-logind[827]: New session 12 of user zuul.
Oct  2 07:38:14 np0005466012 systemd[1]: Started Session 12 of User zuul.
Oct  2 07:38:15 np0005466012 python3.9[47355]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:38:17 np0005466012 python3.9[47511]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct  2 07:38:18 np0005466012 python3.9[47664]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:38:19 np0005466012 python3.9[47822]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:38:20 np0005466012 python3.9[47982]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:38:21 np0005466012 python3.9[48066]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:38:24 np0005466012 python3.9[48227]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:38:37 np0005466012 kernel: SELinux:  Converting 2725 SID table entries...
Oct  2 07:38:37 np0005466012 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:38:37 np0005466012 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:38:37 np0005466012 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:38:37 np0005466012 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:38:37 np0005466012 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:38:37 np0005466012 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:38:37 np0005466012 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:38:37 np0005466012 dbus-broker-launch[818]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct  2 07:38:37 np0005466012 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct  2 07:38:38 np0005466012 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:38:38 np0005466012 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:38:38 np0005466012 systemd[1]: Reloading.
Oct  2 07:38:38 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:38:38 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:38:38 np0005466012 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:38:39 np0005466012 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:38:39 np0005466012 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:38:39 np0005466012 systemd[1]: run-r5677bac64eb2476f8b604fcddbd7f37e.service: Deactivated successfully.
Oct  2 07:38:42 np0005466012 python3.9[49330]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:38:42 np0005466012 systemd[1]: Reloading.
Oct  2 07:38:42 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:38:42 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:38:43 np0005466012 systemd[1]: Starting Open vSwitch Database Unit...
Oct  2 07:38:43 np0005466012 chown[49373]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct  2 07:38:43 np0005466012 ovs-ctl[49378]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct  2 07:38:43 np0005466012 ovs-ctl[49378]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct  2 07:38:43 np0005466012 ovs-ctl[49378]: Starting ovsdb-server [  OK  ]
Oct  2 07:38:43 np0005466012 ovs-vsctl[49427]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct  2 07:38:43 np0005466012 ovs-vsctl[49447]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct  2 07:38:43 np0005466012 ovs-ctl[49378]: Configuring Open vSwitch system IDs [  OK  ]
Oct  2 07:38:43 np0005466012 ovs-ctl[49378]: Enabling remote OVSDB managers [  OK  ]
Oct  2 07:38:43 np0005466012 ovs-vsctl[49453]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Oct  2 07:38:43 np0005466012 systemd[1]: Started Open vSwitch Database Unit.
Oct  2 07:38:43 np0005466012 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct  2 07:38:43 np0005466012 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct  2 07:38:43 np0005466012 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct  2 07:38:43 np0005466012 kernel: openvswitch: Open vSwitch switching datapath
Oct  2 07:38:43 np0005466012 ovs-ctl[49497]: Inserting openvswitch module [  OK  ]
Oct  2 07:38:43 np0005466012 ovs-ctl[49466]: Starting ovs-vswitchd [  OK  ]
Oct  2 07:38:43 np0005466012 ovs-ctl[49466]: Enabling remote OVSDB managers [  OK  ]
Oct  2 07:38:43 np0005466012 ovs-vsctl[49515]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Oct  2 07:38:43 np0005466012 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct  2 07:38:43 np0005466012 systemd[1]: Starting Open vSwitch...
Oct  2 07:38:43 np0005466012 systemd[1]: Finished Open vSwitch.
Oct  2 07:38:44 np0005466012 python3.9[49666]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:38:45 np0005466012 python3.9[49818]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct  2 07:38:47 np0005466012 kernel: SELinux:  Converting 2739 SID table entries...
Oct  2 07:38:47 np0005466012 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:38:47 np0005466012 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:38:47 np0005466012 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:38:47 np0005466012 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:38:47 np0005466012 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:38:47 np0005466012 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:38:47 np0005466012 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:38:48 np0005466012 python3.9[49974]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:38:48 np0005466012 dbus-broker-launch[818]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct  2 07:38:49 np0005466012 python3.9[50132]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:38:51 np0005466012 python3.9[50285]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:38:52 np0005466012 python3.9[50572]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  2 07:38:53 np0005466012 python3.9[50722]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:38:53 np0005466012 python3.9[50876]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:38:55 np0005466012 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:38:55 np0005466012 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:38:55 np0005466012 systemd[1]: Reloading.
Oct  2 07:38:55 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:38:55 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:38:55 np0005466012 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:38:56 np0005466012 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:38:56 np0005466012 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:38:56 np0005466012 systemd[1]: run-r765b64293aca464ca5defb7584515763.service: Deactivated successfully.
Oct  2 07:38:57 np0005466012 python3.9[51192]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:38:57 np0005466012 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  2 07:38:57 np0005466012 systemd[1]: Stopped Network Manager Wait Online.
Oct  2 07:38:57 np0005466012 systemd[1]: Stopping Network Manager Wait Online...
Oct  2 07:38:57 np0005466012 systemd[1]: Stopping Network Manager...
Oct  2 07:38:57 np0005466012 NetworkManager[4002]: <info>  [1759405137.7346] caught SIGTERM, shutting down normally.
Oct  2 07:38:57 np0005466012 NetworkManager[4002]: <info>  [1759405137.7362] dhcp4 (eth0): canceled DHCP transaction
Oct  2 07:38:57 np0005466012 NetworkManager[4002]: <info>  [1759405137.7362] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:38:57 np0005466012 NetworkManager[4002]: <info>  [1759405137.7362] dhcp4 (eth0): state changed no lease
Oct  2 07:38:57 np0005466012 NetworkManager[4002]: <info>  [1759405137.7364] manager: NetworkManager state is now CONNECTED_SITE
Oct  2 07:38:57 np0005466012 NetworkManager[4002]: <info>  [1759405137.7431] exiting (success)
Oct  2 07:38:57 np0005466012 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:38:57 np0005466012 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:38:57 np0005466012 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  2 07:38:57 np0005466012 systemd[1]: Stopped Network Manager.
Oct  2 07:38:57 np0005466012 systemd[1]: NetworkManager.service: Consumed 12.241s CPU time, 4.3M memory peak, read 0B from disk, written 23.0K to disk.
Oct  2 07:38:57 np0005466012 systemd[1]: Starting Network Manager...
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.8104] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:4de76784-23e0-4a0b-a470-2209f5c6de9a)
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.8107] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.8162] manager[0x55e7879a5090]: monitoring kernel firmware directory '/lib/firmware'.
Oct  2 07:38:57 np0005466012 systemd[1]: Starting Hostname Service...
Oct  2 07:38:57 np0005466012 systemd[1]: Started Hostname Service.
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9266] hostname: hostname: using hostnamed
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9268] hostname: static hostname changed from (none) to "compute-1"
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9273] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9277] manager[0x55e7879a5090]: rfkill: Wi-Fi hardware radio set enabled
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9277] manager[0x55e7879a5090]: rfkill: WWAN hardware radio set enabled
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9296] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9304] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9304] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9305] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9305] manager: Networking is enabled by state file
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9307] settings: Loaded settings plugin: keyfile (internal)
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9310] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9334] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9343] dhcp: init: Using DHCP client 'internal'
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9345] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9350] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9354] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9362] device (lo): Activation: starting connection 'lo' (a2f56833-ce60-4e44-b803-7556fb2bfbcb)
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9370] device (eth0): carrier: link connected
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9374] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9377] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9378] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9383] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9389] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9394] device (eth1): carrier: link connected
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9397] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9401] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (7858c0cd-f79e-526d-9711-eb64af0637f1) (indicated)
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9401] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9405] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9410] device (eth1): Activation: starting connection 'ci-private-network' (7858c0cd-f79e-526d-9711-eb64af0637f1)
Oct  2 07:38:57 np0005466012 systemd[1]: Started Network Manager.
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9416] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9422] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9423] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9424] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9426] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9429] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9431] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9433] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9438] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9444] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9446] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9463] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9474] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9479] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9481] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9487] device (lo): Activation: successful, device activated.
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9497] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9499] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9502] manager: NetworkManager state is now CONNECTED_LOCAL
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9507] device (eth1): Activation: successful, device activated.
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9515] dhcp4 (eth0): state changed new lease, address=38.102.83.181
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9520] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  2 07:38:57 np0005466012 systemd[1]: Starting Network Manager Wait Online...
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9590] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9619] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9620] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9623] manager: NetworkManager state is now CONNECTED_SITE
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9628] device (eth0): Activation: successful, device activated.
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9631] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  2 07:38:57 np0005466012 NetworkManager[51207]: <info>  [1759405137.9633] manager: startup complete
Oct  2 07:38:57 np0005466012 systemd[1]: Finished Network Manager Wait Online.
Oct  2 07:38:58 np0005466012 python3.9[51418]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:39:03 np0005466012 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:39:03 np0005466012 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:39:03 np0005466012 systemd[1]: Reloading.
Oct  2 07:39:03 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:39:03 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:39:03 np0005466012 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:39:04 np0005466012 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:39:04 np0005466012 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:39:04 np0005466012 systemd[1]: run-r0fd27da53f914cf5a1d584155b09f9e6.service: Deactivated successfully.
Oct  2 07:39:08 np0005466012 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:39:08 np0005466012 python3.9[51881]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:39:09 np0005466012 python3.9[52033]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:10 np0005466012 python3.9[52187]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:11 np0005466012 python3.9[52339]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:11 np0005466012 python3.9[52491]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:12 np0005466012 python3.9[52643]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:12 np0005466012 python3.9[52795]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:13 np0005466012 python3.9[52918]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405152.4737208-652-64545895962393/.source _original_basename=.yngws_em follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:14 np0005466012 python3.9[53070]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:14 np0005466012 python3.9[53222]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct  2 07:39:15 np0005466012 python3.9[53374]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:17 np0005466012 python3.9[53802]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct  2 07:39:18 np0005466012 ansible-async_wrapper.py[53977]: Invoked with j276392630293 300 /home/zuul/.ansible/tmp/ansible-tmp-1759405158.1273577-850-196284598642417/AnsiballZ_edpm_os_net_config.py _
Oct  2 07:39:18 np0005466012 ansible-async_wrapper.py[53980]: Starting module and watcher
Oct  2 07:39:18 np0005466012 ansible-async_wrapper.py[53980]: Start watching 53981 (300)
Oct  2 07:39:18 np0005466012 ansible-async_wrapper.py[53981]: Start module (53981)
Oct  2 07:39:18 np0005466012 ansible-async_wrapper.py[53977]: Return async_wrapper task started.
Oct  2 07:39:19 np0005466012 python3.9[53982]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Oct  2 07:39:19 np0005466012 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct  2 07:39:19 np0005466012 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct  2 07:39:19 np0005466012 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct  2 07:39:19 np0005466012 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct  2 07:39:19 np0005466012 kernel: cfg80211: failed to load regulatory.db
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.6550] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.6571] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7074] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7077] audit: op="connection-add" uuid="bc9d129d-5637-41eb-900e-4166bb258e1b" name="br-ex-br" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7091] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7092] audit: op="connection-add" uuid="b34dba39-599b-4b64-80cc-362c158ce2ca" name="br-ex-port" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7103] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7104] audit: op="connection-add" uuid="57525fb3-85c4-4200-8a04-ca19e7e349b8" name="eth1-port" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7114] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7116] audit: op="connection-add" uuid="2dd0ba82-345a-44f5-8a2e-e798b5f9cf13" name="vlan20-port" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7126] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7127] audit: op="connection-add" uuid="ec6b9f11-8d2d-4b4d-af13-d47084d7a5e1" name="vlan21-port" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7139] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7141] audit: op="connection-add" uuid="9e890580-ca90-4b8d-921d-9e0a5938cc3b" name="vlan22-port" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7159] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7173] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7175] audit: op="connection-add" uuid="0665a6b4-5364-46bb-ba19-7cfe7acf964a" name="br-ex-if" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7227] audit: op="connection-update" uuid="7858c0cd-f79e-526d-9711-eb64af0637f1" name="ci-private-network" args="connection.slave-type,connection.port-type,connection.controller,connection.master,connection.timestamp,ipv4.addresses,ipv4.routing-rules,ipv4.dns,ipv4.method,ipv4.routes,ipv4.never-default,ipv6.addresses,ipv6.routing-rules,ipv6.dns,ipv6.method,ipv6.routes,ipv6.addr-gen-mode,ovs-external-ids.data,ovs-interface.type" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7242] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7244] audit: op="connection-add" uuid="0ea64dbe-d6c2-4fd9-87ab-13d4087d132e" name="vlan20-if" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7258] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7260] audit: op="connection-add" uuid="e53f6719-7950-4d27-af32-c3098f27f7af" name="vlan21-if" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7274] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7275] audit: op="connection-add" uuid="e9814138-0c5a-4057-94d3-9e58d60db274" name="vlan22-if" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7288] audit: op="connection-delete" uuid="04885c6e-a4ca-3823-855d-920bcf370929" name="Wired connection 1" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7299] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7316] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7319] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (bc9d129d-5637-41eb-900e-4166bb258e1b)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7322] audit: op="connection-activate" uuid="bc9d129d-5637-41eb-900e-4166bb258e1b" name="br-ex-br" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7324] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7333] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7338] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (b34dba39-599b-4b64-80cc-362c158ce2ca)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7341] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7348] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7353] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (57525fb3-85c4-4200-8a04-ca19e7e349b8)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7355] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7364] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7369] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (2dd0ba82-345a-44f5-8a2e-e798b5f9cf13)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7372] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7381] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7387] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (ec6b9f11-8d2d-4b4d-af13-d47084d7a5e1)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7390] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7399] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7404] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (9e890580-ca90-4b8d-921d-9e0a5938cc3b)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7407] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7409] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7413] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7418] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7425] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7431] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (0665a6b4-5364-46bb-ba19-7cfe7acf964a)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7433] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7436] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7439] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7440] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7440] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7448] device (eth1): disconnecting for new activation request.
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7449] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7450] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7451] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7454] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7456] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7459] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7462] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (0ea64dbe-d6c2-4fd9-87ab-13d4087d132e)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7463] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7465] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7467] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7468] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7471] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7474] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7477] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (e53f6719-7950-4d27-af32-c3098f27f7af)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7478] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7481] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7482] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7484] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7487] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7491] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7495] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (e9814138-0c5a-4057-94d3-9e58d60db274)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7497] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7500] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7502] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7503] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7505] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7515] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.method,ipv6.addr-gen-mode" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7517] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7520] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7522] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7527] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7531] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7534] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7536] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7538] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7543] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 kernel: ovs-system: entered promiscuous mode
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7547] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7548] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7550] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7554] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7560] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7564] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7566] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7571] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7575] dhcp4 (eth0): canceled DHCP transaction
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7576] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  2 07:39:20 np0005466012 kernel: Timeout policy base is empty
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7576] dhcp4 (eth0): state changed no lease
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7578] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct  2 07:39:20 np0005466012 systemd-udevd[53988]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7590] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7593] audit: op="device-reapply" interface="eth1" ifindex=3 pid=53983 uid=0 result="fail" reason="Device is not activated"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7622] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7628] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7632] dhcp4 (eth0): state changed new lease, address=38.102.83.181
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7635] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct  2 07:39:20 np0005466012 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7682] device (eth1): disconnecting for new activation request.
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7683] audit: op="connection-activate" uuid="7858c0cd-f79e-526d-9711-eb64af0637f1" name="ci-private-network" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7731] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=53983 uid=0 result="success"
Oct  2 07:39:20 np0005466012 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7766] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7781] device (eth1): Activation: starting connection 'ci-private-network' (7858c0cd-f79e-526d-9711-eb64af0637f1)
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7785] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7790] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7795] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7801] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7805] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7812] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7819] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7820] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7821] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7823] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7824] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7826] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7830] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7833] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7837] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7840] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7844] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7848] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7851] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7854] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7858] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7862] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7865] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7907] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7909] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.7914] device (eth1): Activation: successful, device activated.
Oct  2 07:39:20 np0005466012 kernel: br-ex: entered promiscuous mode
Oct  2 07:39:20 np0005466012 kernel: vlan22: entered promiscuous mode
Oct  2 07:39:20 np0005466012 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct  2 07:39:20 np0005466012 systemd-udevd[53989]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:39:20 np0005466012 kernel: vlan20: entered promiscuous mode
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8148] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8162] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8181] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8190] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 kernel: vlan21: entered promiscuous mode
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8230] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8232] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8240] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8254] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8254] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8259] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8270] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8285] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8328] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8329] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8331] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8336] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8352] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8385] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8386] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  2 07:39:20 np0005466012 NetworkManager[51207]: <info>  [1759405160.8392] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  2 07:39:21 np0005466012 NetworkManager[51207]: <info>  [1759405161.9287] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=53983 uid=0 result="success"
Oct  2 07:39:22 np0005466012 NetworkManager[51207]: <info>  [1759405162.0501] checkpoint[0x55e78797a950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Oct  2 07:39:22 np0005466012 NetworkManager[51207]: <info>  [1759405162.0503] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=53983 uid=0 result="success"
Oct  2 07:39:22 np0005466012 NetworkManager[51207]: <info>  [1759405162.2838] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=53983 uid=0 result="success"
Oct  2 07:39:22 np0005466012 NetworkManager[51207]: <info>  [1759405162.2848] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=53983 uid=0 result="success"
Oct  2 07:39:22 np0005466012 NetworkManager[51207]: <info>  [1759405162.4588] audit: op="networking-control" arg="global-dns-configuration" pid=53983 uid=0 result="success"
Oct  2 07:39:22 np0005466012 NetworkManager[51207]: <info>  [1759405162.4634] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Oct  2 07:39:22 np0005466012 NetworkManager[51207]: <info>  [1759405162.4697] audit: op="networking-control" arg="global-dns-configuration" pid=53983 uid=0 result="success"
Oct  2 07:39:22 np0005466012 NetworkManager[51207]: <info>  [1759405162.4720] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=53983 uid=0 result="success"
Oct  2 07:39:22 np0005466012 NetworkManager[51207]: <info>  [1759405162.6013] checkpoint[0x55e78797aa20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Oct  2 07:39:22 np0005466012 NetworkManager[51207]: <info>  [1759405162.6016] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=53983 uid=0 result="success"
Oct  2 07:39:22 np0005466012 ansible-async_wrapper.py[53981]: Module complete (53981)
Oct  2 07:39:22 np0005466012 python3.9[54321]: ansible-ansible.legacy.async_status Invoked with jid=j276392630293.53977 mode=status _async_dir=/root/.ansible_async
Oct  2 07:39:23 np0005466012 python3.9[54420]: ansible-ansible.legacy.async_status Invoked with jid=j276392630293.53977 mode=cleanup _async_dir=/root/.ansible_async
Oct  2 07:39:23 np0005466012 ansible-async_wrapper.py[53980]: Done in kid B.
Oct  2 07:39:27 np0005466012 python3.9[54574]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:27 np0005466012 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 07:39:27 np0005466012 python3.9[54697]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405167.0250094-926-265944864839026/.source.returncode _original_basename=.8asbw1j2 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:28 np0005466012 python3.9[54851]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:29 np0005466012 python3.9[54974]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405168.2252223-974-246160308943968/.source.cfg _original_basename=.3d2p4a6i follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:29 np0005466012 python3.9[55127]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:39:29 np0005466012 systemd[1]: Reloading Network Manager...
Oct  2 07:39:29 np0005466012 NetworkManager[51207]: <info>  [1759405169.9964] audit: op="reload" arg="0" pid=55131 uid=0 result="success"
Oct  2 07:39:29 np0005466012 NetworkManager[51207]: <info>  [1759405169.9973] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Oct  2 07:39:30 np0005466012 systemd[1]: Reloaded Network Manager.
Oct  2 07:39:30 np0005466012 systemd[1]: session-12.scope: Deactivated successfully.
Oct  2 07:39:30 np0005466012 systemd[1]: session-12.scope: Consumed 45.852s CPU time.
Oct  2 07:39:30 np0005466012 systemd-logind[827]: Session 12 logged out. Waiting for processes to exit.
Oct  2 07:39:30 np0005466012 systemd-logind[827]: Removed session 12.
Oct  2 07:39:35 np0005466012 systemd-logind[827]: New session 13 of user zuul.
Oct  2 07:39:35 np0005466012 systemd[1]: Started Session 13 of User zuul.
Oct  2 07:39:36 np0005466012 python3.9[55315]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:39:37 np0005466012 python3.9[55469]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:39:38 np0005466012 python3.9[55658]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:39:39 np0005466012 systemd[1]: session-13.scope: Deactivated successfully.
Oct  2 07:39:39 np0005466012 systemd[1]: session-13.scope: Consumed 2.224s CPU time.
Oct  2 07:39:39 np0005466012 systemd-logind[827]: Session 13 logged out. Waiting for processes to exit.
Oct  2 07:39:39 np0005466012 systemd-logind[827]: Removed session 13.
Oct  2 07:39:40 np0005466012 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:39:44 np0005466012 systemd-logind[827]: New session 14 of user zuul.
Oct  2 07:39:44 np0005466012 systemd[1]: Started Session 14 of User zuul.
Oct  2 07:39:45 np0005466012 python3.9[55841]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:39:46 np0005466012 python3.9[55996]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:39:47 np0005466012 python3.9[56152]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:39:48 np0005466012 python3.9[56236]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:39:50 np0005466012 python3.9[56390]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:39:51 np0005466012 python3.9[56581]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:52 np0005466012 python3.9[56734]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:39:52 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:39:53 np0005466012 python3.9[56897]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:53 np0005466012 python3.9[56975]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:39:54 np0005466012 python3.9[57128]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:39:54 np0005466012 python3.9[57206]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:39:55 np0005466012 python3.9[57358]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:39:55 np0005466012 python3.9[57510]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:39:56 np0005466012 python3.9[57662]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:39:57 np0005466012 python3.9[57814]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:39:57 np0005466012 python3.9[57966]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:40:00 np0005466012 python3.9[58120]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:40:00 np0005466012 python3.9[58275]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:40:01 np0005466012 python3.9[58427]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:40:02 np0005466012 python3.9[58579]: ansible-service_facts Invoked
Oct  2 07:40:02 np0005466012 network[58596]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:40:02 np0005466012 network[58597]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:40:02 np0005466012 network[58598]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:40:08 np0005466012 python3.9[59052]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:40:10 np0005466012 python3.9[59205]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct  2 07:40:12 np0005466012 python3.9[59358]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:13 np0005466012 python3.9[59483]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405212.1007595-627-275907887308576/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:13 np0005466012 python3.9[59637]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:14 np0005466012 python3.9[59762]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405213.560964-671-247751767036867/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:16 np0005466012 python3.9[59916]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:17 np0005466012 python3.9[60070]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:40:19 np0005466012 python3.9[60154]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:40:20 np0005466012 python3.9[60310]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:40:21 np0005466012 python3.9[60394]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:40:21 np0005466012 chronyd[836]: chronyd exiting
Oct  2 07:40:21 np0005466012 systemd[1]: Stopping NTP client/server...
Oct  2 07:40:21 np0005466012 systemd[1]: chronyd.service: Deactivated successfully.
Oct  2 07:40:21 np0005466012 systemd[1]: Stopped NTP client/server.
Oct  2 07:40:21 np0005466012 systemd[1]: Starting NTP client/server...
Oct  2 07:40:21 np0005466012 chronyd[60403]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  2 07:40:21 np0005466012 chronyd[60403]: Frequency -27.192 +/- 0.222 ppm read from /var/lib/chrony/drift
Oct  2 07:40:21 np0005466012 chronyd[60403]: Loaded seccomp filter (level 2)
Oct  2 07:40:21 np0005466012 systemd[1]: Started NTP client/server.
Oct  2 07:40:22 np0005466012 systemd[1]: session-14.scope: Deactivated successfully.
Oct  2 07:40:22 np0005466012 systemd-logind[827]: Session 14 logged out. Waiting for processes to exit.
Oct  2 07:40:22 np0005466012 systemd[1]: session-14.scope: Consumed 23.379s CPU time.
Oct  2 07:40:22 np0005466012 systemd-logind[827]: Removed session 14.
Oct  2 07:40:27 np0005466012 systemd-logind[827]: New session 15 of user zuul.
Oct  2 07:40:27 np0005466012 systemd[1]: Started Session 15 of User zuul.
Oct  2 07:40:28 np0005466012 python3.9[60582]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:40:29 np0005466012 python3.9[60738]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:30 np0005466012 python3.9[60913]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:30 np0005466012 python3.9[60991]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.guj66qhb recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:32 np0005466012 python3.9[61143]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:32 np0005466012 python3.9[61266]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405231.5099196-149-225398232062092/.source _original_basename=.3aazu9l7 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:33 np0005466012 python3.9[61418]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:40:34 np0005466012 python3.9[61570]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:34 np0005466012 python3.9[61693]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405233.734643-221-93417464560537/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:40:35 np0005466012 python3.9[61845]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:36 np0005466012 python3.9[61968]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405235.0068226-221-9601116235567/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:40:36 np0005466012 python3.9[62120]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:37 np0005466012 python3.9[62272]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:38 np0005466012 python3.9[62395]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405237.1459985-332-241284899802044/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:38 np0005466012 python3.9[62547]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:39 np0005466012 python3.9[62670]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405238.4283834-377-43080515839529/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:40 np0005466012 python3.9[62822]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:40:40 np0005466012 systemd[1]: Reloading.
Oct  2 07:40:40 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:40:40 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:40:40 np0005466012 systemd[1]: Reloading.
Oct  2 07:40:40 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:40:40 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:40:41 np0005466012 systemd[1]: Starting EDPM Container Shutdown...
Oct  2 07:40:41 np0005466012 systemd[1]: Finished EDPM Container Shutdown.
Oct  2 07:40:41 np0005466012 python3.9[63048]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:42 np0005466012 python3.9[63171]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405241.2625196-446-73088008204473/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:43 np0005466012 python3.9[63323]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:43 np0005466012 python3.9[63446]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405242.5643697-491-129372672891503/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:40:44 np0005466012 python3.9[63598]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:40:44 np0005466012 systemd[1]: Reloading.
Oct  2 07:40:44 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:40:44 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:40:44 np0005466012 systemd[1]: Reloading.
Oct  2 07:40:44 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:40:44 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:40:45 np0005466012 systemd[1]: Starting Create netns directory...
Oct  2 07:40:45 np0005466012 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:40:45 np0005466012 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:40:45 np0005466012 systemd[1]: Finished Create netns directory.
Oct  2 07:40:45 np0005466012 python3.9[63825]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:40:45 np0005466012 network[63842]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:40:45 np0005466012 network[63843]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:40:45 np0005466012 network[63844]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:40:54 np0005466012 python3.9[64108]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:40:54 np0005466012 systemd[1]: Reloading.
Oct  2 07:40:54 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:40:54 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:40:54 np0005466012 systemd[1]: Stopping IPv4 firewall with iptables...
Oct  2 07:40:54 np0005466012 iptables.init[64149]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Oct  2 07:40:55 np0005466012 iptables.init[64149]: iptables: Flushing firewall rules: [  OK  ]
Oct  2 07:40:55 np0005466012 systemd[1]: iptables.service: Deactivated successfully.
Oct  2 07:40:55 np0005466012 systemd[1]: Stopped IPv4 firewall with iptables.
Oct  2 07:40:55 np0005466012 python3.9[64346]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:40:56 np0005466012 python3.9[64500]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:40:56 np0005466012 systemd[1]: Reloading.
Oct  2 07:40:56 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:40:56 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:40:56 np0005466012 systemd[1]: Starting Netfilter Tables...
Oct  2 07:40:56 np0005466012 systemd[1]: Finished Netfilter Tables.
Oct  2 07:40:57 np0005466012 python3.9[64692]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:40:58 np0005466012 python3.9[64845]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:40:59 np0005466012 python3.9[64970]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405258.3532417-698-84602347772983/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:00 np0005466012 python3.9[65121]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:41:25 np0005466012 systemd-logind[827]: Session 15 logged out. Waiting for processes to exit.
Oct  2 07:41:25 np0005466012 systemd[1]: session-15.scope: Deactivated successfully.
Oct  2 07:41:25 np0005466012 systemd[1]: session-15.scope: Consumed 19.465s CPU time.
Oct  2 07:41:25 np0005466012 systemd-logind[827]: Removed session 15.
Oct  2 07:41:38 np0005466012 systemd-logind[827]: New session 16 of user zuul.
Oct  2 07:41:38 np0005466012 systemd[1]: Started Session 16 of User zuul.
Oct  2 07:41:39 np0005466012 python3.9[65315]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:41:40 np0005466012 python3.9[65471]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:41 np0005466012 python3.9[65646]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:41 np0005466012 python3.9[65724]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.ax6n2jl7 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:42 np0005466012 python3.9[65876]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:42 np0005466012 python3.9[65954]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.ejv_i3ia recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:43 np0005466012 python3.9[66106]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:44 np0005466012 python3.9[66258]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:44 np0005466012 python3.9[66336]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:45 np0005466012 python3.9[66488]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:46 np0005466012 python3.9[66566]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:41:46 np0005466012 python3.9[66718]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:47 np0005466012 python3.9[66870]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:47 np0005466012 python3.9[66948]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:48 np0005466012 python3.9[67100]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:48 np0005466012 python3.9[67178]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:50 np0005466012 python3.9[67330]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:41:50 np0005466012 systemd[1]: Reloading.
Oct  2 07:41:50 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:41:50 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:41:51 np0005466012 python3.9[67520]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:51 np0005466012 python3.9[67598]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:52 np0005466012 python3.9[67750]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:41:52 np0005466012 python3.9[67828]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:41:53 np0005466012 python3.9[67980]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:41:53 np0005466012 systemd[1]: Reloading.
Oct  2 07:41:53 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:41:53 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:41:53 np0005466012 systemd[1]: Starting Create netns directory...
Oct  2 07:41:53 np0005466012 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:41:53 np0005466012 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:41:53 np0005466012 systemd[1]: Finished Create netns directory.
Oct  2 07:41:55 np0005466012 python3.9[68171]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:41:55 np0005466012 network[68188]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:41:55 np0005466012 network[68189]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:41:55 np0005466012 network[68190]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:42:00 np0005466012 python3.9[68453]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:00 np0005466012 python3.9[68531]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:01 np0005466012 python3.9[68683]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:02 np0005466012 python3.9[68835]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:03 np0005466012 python3.9[68958]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405321.8116026-614-51944678315233/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:04 np0005466012 python3.9[69110]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  2 07:42:04 np0005466012 systemd[1]: Starting Time & Date Service...
Oct  2 07:42:04 np0005466012 systemd[1]: Started Time & Date Service.
Oct  2 07:42:04 np0005466012 python3.9[69266]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:05 np0005466012 python3.9[69418]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:06 np0005466012 python3.9[69541]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405325.2367141-719-256416931211842/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:06 np0005466012 python3.9[69693]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:07 np0005466012 python3.9[69816]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405326.4991922-764-152649876165674/.source.yaml _original_basename=.p28zjgrh follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:08 np0005466012 python3.9[69968]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:08 np0005466012 python3.9[70091]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405327.6060708-809-30932494206539/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:09 np0005466012 python3.9[70243]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:10 np0005466012 python3.9[70396]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:10 np0005466012 python3[70549]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  2 07:42:11 np0005466012 python3.9[70701]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:12 np0005466012 python3.9[70824]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405331.1628375-926-2833564950743/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:13 np0005466012 python3.9[70976]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:13 np0005466012 python3.9[71099]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405332.517217-971-172225735958704/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:14 np0005466012 python3.9[71251]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:14 np0005466012 python3.9[71374]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405333.8289666-1016-167662110509720/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:15 np0005466012 python3.9[71526]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:16 np0005466012 python3.9[71649]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405335.5045242-1061-149231282271518/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:17 np0005466012 python3.9[71801]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:42:17 np0005466012 python3.9[71924]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405336.7532165-1106-165544514231289/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:18 np0005466012 python3.9[72076]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:19 np0005466012 python3.9[72228]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:20 np0005466012 python3.9[72387]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:21 np0005466012 python3.9[72540]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:21 np0005466012 python3.9[72692]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:23 np0005466012 python3.9[72844]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  2 07:42:23 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:42:23 np0005466012 python3.9[72998]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  2 07:42:24 np0005466012 systemd[1]: session-16.scope: Deactivated successfully.
Oct  2 07:42:24 np0005466012 systemd[1]: session-16.scope: Consumed 30.604s CPU time.
Oct  2 07:42:24 np0005466012 systemd-logind[827]: Session 16 logged out. Waiting for processes to exit.
Oct  2 07:42:24 np0005466012 systemd-logind[827]: Removed session 16.
Oct  2 07:42:29 np0005466012 systemd-logind[827]: New session 17 of user zuul.
Oct  2 07:42:29 np0005466012 systemd[1]: Started Session 17 of User zuul.
Oct  2 07:42:30 np0005466012 python3.9[73179]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct  2 07:42:30 np0005466012 python3.9[73331]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:42:31 np0005466012 chronyd[60403]: Selected source 23.133.168.246 (pool.ntp.org)
Oct  2 07:42:32 np0005466012 python3.9[73483]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:42:33 np0005466012 python3.9[73635]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCVkUIA0SGLhushMFSLFSAWpWCX1FF5YUjql8/6tMZQcpzUyU7mJOEQY7Jf3ZvoRVMiETNv8NaicCQ10qaPGZQwEamylEkW24WAdEJ+0NDO/DPkUTIp6vmhyqMNK8IeoLM1RrAM82pBxdQ+jut498Pj6OeLzo75U5X+AQp3kNKD6nnt+JeBNs5kT35nF/5InhW1d2N5LWKKnnw2LJIgpPZkpDwuRAOTnEp/nyNR1NyRQY1VpGMuAXgEkvvu1no1xBYM2lnfNEwn46Bcfr8p+n5Jv3gJBcteKnTCaLF0CagpfSTcvar4pcN97zXX4Jlq0VyVjit+YemnX5EnCaQoK6sYtatkGsRooS56wc+WtVHhf155ZIAj8wPRwWpcXZq+EV0SwoTFwUUNTXToz7qscdq04OHTl0bFRFQevmks+w6V4a7CzQa1/eeGlYdGEUS1I0dC5eeHDewjoLwo5+ufxHrbBmxaZrgbtwk1E9MQqj3PmdFlh17a83VHQwat591/QWU=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKeqMQGrAV3pXZcV6Ore8xolY214SO0KlbtK5lvj/17F#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM+LLWBT7ZHcUDyX8Xq/MZx34NXsN2QLd9BzdUzQgHmTREhCHesKInMqP8HfljOxzmUfohPV1AQVEYpXvhkaaQM=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpSx4Pw7AZfRlGUxa5vBESKqssXVQvrJz1cHMKGIXXt2c6o14yjlDJQkQZMnezLwc2Chr5fJ4DbiWklwKtNLctbQXR5ygqWs0bxMEnUYw+SjtdNwhykNDKkOJF64+ZtokEdpLHge0NvMivE2EBqu3TeXUji1OpHV3NGMiFKFwb0YsujbJuPjzPh6igp8NPD3uwcNrf+rcVQz8qlT/9rxdBMoyNjDoha3HCOOQDoColV7DbtQNdDBy+PMi8DOqzRJ/iPi7C26lVo+1xQL/ZKdmOOijv/QkqsY3ejuzIO9w3z3+GuykWEdEzm3EkUZJ8Q48/OwksBIdmcOC2Ke46PTLmftlRsdK0YUy7UyzGX7HQ++JYiTXyXN92ieFxNY3MmKu/70/67TT90mqVUOkZ9C32ixYMvj2hhnxS5+bmnMjpwCkUvgS1BmmSof6ghFjYZsP6zgTonqOtP5gt5VLjy7xNuApqVGmSN09/ExnZcGBX3ymXsxepc6spJeZ7hw2P4E8=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILxXcYZxs19Ipxj4mIzt7SBi+8WzNq9W70+VNtppPYi1#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDZWnRJzd2ypotxaluhYES+V8G+b3/YU1LqQdpTWOWSO1QiTR0RJRiCt3KgKfluISOv8H6sHrJ9PKv84heszJQY=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC30XwCLl481RnJGbLpEu8HK5UD53phC9aWXrs97/vSr9LY5Wlu1FrcSpDZK7wWwcUjs+Ug5XBJCr6avXKE4rjPwk89lQ1q9g/H9bpxdA4xrV5Eoc6riCUU7Ig86tKNwjKxxe5YXXkbQXzO1m31FHYpGh6MsVqR+sdC4B+xoAW7BJ+sTbHJ0l17YcK68hwv9ZNXBecuDjZDvLtDNje8ZGmmlUIAQ9MfLqzQr0EclCOAdN+tu1Se7EQ/8vqrT6CSp6hCSBXg2bK7fPi0mqJ1MgA1xig5gH2fONZWMZ9gDEbfhr3UMzXKiB9YuhIx/xfPq174TvmMwN89+fteCUEl7FYK0+huTyjiBNyHBhniq+ndB0camrvH6y1i0qFjY2JAZ9zt1odn0an1VRX2fLnwHlbLgEzV7kFf7kzLvc38F4Hd2a4K7/W8rJ80hL4T0aYiPZvbt0T6Z8dKMiNdh5Uq6HXxMW3HhGZER30lJh4bTzzwRBwMlgFLe4nxKXKtNZggdHU=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDQRjtGGFpYzrfHwb+9O0hMfMhijlzqGxkH0vMapGQGq#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLy3xOsuqZD05zHjHYtORv2L5Dy5w2gv1l1NTxi4JLb2kboxAJmGY6ewcs/tttddwUtZ4hxQZpPqVyCmq+Pg//I=#012 create=True mode=0644 path=/tmp/ansible.bz93ie5t state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:34 np0005466012 python3.9[73787]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.bz93ie5t' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:34 np0005466012 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  2 07:42:35 np0005466012 python3.9[73943]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.bz93ie5t state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:35 np0005466012 systemd[1]: session-17.scope: Deactivated successfully.
Oct  2 07:42:35 np0005466012 systemd[1]: session-17.scope: Consumed 3.347s CPU time.
Oct  2 07:42:35 np0005466012 systemd-logind[827]: Session 17 logged out. Waiting for processes to exit.
Oct  2 07:42:35 np0005466012 systemd-logind[827]: Removed session 17.
Oct  2 07:42:42 np0005466012 systemd-logind[827]: New session 18 of user zuul.
Oct  2 07:42:42 np0005466012 systemd[1]: Started Session 18 of User zuul.
Oct  2 07:42:43 np0005466012 python3.9[74121]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:42:44 np0005466012 python3.9[74277]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  2 07:42:45 np0005466012 python3.9[74431]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:42:46 np0005466012 python3.9[74584]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:47 np0005466012 python3.9[74737]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:42:47 np0005466012 python3.9[74891]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:42:48 np0005466012 python3.9[75046]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:42:49 np0005466012 systemd[1]: session-18.scope: Deactivated successfully.
Oct  2 07:42:49 np0005466012 systemd[1]: session-18.scope: Consumed 4.296s CPU time.
Oct  2 07:42:49 np0005466012 systemd-logind[827]: Session 18 logged out. Waiting for processes to exit.
Oct  2 07:42:49 np0005466012 systemd-logind[827]: Removed session 18.
Oct  2 07:42:55 np0005466012 systemd-logind[827]: New session 19 of user zuul.
Oct  2 07:42:55 np0005466012 systemd[1]: Started Session 19 of User zuul.
Oct  2 07:42:56 np0005466012 python3.9[75224]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:42:57 np0005466012 python3.9[75380]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:42:58 np0005466012 python3.9[75464]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  2 07:43:00 np0005466012 python3.9[75615]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:43:02 np0005466012 python3.9[75766]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:43:03 np0005466012 python3.9[75916]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:43:04 np0005466012 python3.9[76066]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:43:04 np0005466012 systemd[1]: session-19.scope: Deactivated successfully.
Oct  2 07:43:04 np0005466012 systemd[1]: session-19.scope: Consumed 5.731s CPU time.
Oct  2 07:43:04 np0005466012 systemd-logind[827]: Session 19 logged out. Waiting for processes to exit.
Oct  2 07:43:04 np0005466012 systemd-logind[827]: Removed session 19.
Oct  2 07:43:11 np0005466012 systemd-logind[827]: New session 20 of user zuul.
Oct  2 07:43:11 np0005466012 systemd[1]: Started Session 20 of User zuul.
Oct  2 07:43:12 np0005466012 python3.9[76244]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:43:14 np0005466012 python3.9[76400]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:15 np0005466012 python3.9[76552]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:16 np0005466012 python3.9[76704]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:17 np0005466012 python3.9[76827]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405395.6725097-160-192528038144178/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=4e338d46e8ecb274122f0540e99ecce956c10382 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:17 np0005466012 python3.9[76979]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:18 np0005466012 python3.9[77102]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405397.5604813-160-162540522491486/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=618966fd8924c3b9caddce17df39815c03c6e5f3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:19 np0005466012 python3.9[77254]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:20 np0005466012 python3.9[77377]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405398.8636994-160-223591950778702/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=2004d3104ab54726734a1ff3f0ac50fb659b4dfb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:20 np0005466012 python3.9[77529]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:21 np0005466012 python3.9[77681]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:22 np0005466012 python3.9[77833]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:22 np0005466012 python3.9[77956]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405401.6484778-345-233725480008264/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=2963264ceaea9bf7d20689fe0616654399c1360f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:23 np0005466012 python3.9[78108]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:23 np0005466012 python3.9[78231]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405402.7290483-345-29334178088269/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=cef26c6879264807de4e1e28241ed8a223aa26e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:24 np0005466012 python3.9[78383]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:24 np0005466012 python3.9[78506]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405403.8114638-345-90746528872400/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=f03acee696603342ae14bdecdbc165d432be53e0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:25 np0005466012 python3.9[78658]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:25 np0005466012 python3.9[78810]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:26 np0005466012 python3.9[78962]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:27 np0005466012 python3.9[79085]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405406.1450994-518-137357107663446/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=7b921f2594805a8812aa19247cc0c6cc4936b48b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:27 np0005466012 python3.9[79237]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:28 np0005466012 python3.9[79360]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405407.2766452-518-196552308021852/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=b4329abfe8c8dfc3dff902009782a13facac4ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:28 np0005466012 python3.9[79512]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:29 np0005466012 python3.9[79635]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405408.4085486-518-50560250521155/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=8408f0664da284e166692a5a96b3b32c371a60b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:30 np0005466012 python3.9[79787]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:30 np0005466012 python3.9[79939]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:31 np0005466012 python3.9[80091]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:31 np0005466012 python3.9[80214]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405410.9285665-696-202241036794424/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=655f64d0c44e810bc278ec16733a6c88f2e1de60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:32 np0005466012 python3.9[80366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:33 np0005466012 python3.9[80489]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405412.1314447-696-70627850363269/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=b4329abfe8c8dfc3dff902009782a13facac4ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:33 np0005466012 python3.9[80641]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:34 np0005466012 python3.9[80764]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405413.2489305-696-63053497007634/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=a727f42b5cd189eb1b1b1f746efb55132ade9d3a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:35 np0005466012 python3.9[80916]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:36 np0005466012 python3.9[81068]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:36 np0005466012 python3.9[81191]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405415.5702763-900-104052301888516/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:37 np0005466012 python3.9[81343]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:38 np0005466012 python3.9[81495]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:38 np0005466012 python3.9[81618]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405417.6142454-973-186840813848623/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:39 np0005466012 python3.9[81770]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:40 np0005466012 python3.9[81922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:41 np0005466012 python3.9[82045]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405419.7912202-1052-88023341005811/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:41 np0005466012 python3.9[82197]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:42 np0005466012 python3.9[82349]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:43 np0005466012 python3.9[82472]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405422.0274377-1130-129484014928552/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:43 np0005466012 python3.9[82624]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:44 np0005466012 python3.9[82776]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:44 np0005466012 python3.9[82899]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405423.862001-1205-229789066555969/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:45 np0005466012 python3.9[83051]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:46 np0005466012 python3.9[83203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:46 np0005466012 python3.9[83326]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405425.6931827-1280-141544050255641/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:47 np0005466012 python3.9[83478]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:48 np0005466012 python3.9[83630]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:43:48 np0005466012 python3.9[83753]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405427.5723512-1351-202877479530638/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=74de1ba89bc28b0be0e3b8a77822f232ede7d253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:43:49 np0005466012 systemd[1]: session-20.scope: Deactivated successfully.
Oct  2 07:43:49 np0005466012 systemd[1]: session-20.scope: Consumed 28.608s CPU time.
Oct  2 07:43:49 np0005466012 systemd-logind[827]: Session 20 logged out. Waiting for processes to exit.
Oct  2 07:43:49 np0005466012 systemd-logind[827]: Removed session 20.
Oct  2 07:43:54 np0005466012 systemd-logind[827]: New session 21 of user zuul.
Oct  2 07:43:54 np0005466012 systemd[1]: Started Session 21 of User zuul.
Oct  2 07:43:55 np0005466012 python3.9[83931]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:43:56 np0005466012 python3.9[84087]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:56 np0005466012 python3.9[84239]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:43:57 np0005466012 python3.9[84389]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:43:58 np0005466012 python3.9[84541]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  2 07:44:00 np0005466012 dbus-broker-launch[818]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Oct  2 07:44:00 np0005466012 python3.9[84697]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:44:01 np0005466012 python3.9[84781]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:44:04 np0005466012 python3.9[84934]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:44:04 np0005466012 python3[85089]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct  2 07:44:05 np0005466012 python3.9[85241]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:06 np0005466012 python3.9[85393]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:07 np0005466012 python3.9[85471]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:07 np0005466012 python3.9[85623]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:08 np0005466012 python3.9[85701]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.58f1l5iw recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:08 np0005466012 systemd[1]: packagekit.service: Deactivated successfully.
Oct  2 07:44:08 np0005466012 python3.9[85855]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:09 np0005466012 python3.9[85933]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:10 np0005466012 python3.9[86085]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:11 np0005466012 python3[86238]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  2 07:44:11 np0005466012 python3.9[86390]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:12 np0005466012 python3.9[86515]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405451.449975-437-192666963613922/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:13 np0005466012 python3.9[86667]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:13 np0005466012 python3.9[86792]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405452.8259888-482-230111509505755/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:14 np0005466012 python3.9[86944]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:15 np0005466012 python3.9[87069]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405454.108385-527-231731053338264/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:15 np0005466012 python3.9[87221]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:16 np0005466012 python3.9[87346]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405455.3259757-572-2039084413811/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:17 np0005466012 python3.9[87498]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:17 np0005466012 python3.9[87623]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405456.5139108-617-130996763927874/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:18 np0005466012 python3.9[87775]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:19 np0005466012 python3.9[87927]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:19 np0005466012 python3.9[88082]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:20 np0005466012 python3.9[88234]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:21 np0005466012 python3.9[88387]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:44:22 np0005466012 python3.9[88541]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:23 np0005466012 python3.9[88696]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:24 np0005466012 python3.9[88846]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:44:25 np0005466012 python3.9[88999]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:d8:76:c8:90" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:25 np0005466012 ovs-vsctl[89000]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:d8:76:c8:90 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct  2 07:44:26 np0005466012 python3.9[89152]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:27 np0005466012 python3.9[89307]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:27 np0005466012 ovs-vsctl[89308]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct  2 07:44:27 np0005466012 python3.9[89458]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:44:28 np0005466012 python3.9[89612]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:29 np0005466012 python3.9[89765]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:29 np0005466012 python3.9[89843]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:30 np0005466012 python3.9[89995]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:30 np0005466012 python3.9[90073]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:31 np0005466012 python3.9[90225]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:32 np0005466012 python3.9[90377]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:32 np0005466012 python3.9[90455]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:33 np0005466012 python3.9[90607]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:33 np0005466012 python3.9[90685]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:34 np0005466012 python3.9[90837]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:34 np0005466012 systemd[1]: Reloading.
Oct  2 07:44:34 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:34 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:35 np0005466012 python3.9[91028]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:36 np0005466012 python3.9[91106]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:36 np0005466012 python3.9[91258]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:37 np0005466012 python3.9[91336]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:38 np0005466012 python3.9[91488]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:38 np0005466012 systemd[1]: Reloading.
Oct  2 07:44:38 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:38 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:38 np0005466012 systemd[1]: Starting Create netns directory...
Oct  2 07:44:38 np0005466012 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:44:38 np0005466012 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:44:38 np0005466012 systemd[1]: Finished Create netns directory.
Oct  2 07:44:39 np0005466012 python3.9[91683]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:40 np0005466012 python3.9[91835]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:40 np0005466012 python3.9[91958]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405479.8799822-1370-134250286237202/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:42 np0005466012 python3.9[92110]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:44:43 np0005466012 python3.9[92262]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:44:44 np0005466012 python3.9[92385]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405482.9667425-1445-77670857469145/.source.json _original_basename=.6ej4sygp follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:44 np0005466012 python3.9[92537]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:47 np0005466012 python3.9[92965]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct  2 07:44:48 np0005466012 python3.9[93117]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:44:49 np0005466012 python3.9[93269]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  2 07:44:49 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:44:50 np0005466012 python3[93432]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:44:51 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:44:51 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:44:51 np0005466012 podman[93469]: 2025-10-02 11:44:51.169546009 +0000 UTC m=+0.020959902 image pull ae232aa720979600656d94fc26ba957f1cdf5bca825fe9b57990f60c6534611f quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  2 07:44:51 np0005466012 podman[93469]: 2025-10-02 11:44:51.388082207 +0000 UTC m=+0.239496070 container create fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct  2 07:44:51 np0005466012 python3[93432]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  2 07:44:52 np0005466012 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  2 07:44:52 np0005466012 python3.9[93659]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:44:53 np0005466012 python3.9[93813]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:53 np0005466012 python3.9[93889]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:44:54 np0005466012 python3.9[94040]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759405493.6876538-1709-215238687234939/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:44:54 np0005466012 python3.9[94116]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:44:54 np0005466012 systemd[1]: Reloading.
Oct  2 07:44:55 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:55 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:55 np0005466012 python3.9[94227]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:44:55 np0005466012 systemd[1]: Reloading.
Oct  2 07:44:55 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:55 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:56 np0005466012 systemd[1]: Starting ovn_controller container...
Oct  2 07:44:56 np0005466012 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct  2 07:44:56 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:44:56 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d713404b2e570d1bff94c2ee5e7ffbf321541e15d5ed5f1353df9a844de23dba/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  2 07:44:56 np0005466012 systemd[1]: Started /usr/bin/podman healthcheck run fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502.
Oct  2 07:44:56 np0005466012 podman[94268]: 2025-10-02 11:44:56.354648129 +0000 UTC m=+0.191631046 container init fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: + sudo -E kolla_set_configs
Oct  2 07:44:56 np0005466012 podman[94268]: 2025-10-02 11:44:56.394406132 +0000 UTC m=+0.231389039 container start fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 07:44:56 np0005466012 edpm-start-podman-container[94268]: ovn_controller
Oct  2 07:44:56 np0005466012 systemd[1]: Created slice User Slice of UID 0.
Oct  2 07:44:56 np0005466012 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  2 07:44:56 np0005466012 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  2 07:44:56 np0005466012 systemd[1]: Starting User Manager for UID 0...
Oct  2 07:44:56 np0005466012 edpm-start-podman-container[94267]: Creating additional drop-in dependency for "ovn_controller" (fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502)
Oct  2 07:44:56 np0005466012 podman[94291]: 2025-10-02 11:44:56.481504045 +0000 UTC m=+0.071318461 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  2 07:44:56 np0005466012 systemd[1]: fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502-34da0b037c543bd6.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:44:56 np0005466012 systemd[1]: fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502-34da0b037c543bd6.service: Failed with result 'exit-code'.
Oct  2 07:44:56 np0005466012 systemd[1]: Reloading.
Oct  2 07:44:56 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:44:56 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:44:56 np0005466012 systemd[94321]: Queued start job for default target Main User Target.
Oct  2 07:44:56 np0005466012 systemd[94321]: Created slice User Application Slice.
Oct  2 07:44:56 np0005466012 systemd[94321]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  2 07:44:56 np0005466012 systemd[94321]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 07:44:56 np0005466012 systemd[94321]: Reached target Paths.
Oct  2 07:44:56 np0005466012 systemd[94321]: Reached target Timers.
Oct  2 07:44:56 np0005466012 systemd[94321]: Starting D-Bus User Message Bus Socket...
Oct  2 07:44:56 np0005466012 systemd[94321]: Starting Create User's Volatile Files and Directories...
Oct  2 07:44:56 np0005466012 systemd[94321]: Listening on D-Bus User Message Bus Socket.
Oct  2 07:44:56 np0005466012 systemd[94321]: Reached target Sockets.
Oct  2 07:44:56 np0005466012 systemd[94321]: Finished Create User's Volatile Files and Directories.
Oct  2 07:44:56 np0005466012 systemd[94321]: Reached target Basic System.
Oct  2 07:44:56 np0005466012 systemd[94321]: Reached target Main User Target.
Oct  2 07:44:56 np0005466012 systemd[94321]: Startup finished in 133ms.
Oct  2 07:44:56 np0005466012 systemd[1]: Started User Manager for UID 0.
Oct  2 07:44:56 np0005466012 systemd[1]: Started ovn_controller container.
Oct  2 07:44:56 np0005466012 systemd[1]: Started Session c1 of User root.
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: INFO:__main__:Validating config file
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: INFO:__main__:Writing out command to execute
Oct  2 07:44:56 np0005466012 systemd[1]: session-c1.scope: Deactivated successfully.
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: ++ cat /run_command
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: + ARGS=
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: + sudo kolla_copy_cacerts
Oct  2 07:44:56 np0005466012 systemd[1]: Started Session c2 of User root.
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: + [[ ! -n '' ]]
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: + . kolla_extend_start
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: + umask 0022
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct  2 07:44:56 np0005466012 systemd[1]: session-c2.scope: Deactivated successfully.
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Oct  2 07:44:56 np0005466012 NetworkManager[51207]: <info>  [1759405496.8455] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct  2 07:44:56 np0005466012 NetworkManager[51207]: <info>  [1759405496.8461] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 07:44:56 np0005466012 NetworkManager[51207]: <info>  [1759405496.8469] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Oct  2 07:44:56 np0005466012 NetworkManager[51207]: <info>  [1759405496.8472] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Oct  2 07:44:56 np0005466012 NetworkManager[51207]: <info>  [1759405496.8475] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  2 07:44:56 np0005466012 kernel: br-int: entered promiscuous mode
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00014|main|INFO|OVS feature set changed, force recompute.
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00022|main|INFO|OVS feature set changed, force recompute.
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct  2 07:44:56 np0005466012 systemd-udevd[94395]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  2 07:44:56 np0005466012 ovn_controller[94284]: 2025-10-02T11:44:56Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  2 07:44:56 np0005466012 NetworkManager[51207]: <info>  [1759405496.8838] manager: (ovn-3ff68c-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct  2 07:44:56 np0005466012 NetworkManager[51207]: <info>  [1759405496.8844] manager: (ovn-1fc220-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Oct  2 07:44:56 np0005466012 kernel: genev_sys_6081: entered promiscuous mode
Oct  2 07:44:56 np0005466012 NetworkManager[51207]: <info>  [1759405496.9204] device (genev_sys_6081): carrier: link connected
Oct  2 07:44:56 np0005466012 NetworkManager[51207]: <info>  [1759405496.9208] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Oct  2 07:44:56 np0005466012 NetworkManager[51207]: <info>  [1759405496.9322] manager: (ovn-c9f3d6-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Oct  2 07:44:57 np0005466012 python3.9[94551]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:57 np0005466012 ovs-vsctl[94552]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct  2 07:44:58 np0005466012 python3.9[94704]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:58 np0005466012 ovs-vsctl[94706]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct  2 07:44:59 np0005466012 python3.9[94859]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:44:59 np0005466012 ovs-vsctl[94860]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct  2 07:44:59 np0005466012 systemd[1]: session-21.scope: Deactivated successfully.
Oct  2 07:44:59 np0005466012 systemd[1]: session-21.scope: Consumed 43.784s CPU time.
Oct  2 07:44:59 np0005466012 systemd-logind[827]: Session 21 logged out. Waiting for processes to exit.
Oct  2 07:44:59 np0005466012 systemd-logind[827]: Removed session 21.
Oct  2 07:45:04 np0005466012 systemd-logind[827]: New session 23 of user zuul.
Oct  2 07:45:04 np0005466012 systemd[1]: Started Session 23 of User zuul.
Oct  2 07:45:06 np0005466012 python3.9[95038]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:45:06 np0005466012 systemd[1]: Stopping User Manager for UID 0...
Oct  2 07:45:06 np0005466012 systemd[94321]: Activating special unit Exit the Session...
Oct  2 07:45:06 np0005466012 systemd[94321]: Stopped target Main User Target.
Oct  2 07:45:06 np0005466012 systemd[94321]: Stopped target Basic System.
Oct  2 07:45:06 np0005466012 systemd[94321]: Stopped target Paths.
Oct  2 07:45:06 np0005466012 systemd[94321]: Stopped target Sockets.
Oct  2 07:45:06 np0005466012 systemd[94321]: Stopped target Timers.
Oct  2 07:45:06 np0005466012 systemd[94321]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 07:45:06 np0005466012 systemd[94321]: Closed D-Bus User Message Bus Socket.
Oct  2 07:45:06 np0005466012 systemd[94321]: Stopped Create User's Volatile Files and Directories.
Oct  2 07:45:06 np0005466012 systemd[94321]: Removed slice User Application Slice.
Oct  2 07:45:06 np0005466012 systemd[94321]: Reached target Shutdown.
Oct  2 07:45:06 np0005466012 systemd[94321]: Finished Exit the Session.
Oct  2 07:45:06 np0005466012 systemd[94321]: Reached target Exit the Session.
Oct  2 07:45:06 np0005466012 systemd[1]: user@0.service: Deactivated successfully.
Oct  2 07:45:06 np0005466012 systemd[1]: Stopped User Manager for UID 0.
Oct  2 07:45:06 np0005466012 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  2 07:45:06 np0005466012 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  2 07:45:06 np0005466012 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  2 07:45:06 np0005466012 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  2 07:45:06 np0005466012 systemd[1]: Removed slice User Slice of UID 0.
Oct  2 07:45:07 np0005466012 python3.9[95196]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:07 np0005466012 python3.9[95348]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:08 np0005466012 python3.9[95500]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:08 np0005466012 python3.9[95652]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:09 np0005466012 python3.9[95804]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:10 np0005466012 python3.9[95954]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:45:11 np0005466012 python3.9[96106]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  2 07:45:13 np0005466012 python3.9[96256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:13 np0005466012 python3.9[96377]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405512.368368-224-20745326777077/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:14 np0005466012 python3.9[96527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:14 np0005466012 python3.9[96648]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405514.0469625-269-134479577506066/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:16 np0005466012 python3.9[96801]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:45:16 np0005466012 python3.9[96885]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:45:19 np0005466012 python3.9[97038]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:45:20 np0005466012 python3.9[97191]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:20 np0005466012 python3.9[97312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405519.7539337-380-2418059867954/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:21 np0005466012 python3.9[97462]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:21 np0005466012 python3.9[97583]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405520.8065054-380-69678208316309/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:23 np0005466012 python3.9[97733]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:23 np0005466012 python3.9[97854]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405522.8217864-512-266139095573672/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:24 np0005466012 python3.9[98004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:24 np0005466012 python3.9[98125]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405523.9163141-512-39246918917906/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:25 np0005466012 python3.9[98275]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:45:26 np0005466012 python3.9[98429]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:27 np0005466012 ovn_controller[94284]: 2025-10-02T11:45:27Z|00025|memory|INFO|16128 kB peak resident set size after 30.3 seconds
Oct  2 07:45:27 np0005466012 ovn_controller[94284]: 2025-10-02T11:45:27Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Oct  2 07:45:27 np0005466012 podman[98553]: 2025-10-02 11:45:27.22161082 +0000 UTC m=+0.146176300 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 07:45:27 np0005466012 python3.9[98592]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:27 np0005466012 python3.9[98687]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:28 np0005466012 python3.9[98839]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:28 np0005466012 python3.9[98917]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:29 np0005466012 python3.9[99069]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:30 np0005466012 python3.9[99221]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:30 np0005466012 python3.9[99299]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:31 np0005466012 python3.9[99451]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:31 np0005466012 python3.9[99529]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:32 np0005466012 python3.9[99681]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:45:32 np0005466012 systemd[1]: Reloading.
Oct  2 07:45:32 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:45:32 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:45:33 np0005466012 python3.9[99871]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:34 np0005466012 python3.9[99949]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:34 np0005466012 python3.9[100101]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:35 np0005466012 python3.9[100179]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:36 np0005466012 python3.9[100331]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:45:36 np0005466012 systemd[1]: Reloading.
Oct  2 07:45:36 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:45:36 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:45:37 np0005466012 systemd[1]: Starting Create netns directory...
Oct  2 07:45:37 np0005466012 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:45:37 np0005466012 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:45:37 np0005466012 systemd[1]: Finished Create netns directory.
Oct  2 07:45:38 np0005466012 python3.9[100524]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:39 np0005466012 python3.9[100676]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:39 np0005466012 python3.9[100799]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405538.5021532-965-59364844561655/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:40 np0005466012 python3.9[100951]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:45:41 np0005466012 python3.9[101103]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:45:41 np0005466012 python3.9[101226]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405540.979787-1040-43305682508250/.source.json _original_basename=.2wg1hyep follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:42 np0005466012 python3.9[101378]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:45 np0005466012 python3.9[101805]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct  2 07:45:46 np0005466012 python3.9[101957]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:45:47 np0005466012 python3.9[102109]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  2 07:45:48 np0005466012 python3[102288]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:45:54 np0005466012 podman[102301]: 2025-10-02 11:45:54.745827422 +0000 UTC m=+6.113593720 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 07:45:54 np0005466012 podman[102399]: 2025-10-02 11:45:54.87916488 +0000 UTC m=+0.050084191 container create 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct  2 07:45:54 np0005466012 podman[102399]: 2025-10-02 11:45:54.850244624 +0000 UTC m=+0.021163965 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 07:45:54 np0005466012 python3[102288]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 07:45:56 np0005466012 python3.9[102589]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:45:57 np0005466012 python3.9[102743]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:57 np0005466012 podman[102791]: 2025-10-02 11:45:57.622661289 +0000 UTC m=+0.090043305 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:45:57 np0005466012 python3.9[102837]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:45:58 np0005466012 python3.9[102997]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759405557.8367429-1304-178603424879287/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:45:58 np0005466012 python3.9[103073]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:45:58 np0005466012 systemd[1]: Reloading.
Oct  2 07:45:59 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:45:59 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:45:59 np0005466012 python3.9[103184]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:45:59 np0005466012 systemd[1]: Reloading.
Oct  2 07:45:59 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:45:59 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:46:00 np0005466012 systemd[1]: Starting ovn_metadata_agent container...
Oct  2 07:46:00 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:46:00 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/80f072a429b2cb34a24f19f33f538c7874bf4d5be5be5fe77e7552fd504fd2ac/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct  2 07:46:00 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/80f072a429b2cb34a24f19f33f538c7874bf4d5be5be5fe77e7552fd504fd2ac/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 07:46:00 np0005466012 systemd[1]: Started /usr/bin/podman healthcheck run 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf.
Oct  2 07:46:00 np0005466012 podman[103225]: 2025-10-02 11:46:00.241246917 +0000 UTC m=+0.168149195 container init 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: + sudo -E kolla_set_configs
Oct  2 07:46:00 np0005466012 podman[103225]: 2025-10-02 11:46:00.268114651 +0000 UTC m=+0.195016889 container start 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 07:46:00 np0005466012 edpm-start-podman-container[103225]: ovn_metadata_agent
Oct  2 07:46:00 np0005466012 edpm-start-podman-container[103224]: Creating additional drop-in dependency for "ovn_metadata_agent" (9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf)
Oct  2 07:46:00 np0005466012 podman[103248]: 2025-10-02 11:46:00.329536402 +0000 UTC m=+0.047288786 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  2 07:46:00 np0005466012 systemd[1]: Reloading.
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Validating config file
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Copying service configuration files
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Writing out command to execute
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Setting permission for /var/lib/neutron
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: ++ cat /run_command
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: + CMD=neutron-ovn-metadata-agent
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: + ARGS=
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: + sudo kolla_copy_cacerts
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: + [[ ! -n '' ]]
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: + . kolla_extend_start
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: Running command: 'neutron-ovn-metadata-agent'
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: + umask 0022
Oct  2 07:46:00 np0005466012 ovn_metadata_agent[103241]: + exec neutron-ovn-metadata-agent
Oct  2 07:46:00 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:46:00 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:46:00 np0005466012 systemd[1]: Started ovn_metadata_agent container.
Oct  2 07:46:01 np0005466012 systemd[1]: session-23.scope: Deactivated successfully.
Oct  2 07:46:01 np0005466012 systemd[1]: session-23.scope: Consumed 48.026s CPU time.
Oct  2 07:46:01 np0005466012 systemd-logind[827]: Session 23 logged out. Waiting for processes to exit.
Oct  2 07:46:01 np0005466012 systemd-logind[827]: Removed session 23.
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.060 103246 INFO neutron.common.config [-] Logging enabled!#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.060 103246 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.061 103246 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.061 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.061 103246 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.061 103246 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.061 103246 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.061 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.062 103246 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.062 103246 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.062 103246 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.062 103246 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.062 103246 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.062 103246 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.062 103246 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.062 103246 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.062 103246 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.063 103246 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.063 103246 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.063 103246 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.063 103246 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.063 103246 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.063 103246 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.063 103246 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.063 103246 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.063 103246 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.063 103246 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.064 103246 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.064 103246 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.064 103246 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.064 103246 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.064 103246 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.064 103246 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.064 103246 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.064 103246 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.064 103246 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.065 103246 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.065 103246 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.065 103246 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.065 103246 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.065 103246 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.065 103246 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.065 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.065 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.065 103246 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.066 103246 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.066 103246 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.066 103246 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.066 103246 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.066 103246 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.066 103246 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.066 103246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.066 103246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.066 103246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.067 103246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.067 103246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.067 103246 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.067 103246 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.067 103246 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.067 103246 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.067 103246 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.067 103246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.067 103246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.067 103246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.068 103246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.068 103246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.068 103246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.068 103246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.068 103246 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.068 103246 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.068 103246 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.068 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.068 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.069 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.069 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.069 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.069 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.069 103246 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.069 103246 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.069 103246 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.069 103246 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.069 103246 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.070 103246 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.070 103246 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.070 103246 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.070 103246 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.070 103246 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.070 103246 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.070 103246 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.070 103246 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.070 103246 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.071 103246 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.071 103246 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.071 103246 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.071 103246 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.071 103246 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.071 103246 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.071 103246 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.071 103246 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.071 103246 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.071 103246 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.071 103246 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.072 103246 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.072 103246 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.072 103246 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.072 103246 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.072 103246 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.072 103246 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.072 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.072 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.073 103246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.073 103246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.073 103246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.073 103246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.073 103246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.073 103246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.073 103246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.074 103246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.074 103246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.074 103246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.074 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.074 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.074 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.074 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.074 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.074 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.074 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.075 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.075 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.075 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.075 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.075 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.075 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.075 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.075 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.075 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.076 103246 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.076 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.076 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.076 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.076 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.076 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.076 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.076 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.076 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.077 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.077 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.077 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.077 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.077 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.077 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.077 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.077 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.077 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.078 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.078 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.078 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.078 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.078 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.078 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.078 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.078 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.078 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.078 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.079 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.079 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.079 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.079 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.079 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.079 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.079 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.079 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.079 103246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.080 103246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.080 103246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.080 103246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.080 103246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.080 103246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.080 103246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.080 103246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.080 103246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.080 103246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.080 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.081 103246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.081 103246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.081 103246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.081 103246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.081 103246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.081 103246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.081 103246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.081 103246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.081 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.082 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.082 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.082 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.082 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.082 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.082 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.082 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.082 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.082 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.082 103246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.083 103246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.083 103246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.083 103246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.083 103246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.083 103246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.083 103246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.083 103246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.083 103246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.083 103246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.084 103246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.084 103246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.084 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.084 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.084 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.084 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.084 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.084 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.084 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.084 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.085 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.085 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.085 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.085 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.085 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.085 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.085 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.085 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.085 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.085 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.086 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.086 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.086 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.086 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.086 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.086 103246 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.086 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.086 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.086 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.087 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.087 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.087 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.087 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.087 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.087 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.087 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.087 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.087 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.088 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.088 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.088 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.088 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.088 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.088 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.088 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.088 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.089 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.089 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.089 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.089 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.089 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.089 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.089 103246 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.089 103246 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.089 103246 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.090 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.090 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.090 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.090 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.090 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.090 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.090 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.090 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.090 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.091 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.091 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.091 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.091 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.091 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.091 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.091 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.091 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.091 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.092 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.092 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.092 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.092 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.092 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.092 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.092 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.092 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.092 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.093 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.093 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.093 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.093 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.093 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.093 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.093 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.093 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.093 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.093 103246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.093 103246 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.102 103246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.102 103246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.102 103246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.102 103246 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.102 103246 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.114 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec (UUID: ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.145 103246 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.145 103246 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.145 103246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.145 103246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.159 103246 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.167 103246 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.173 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], external_ids={}, name=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, nb_cfg_timestamp=1759405504869, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.174 103246 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f2412206bb0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.175 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.175 103246 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.175 103246 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.175 103246 INFO oslo_service.service [-] Starting 1 workers#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.179 103246 DEBUG oslo_service.service [-] Started child 103354 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.182 103246 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpb4q7rznn/privsep.sock']#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.182 103354 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-422573'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.207 103354 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.208 103354 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.208 103354 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.214 103354 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.221 103354 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.229 103354 INFO eventlet.wsgi.server [-] (103354) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Oct  2 07:46:02 np0005466012 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.771 103246 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.772 103246 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpb4q7rznn/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.667 103359 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.670 103359 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.672 103359 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.672 103359 INFO oslo.privsep.daemon [-] privsep daemon running as pid 103359#033[00m
Oct  2 07:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:02.774 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[187e0079-b85c-4178-9731-189e0f122407]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.219 103359 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.219 103359 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.219 103359 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.736 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[64cb49b0-ace9-4849-873f-55ab485a2a27]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.739 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, column=external_ids, values=({'neutron:ovn-metadata-id': '5d6f510d-a451-5a4c-a5a2-252ae9fdeb8c'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.752 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.763 103246 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.763 103246 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.763 103246 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.763 103246 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.763 103246 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.763 103246 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.764 103246 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.764 103246 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.764 103246 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.764 103246 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.765 103246 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.765 103246 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.765 103246 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.765 103246 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.765 103246 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.766 103246 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.766 103246 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.766 103246 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.766 103246 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.766 103246 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.766 103246 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.767 103246 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.767 103246 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.767 103246 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.767 103246 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.767 103246 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.768 103246 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.768 103246 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.768 103246 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.768 103246 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.768 103246 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.768 103246 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.769 103246 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.769 103246 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.769 103246 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.769 103246 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.770 103246 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.770 103246 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.770 103246 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.770 103246 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.771 103246 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.771 103246 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.771 103246 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.771 103246 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.771 103246 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.772 103246 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.772 103246 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.772 103246 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.772 103246 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.772 103246 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.773 103246 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.773 103246 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.773 103246 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.773 103246 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.773 103246 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.773 103246 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.773 103246 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.774 103246 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.774 103246 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.774 103246 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.774 103246 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.774 103246 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.774 103246 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.774 103246 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.775 103246 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.775 103246 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.775 103246 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.775 103246 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.775 103246 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.776 103246 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.776 103246 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.776 103246 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.776 103246 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.776 103246 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.776 103246 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.777 103246 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.777 103246 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.777 103246 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.777 103246 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.777 103246 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.777 103246 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.778 103246 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.778 103246 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.778 103246 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.778 103246 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.778 103246 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.779 103246 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.779 103246 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.779 103246 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.779 103246 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.779 103246 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.780 103246 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.780 103246 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.780 103246 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.780 103246 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.780 103246 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.780 103246 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.780 103246 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.781 103246 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.781 103246 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.781 103246 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.781 103246 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.781 103246 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.781 103246 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.782 103246 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.782 103246 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.782 103246 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.782 103246 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.782 103246 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.783 103246 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.783 103246 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.783 103246 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.783 103246 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.784 103246 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.784 103246 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.784 103246 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.784 103246 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.784 103246 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.784 103246 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.785 103246 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.785 103246 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.785 103246 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.785 103246 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.785 103246 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.785 103246 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.786 103246 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.786 103246 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.786 103246 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.786 103246 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.786 103246 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.786 103246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.787 103246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.787 103246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.787 103246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.787 103246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.787 103246 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.787 103246 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.788 103246 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.788 103246 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.788 103246 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.788 103246 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.788 103246 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.788 103246 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.788 103246 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.789 103246 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.789 103246 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.789 103246 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.789 103246 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.789 103246 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.789 103246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.789 103246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.790 103246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.790 103246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.790 103246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.790 103246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.790 103246 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.790 103246 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.791 103246 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.791 103246 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.791 103246 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.791 103246 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.791 103246 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.791 103246 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.791 103246 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.792 103246 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.792 103246 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.792 103246 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.792 103246 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.792 103246 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.792 103246 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.792 103246 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.793 103246 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.793 103246 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.793 103246 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.793 103246 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.793 103246 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.793 103246 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.794 103246 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.794 103246 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.794 103246 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.794 103246 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.794 103246 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.794 103246 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.795 103246 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.795 103246 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.795 103246 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.795 103246 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.795 103246 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.795 103246 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.795 103246 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.796 103246 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.796 103246 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.796 103246 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.796 103246 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.796 103246 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.797 103246 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.797 103246 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.797 103246 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.797 103246 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.797 103246 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.797 103246 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.798 103246 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.798 103246 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.798 103246 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.798 103246 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.798 103246 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.798 103246 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.798 103246 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.799 103246 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.799 103246 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.799 103246 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.799 103246 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.799 103246 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.799 103246 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.799 103246 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.800 103246 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.800 103246 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.800 103246 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.800 103246 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.800 103246 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.800 103246 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.800 103246 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.801 103246 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.801 103246 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.801 103246 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.801 103246 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.801 103246 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.801 103246 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.801 103246 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.802 103246 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.802 103246 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.802 103246 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.802 103246 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.802 103246 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.802 103246 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.802 103246 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.802 103246 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.803 103246 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.803 103246 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.803 103246 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.803 103246 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.803 103246 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.803 103246 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.804 103246 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.804 103246 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.804 103246 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.804 103246 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.804 103246 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.804 103246 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.804 103246 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.805 103246 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.805 103246 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.805 103246 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.805 103246 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.805 103246 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.805 103246 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.805 103246 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.806 103246 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.806 103246 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.806 103246 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.806 103246 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.806 103246 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.806 103246 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.806 103246 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.807 103246 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.807 103246 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.807 103246 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.807 103246 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.807 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.807 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.808 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.808 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.808 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.808 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.808 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.808 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.808 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.809 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.809 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.809 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.809 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.809 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.809 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.810 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.810 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.810 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.810 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.810 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.810 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.810 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.811 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.811 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.811 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.811 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.811 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.811 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.811 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.812 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.812 103246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.812 103246 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.812 103246 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.812 103246 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.812 103246 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:46:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:46:03.812 103246 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  2 07:46:06 np0005466012 systemd-logind[827]: New session 24 of user zuul.
Oct  2 07:46:06 np0005466012 systemd[1]: Started Session 24 of User zuul.
Oct  2 07:46:07 np0005466012 python3.9[103517]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:46:08 np0005466012 python3.9[103673]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:09 np0005466012 python3.9[103838]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:46:09 np0005466012 systemd[1]: Reloading.
Oct  2 07:46:09 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:46:09 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:46:10 np0005466012 python3.9[104023]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:46:10 np0005466012 network[104040]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:46:10 np0005466012 network[104041]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:46:10 np0005466012 network[104042]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:46:18 np0005466012 python3.9[104317]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:18 np0005466012 python3.9[104470]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:19 np0005466012 python3.9[104623]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:20 np0005466012 python3.9[104776]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:21 np0005466012 python3.9[104929]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:22 np0005466012 python3.9[105082]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:23 np0005466012 python3.9[105235]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:46:24 np0005466012 python3.9[105388]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:24 np0005466012 python3.9[105540]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:25 np0005466012 python3.9[105692]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:26 np0005466012 python3.9[105844]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:27 np0005466012 python3.9[105996]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:27 np0005466012 podman[106148]: 2025-10-02 11:46:27.870002106 +0000 UTC m=+0.186382938 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:46:27 np0005466012 python3.9[106149]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:28 np0005466012 python3.9[106326]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:29 np0005466012 python3.9[106478]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:29 np0005466012 python3.9[106630]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:30 np0005466012 python3.9[106782]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:31 np0005466012 podman[106906]: 2025-10-02 11:46:31.116342452 +0000 UTC m=+0.051367406 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 07:46:31 np0005466012 python3.9[106953]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:31 np0005466012 python3.9[107105]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:32 np0005466012 python3.9[107257]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:33 np0005466012 python3.9[107409]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:46:34 np0005466012 python3.9[107561]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:34 np0005466012 python3.9[107713]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:46:35 np0005466012 python3.9[107865]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:46:35 np0005466012 systemd[1]: Reloading.
Oct  2 07:46:36 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:46:36 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:46:37 np0005466012 python3.9[108052]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:37 np0005466012 python3.9[108205]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:38 np0005466012 python3.9[108358]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:39 np0005466012 python3.9[108511]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:39 np0005466012 python3.9[108664]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:40 np0005466012 python3.9[108817]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:41 np0005466012 python3.9[108970]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:46:43 np0005466012 python3.9[109125]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct  2 07:46:44 np0005466012 python3.9[109278]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:46:45 np0005466012 python3.9[109436]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:46:46 np0005466012 python3.9[109596]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:46:47 np0005466012 python3.9[109680]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:46:58 np0005466012 podman[109743]: 2025-10-02 11:46:58.170560389 +0000 UTC m=+0.084524960 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 07:47:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:47:02.096 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:47:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:47:02.096 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:47:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:47:02.096 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:47:02 np0005466012 podman[109890]: 2025-10-02 11:47:02.166488089 +0000 UTC m=+0.072692465 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 07:47:16 np0005466012 kernel: SELinux:  Converting 2752 SID table entries...
Oct  2 07:47:16 np0005466012 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:47:16 np0005466012 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:47:16 np0005466012 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:47:16 np0005466012 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:47:16 np0005466012 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:47:16 np0005466012 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:47:16 np0005466012 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:47:25 np0005466012 kernel: SELinux:  Converting 2752 SID table entries...
Oct  2 07:47:25 np0005466012 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:47:25 np0005466012 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:47:25 np0005466012 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:47:25 np0005466012 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:47:25 np0005466012 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:47:25 np0005466012 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:47:25 np0005466012 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:47:29 np0005466012 dbus-broker-launch[818]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Oct  2 07:47:29 np0005466012 podman[109931]: 2025-10-02 11:47:29.203409307 +0000 UTC m=+0.100148493 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:47:33 np0005466012 podman[109957]: 2025-10-02 11:47:33.128495086 +0000 UTC m=+0.047087487 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:48:00 np0005466012 podman[123084]: 2025-10-02 11:48:00.227646527 +0000 UTC m=+0.136236423 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 07:48:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:48:02.097 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:48:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:48:02.097 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:48:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:48:02.098 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:48:04 np0005466012 podman[126016]: 2025-10-02 11:48:04.128526739 +0000 UTC m=+0.049925521 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 07:48:17 np0005466012 kernel: SELinux:  Converting 2753 SID table entries...
Oct  2 07:48:17 np0005466012 kernel: SELinux:  policy capability network_peer_controls=1
Oct  2 07:48:17 np0005466012 kernel: SELinux:  policy capability open_perms=1
Oct  2 07:48:17 np0005466012 kernel: SELinux:  policy capability extended_socket_class=1
Oct  2 07:48:17 np0005466012 kernel: SELinux:  policy capability always_check_network=0
Oct  2 07:48:17 np0005466012 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  2 07:48:17 np0005466012 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  2 07:48:17 np0005466012 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  2 07:48:19 np0005466012 dbus-broker-launch[817]: Noticed file-system modification, trigger reload.
Oct  2 07:48:19 np0005466012 dbus-broker-launch[818]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Oct  2 07:48:19 np0005466012 dbus-broker-launch[817]: Noticed file-system modification, trigger reload.
Oct  2 07:48:26 np0005466012 systemd[1]: Stopping OpenSSH server daemon...
Oct  2 07:48:26 np0005466012 systemd[1]: sshd.service: Deactivated successfully.
Oct  2 07:48:26 np0005466012 systemd[1]: Stopped OpenSSH server daemon.
Oct  2 07:48:26 np0005466012 systemd[1]: sshd.service: Consumed 1.729s CPU time, read 532.0K from disk, written 20.0K to disk.
Oct  2 07:48:26 np0005466012 systemd[1]: Stopped target sshd-keygen.target.
Oct  2 07:48:26 np0005466012 systemd[1]: Stopping sshd-keygen.target...
Oct  2 07:48:26 np0005466012 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 07:48:26 np0005466012 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 07:48:26 np0005466012 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  2 07:48:26 np0005466012 systemd[1]: Reached target sshd-keygen.target.
Oct  2 07:48:26 np0005466012 systemd[1]: Starting OpenSSH server daemon...
Oct  2 07:48:26 np0005466012 systemd[1]: Started OpenSSH server daemon.
Oct  2 07:48:28 np0005466012 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:48:28 np0005466012 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:48:28 np0005466012 systemd[1]: Reloading.
Oct  2 07:48:28 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:28 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:28 np0005466012 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:48:31 np0005466012 podman[131397]: 2025-10-02 11:48:31.204649578 +0000 UTC m=+0.108807678 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 07:48:32 np0005466012 systemd[1]: Starting PackageKit Daemon...
Oct  2 07:48:32 np0005466012 systemd[1]: Started PackageKit Daemon.
Oct  2 07:48:35 np0005466012 podman[135578]: 2025-10-02 11:48:35.132934254 +0000 UTC m=+0.054875711 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 07:48:35 np0005466012 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:48:35 np0005466012 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:48:35 np0005466012 systemd[1]: man-db-cache-update.service: Consumed 9.474s CPU time.
Oct  2 07:48:35 np0005466012 systemd[1]: run-r9d3a9073e8264dfc9d27e0ce6f2bdb79.service: Deactivated successfully.
Oct  2 07:48:41 np0005466012 python3.9[136232]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:48:41 np0005466012 systemd[1]: Reloading.
Oct  2 07:48:41 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:41 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:42 np0005466012 python3.9[136421]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:48:42 np0005466012 systemd[1]: Reloading.
Oct  2 07:48:42 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:42 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:43 np0005466012 python3.9[136611]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:48:43 np0005466012 systemd[1]: Reloading.
Oct  2 07:48:43 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:43 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:44 np0005466012 python3.9[136802]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:48:44 np0005466012 systemd[1]: Reloading.
Oct  2 07:48:44 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:44 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:46 np0005466012 python3.9[136992]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:46 np0005466012 systemd[1]: Reloading.
Oct  2 07:48:46 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:46 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:48 np0005466012 python3.9[137182]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:48 np0005466012 systemd[1]: Reloading.
Oct  2 07:48:48 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:48 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:49 np0005466012 python3.9[137371]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:49 np0005466012 systemd[1]: Reloading.
Oct  2 07:48:49 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:49 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:50 np0005466012 python3.9[137560]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:50 np0005466012 python3.9[137715]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:51 np0005466012 systemd[1]: Reloading.
Oct  2 07:48:51 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:51 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:52 np0005466012 python3.9[137905]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  2 07:48:52 np0005466012 systemd[1]: Reloading.
Oct  2 07:48:52 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:48:52 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:48:52 np0005466012 systemd[1]: Listening on libvirt proxy daemon socket.
Oct  2 07:48:52 np0005466012 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct  2 07:48:53 np0005466012 python3.9[138098]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:54 np0005466012 python3.9[138253]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:54 np0005466012 python3.9[138408]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:55 np0005466012 python3.9[138563]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:56 np0005466012 python3.9[138718]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:57 np0005466012 python3.9[138873]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:58 np0005466012 python3.9[139028]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:59 np0005466012 python3.9[139183]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:48:59 np0005466012 python3.9[139338]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:49:00 np0005466012 python3.9[139493]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:49:01 np0005466012 podman[139620]: 2025-10-02 11:49:01.374581525 +0000 UTC m=+0.104430504 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:49:01 np0005466012 python3.9[139667]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:49:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:49:02.098 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:49:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:49:02.098 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:49:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:49:02.098 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:49:02 np0005466012 python3.9[139829]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:49:03 np0005466012 python3.9[139984]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:49:03 np0005466012 python3.9[140139]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  2 07:49:05 np0005466012 python3.9[140294]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:05 np0005466012 podman[140420]: 2025-10-02 11:49:05.616699927 +0000 UTC m=+0.055719440 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Oct  2 07:49:05 np0005466012 python3.9[140467]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:06 np0005466012 python3.9[140619]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:06 np0005466012 python3.9[140771]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:07 np0005466012 python3.9[140923]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:08 np0005466012 python3.9[141075]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:49:09 np0005466012 python3.9[141227]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:10 np0005466012 python3.9[141352]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405748.3926136-1628-234562117420363/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:10 np0005466012 python3.9[141504]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:11 np0005466012 python3.9[141629]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405750.212731-1628-267135816657125/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:12 np0005466012 python3.9[141781]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:12 np0005466012 python3.9[141906]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405751.4295697-1628-133419125971436/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:13 np0005466012 python3.9[142058]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:13 np0005466012 python3.9[142183]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405752.743378-1628-247734553255834/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:14 np0005466012 python3.9[142335]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:14 np0005466012 python3.9[142460]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405753.9675422-1628-183347002619656/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:15 np0005466012 python3.9[142612]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:16 np0005466012 python3.9[142737]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405755.1088305-1628-126948468187621/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:16 np0005466012 python3.9[142889]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:17 np0005466012 python3.9[143012]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405756.271102-1628-196051064708445/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:17 np0005466012 python3.9[143164]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:18 np0005466012 python3.9[143289]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759405757.445234-1628-14665832142943/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:19 np0005466012 python3.9[143441]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct  2 07:49:20 np0005466012 python3.9[143594]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:20 np0005466012 python3.9[143746]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:21 np0005466012 python3.9[143898]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:21 np0005466012 python3.9[144050]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:22 np0005466012 python3.9[144202]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:23 np0005466012 python3.9[144354]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:24 np0005466012 python3.9[144506]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:24 np0005466012 python3.9[144658]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:25 np0005466012 python3.9[144810]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:25 np0005466012 python3.9[144962]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:26 np0005466012 python3.9[145114]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:26 np0005466012 python3.9[145266]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:27 np0005466012 python3.9[145418]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:28 np0005466012 python3.9[145570]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:29 np0005466012 python3.9[145722]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:29 np0005466012 python3.9[145845]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405768.5443258-2291-76036240230625/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:30 np0005466012 python3.9[145997]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:31 np0005466012 python3.9[146120]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405769.9976194-2291-129539092512113/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:31 np0005466012 podman[146272]: 2025-10-02 11:49:31.501955764 +0000 UTC m=+0.074092577 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Oct  2 07:49:31 np0005466012 python3.9[146273]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:32 np0005466012 python3.9[146421]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405771.1696496-2291-275242439286679/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:32 np0005466012 python3.9[146573]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:33 np0005466012 python3.9[146696]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405772.285631-2291-185134886223224/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:34 np0005466012 python3.9[146848]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:34 np0005466012 python3.9[146971]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405773.574781-2291-78426262504416/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:35 np0005466012 python3.9[147123]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:35 np0005466012 python3.9[147246]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405774.7228928-2291-31529388695607/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:36 np0005466012 podman[147346]: 2025-10-02 11:49:36.137125979 +0000 UTC m=+0.050329660 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 07:49:36 np0005466012 python3.9[147418]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:36 np0005466012 python3.9[147541]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405775.9192622-2291-189199398486464/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:37 np0005466012 python3.9[147693]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:38 np0005466012 python3.9[147816]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405777.0466762-2291-86135076103436/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:38 np0005466012 python3.9[147968]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:39 np0005466012 python3.9[148091]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405778.1561685-2291-103598968930368/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:39 np0005466012 python3.9[148243]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:40 np0005466012 python3.9[148366]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405779.2803905-2291-141406119475411/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:40 np0005466012 python3.9[148518]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:41 np0005466012 python3.9[148641]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405780.370473-2291-240148751549625/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:41 np0005466012 python3.9[148793]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:42 np0005466012 python3.9[148916]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405781.5247388-2291-276798958949594/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:43 np0005466012 python3.9[149068]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:43 np0005466012 python3.9[149191]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405782.6614697-2291-101360405747053/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:44 np0005466012 python3.9[149343]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:49:44 np0005466012 python3.9[149466]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405783.7447345-2291-38183346406744/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:45 np0005466012 python3.9[149616]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:49:46 np0005466012 python3.9[149771]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct  2 07:49:48 np0005466012 dbus-broker-launch[818]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Oct  2 07:49:48 np0005466012 python3.9[149927]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:49 np0005466012 python3.9[150079]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:49 np0005466012 python3.9[150231]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:50 np0005466012 python3.9[150383]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:50 np0005466012 python3.9[150535]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:52 np0005466012 python3.9[150687]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:53 np0005466012 python3.9[150839]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:53 np0005466012 python3.9[150991]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:54 np0005466012 python3.9[151143]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:54 np0005466012 python3.9[151295]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:49:55 np0005466012 python3.9[151447]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:49:55 np0005466012 systemd[1]: Reloading.
Oct  2 07:49:55 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:55 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:55 np0005466012 systemd[1]: Starting libvirt logging daemon socket...
Oct  2 07:49:55 np0005466012 systemd[1]: Listening on libvirt logging daemon socket.
Oct  2 07:49:55 np0005466012 systemd[1]: Starting libvirt logging daemon admin socket...
Oct  2 07:49:55 np0005466012 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct  2 07:49:55 np0005466012 systemd[1]: Starting libvirt logging daemon...
Oct  2 07:49:56 np0005466012 systemd[1]: Started libvirt logging daemon.
Oct  2 07:49:56 np0005466012 python3.9[151640]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:49:56 np0005466012 systemd[1]: Reloading.
Oct  2 07:49:56 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:56 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:57 np0005466012 systemd[1]: Starting libvirt nodedev daemon socket...
Oct  2 07:49:57 np0005466012 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct  2 07:49:57 np0005466012 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct  2 07:49:57 np0005466012 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct  2 07:49:57 np0005466012 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct  2 07:49:57 np0005466012 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct  2 07:49:57 np0005466012 systemd[1]: Starting libvirt nodedev daemon...
Oct  2 07:49:57 np0005466012 systemd[1]: Started libvirt nodedev daemon.
Oct  2 07:49:57 np0005466012 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct  2 07:49:57 np0005466012 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct  2 07:49:57 np0005466012 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct  2 07:49:57 np0005466012 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct  2 07:49:57 np0005466012 python3.9[151856]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:49:57 np0005466012 systemd[1]: Reloading.
Oct  2 07:49:57 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:57 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:58 np0005466012 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct  2 07:49:58 np0005466012 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct  2 07:49:58 np0005466012 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct  2 07:49:58 np0005466012 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct  2 07:49:58 np0005466012 systemd[1]: Starting libvirt proxy daemon...
Oct  2 07:49:58 np0005466012 systemd[1]: Started libvirt proxy daemon.
Oct  2 07:49:58 np0005466012 setroubleshoot[151700]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 4fc1ed3a-4c62-40bc-9704-f069a09cd715
Oct  2 07:49:58 np0005466012 setroubleshoot[151700]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  2 07:49:58 np0005466012 setroubleshoot[151700]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 4fc1ed3a-4c62-40bc-9704-f069a09cd715
Oct  2 07:49:58 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:49:58 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:49:58 np0005466012 setroubleshoot[151700]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  2 07:49:58 np0005466012 python3.9[152075]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:49:58 np0005466012 systemd[1]: Reloading.
Oct  2 07:49:58 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:49:58 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:49:59 np0005466012 systemd[1]: Listening on libvirt locking daemon socket.
Oct  2 07:49:59 np0005466012 systemd[1]: Starting libvirt QEMU daemon socket...
Oct  2 07:49:59 np0005466012 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct  2 07:49:59 np0005466012 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct  2 07:49:59 np0005466012 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct  2 07:49:59 np0005466012 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct  2 07:49:59 np0005466012 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct  2 07:49:59 np0005466012 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct  2 07:49:59 np0005466012 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct  2 07:49:59 np0005466012 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct  2 07:49:59 np0005466012 systemd[1]: Starting libvirt QEMU daemon...
Oct  2 07:49:59 np0005466012 systemd[1]: Started libvirt QEMU daemon.
Oct  2 07:49:59 np0005466012 python3.9[152287]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:49:59 np0005466012 systemd[1]: Reloading.
Oct  2 07:50:00 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:00 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:00 np0005466012 systemd[1]: Starting libvirt secret daemon socket...
Oct  2 07:50:00 np0005466012 systemd[1]: Listening on libvirt secret daemon socket.
Oct  2 07:50:00 np0005466012 systemd[1]: Starting libvirt secret daemon admin socket...
Oct  2 07:50:00 np0005466012 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct  2 07:50:00 np0005466012 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct  2 07:50:00 np0005466012 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct  2 07:50:00 np0005466012 systemd[1]: Starting libvirt secret daemon...
Oct  2 07:50:00 np0005466012 systemd[1]: Started libvirt secret daemon.
Oct  2 07:50:01 np0005466012 python3.9[152497]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:50:02.098 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:50:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:50:02.099 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:50:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:50:02.099 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:50:02 np0005466012 podman[152621]: 2025-10-02 11:50:02.13952865 +0000 UTC m=+0.114171462 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:50:02 np0005466012 python3.9[152669]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:50:03 np0005466012 python3.9[152828]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:03 np0005466012 python3.9[152951]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405802.8896916-3326-44483924368656/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:04 np0005466012 python3.9[153103]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:05 np0005466012 python3.9[153255]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:06 np0005466012 python3.9[153333]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:06 np0005466012 podman[153457]: 2025-10-02 11:50:06.622403808 +0000 UTC m=+0.066385566 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct  2 07:50:06 np0005466012 python3.9[153504]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:07 np0005466012 python3.9[153582]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.xr0tmuw8 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:07 np0005466012 python3.9[153734]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:08 np0005466012 python3.9[153812]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:08 np0005466012 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct  2 07:50:08 np0005466012 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct  2 07:50:09 np0005466012 python3.9[153964]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:50:10 np0005466012 python3[154117]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  2 07:50:10 np0005466012 python3.9[154269]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:11 np0005466012 python3.9[154347]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:12 np0005466012 python3.9[154499]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:12 np0005466012 python3.9[154577]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:13 np0005466012 python3.9[154729]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:13 np0005466012 python3.9[154807]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:14 np0005466012 python3.9[154959]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:14 np0005466012 python3.9[155037]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:15 np0005466012 python3.9[155189]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:16 np0005466012 python3.9[155314]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759405815.1823018-3701-31387110118993/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:17 np0005466012 python3.9[155466]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:17 np0005466012 python3.9[155618]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:50:18 np0005466012 python3.9[155773]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:19 np0005466012 python3.9[155925]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:50:20 np0005466012 python3.9[156078]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:20 np0005466012 python3.9[156232]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:50:21 np0005466012 python3.9[156387]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:22 np0005466012 python3.9[156539]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:22 np0005466012 python3.9[156662]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405821.858067-3917-181537656039125/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:23 np0005466012 python3.9[156814]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:24 np0005466012 python3.9[156937]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405823.1086516-3963-259812875207381/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:24 np0005466012 python3.9[157089]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:25 np0005466012 python3.9[157212]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405824.3851626-4007-252856901321769/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:26 np0005466012 python3.9[157364]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:50:26 np0005466012 systemd[1]: Reloading.
Oct  2 07:50:26 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:26 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:26 np0005466012 systemd[1]: Reached target edpm_libvirt.target.
Oct  2 07:50:28 np0005466012 python3.9[157555]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  2 07:50:28 np0005466012 systemd[1]: Reloading.
Oct  2 07:50:28 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:28 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:28 np0005466012 systemd[1]: Reloading.
Oct  2 07:50:28 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:28 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:29 np0005466012 systemd[1]: session-24.scope: Deactivated successfully.
Oct  2 07:50:29 np0005466012 systemd[1]: session-24.scope: Consumed 3min 14.645s CPU time.
Oct  2 07:50:29 np0005466012 systemd-logind[827]: Session 24 logged out. Waiting for processes to exit.
Oct  2 07:50:29 np0005466012 systemd-logind[827]: Removed session 24.
Oct  2 07:50:33 np0005466012 podman[157652]: 2025-10-02 11:50:33.184467237 +0000 UTC m=+0.095728337 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 07:50:35 np0005466012 systemd-logind[827]: New session 25 of user zuul.
Oct  2 07:50:35 np0005466012 systemd[1]: Started Session 25 of User zuul.
Oct  2 07:50:36 np0005466012 python3.9[157832]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:50:37 np0005466012 podman[157861]: 2025-10-02 11:50:37.143430892 +0000 UTC m=+0.063824842 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 07:50:37 np0005466012 python3.9[158009]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:38 np0005466012 python3.9[158161]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:39 np0005466012 python3.9[158313]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:39 np0005466012 python3.9[158465]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  2 07:50:40 np0005466012 python3.9[158617]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:41 np0005466012 python3.9[158769]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:42 np0005466012 python3.9[158923]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:50:42 np0005466012 systemd[1]: Reloading.
Oct  2 07:50:42 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:42 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:43 np0005466012 python3.9[159113]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:50:43 np0005466012 network[159130]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:50:43 np0005466012 network[159131]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:50:43 np0005466012 network[159132]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:50:48 np0005466012 python3.9[159405]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:50:49 np0005466012 systemd[1]: Reloading.
Oct  2 07:50:49 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:50:49 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:50:50 np0005466012 python3.9[159594]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:50 np0005466012 python3.9[159746]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None 
pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  2 07:50:51 np0005466012 podman[159781]: 2025-10-02 11:50:51.071155438 +0000 UTC m=+0.036572189 container create 160cae4579f60a1a960124c4f63e83a2a7c70f9a40da17631a71c55f4ff22653 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 07:50:51 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.0923] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/21)
Oct  2 07:50:51 np0005466012 kernel: podman0: port 1(veth0) entered blocking state
Oct  2 07:50:51 np0005466012 kernel: podman0: port 1(veth0) entered disabled state
Oct  2 07:50:51 np0005466012 kernel: veth0: entered allmulticast mode
Oct  2 07:50:51 np0005466012 kernel: veth0: entered promiscuous mode
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.1123] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Oct  2 07:50:51 np0005466012 kernel: podman0: port 1(veth0) entered blocking state
Oct  2 07:50:51 np0005466012 kernel: podman0: port 1(veth0) entered forwarding state
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.1140] device (veth0): carrier: link connected
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.1143] device (podman0): carrier: link connected
Oct  2 07:50:51 np0005466012 systemd-udevd[159807]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:50:51 np0005466012 systemd-udevd[159810]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.1380] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.1392] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.1405] device (podman0): Activation: starting connection 'podman0' (eaa4cd2d-dcdb-4aef-b9ef-d59d38ada580)
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.1409] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.1414] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.1418] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.1425] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  2 07:50:51 np0005466012 podman[159781]: 2025-10-02 11:50:51.053327462 +0000 UTC m=+0.018744223 image pull 1b3fd7f2436e5c6f2e28c01b83721476c7b295789c77b3d63e30f49404389ea1 quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  2 07:50:51 np0005466012 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  2 07:50:51 np0005466012 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.1716] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.1718] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.1726] device (podman0): Activation: successful, device activated.
Oct  2 07:50:51 np0005466012 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct  2 07:50:51 np0005466012 systemd[1]: Started libpod-conmon-160cae4579f60a1a960124c4f63e83a2a7c70f9a40da17631a71c55f4ff22653.scope.
Oct  2 07:50:51 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:50:51 np0005466012 podman[159781]: 2025-10-02 11:50:51.403565532 +0000 UTC m=+0.368982283 container init 160cae4579f60a1a960124c4f63e83a2a7c70f9a40da17631a71c55f4ff22653 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 07:50:51 np0005466012 podman[159781]: 2025-10-02 11:50:51.412789293 +0000 UTC m=+0.378206054 container start 160cae4579f60a1a960124c4f63e83a2a7c70f9a40da17631a71c55f4ff22653 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 07:50:51 np0005466012 iscsid_config[159939]: iqn.1994-05.com.redhat:d7c9d09dff#015
Oct  2 07:50:51 np0005466012 systemd[1]: libpod-160cae4579f60a1a960124c4f63e83a2a7c70f9a40da17631a71c55f4ff22653.scope: Deactivated successfully.
Oct  2 07:50:51 np0005466012 podman[159781]: 2025-10-02 11:50:51.419402481 +0000 UTC m=+0.384819292 container attach 160cae4579f60a1a960124c4f63e83a2a7c70f9a40da17631a71c55f4ff22653 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 07:50:51 np0005466012 podman[159781]: 2025-10-02 11:50:51.420611625 +0000 UTC m=+0.386028376 container died 160cae4579f60a1a960124c4f63e83a2a7c70f9a40da17631a71c55f4ff22653 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 07:50:51 np0005466012 kernel: podman0: port 1(veth0) entered disabled state
Oct  2 07:50:51 np0005466012 kernel: veth0 (unregistering): left allmulticast mode
Oct  2 07:50:51 np0005466012 kernel: veth0 (unregistering): left promiscuous mode
Oct  2 07:50:51 np0005466012 kernel: podman0: port 1(veth0) entered disabled state
Oct  2 07:50:51 np0005466012 NetworkManager[51207]: <info>  [1759405851.4826] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 07:50:51 np0005466012 systemd[1]: run-netns-netns\x2d7090f563\x2d5b98\x2de727\x2da9bf\x2dd552c75a6a58.mount: Deactivated successfully.
Oct  2 07:50:51 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-160cae4579f60a1a960124c4f63e83a2a7c70f9a40da17631a71c55f4ff22653-userdata-shm.mount: Deactivated successfully.
Oct  2 07:50:51 np0005466012 systemd[1]: var-lib-containers-storage-overlay-4564181bd38fc0060f2b00eed05da51384fa87ea8cc99ddf7009acba2c46c3c4-merged.mount: Deactivated successfully.
Oct  2 07:50:51 np0005466012 podman[159781]: 2025-10-02 11:50:51.7780678 +0000 UTC m=+0.743484551 container remove 160cae4579f60a1a960124c4f63e83a2a7c70f9a40da17631a71c55f4ff22653 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 07:50:51 np0005466012 systemd[1]: libpod-conmon-160cae4579f60a1a960124c4f63e83a2a7c70f9a40da17631a71c55f4ff22653.scope: Deactivated successfully.
Oct  2 07:50:51 np0005466012 python3.9[159746]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True quay.io/podified-antelope-centos9/openstack-iscsid:current-podified /usr/sbin/iscsi-iname
Oct  2 07:50:51 np0005466012 python3.9[159746]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: #012DEPRECATED command:#012It is recommended to use Quadlets for running containers and pods under systemd.#012#012Please refer to podman-systemd.unit(5) for details.#012Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Oct  2 07:50:52 np0005466012 python3.9[160179]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:53 np0005466012 python3.9[160302]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405852.2181036-323-8743495311277/.source.iscsi _original_basename=.yqvcdzc_ follow=False checksum=333cdb97d052817ce36fce840b53868294a533d9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:54 np0005466012 python3.9[160454]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:54 np0005466012 python3.9[160604]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:50:55 np0005466012 python3.9[160758]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:50:56 np0005466012 python3.9[160910]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:57 np0005466012 python3.9[161062]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:57 np0005466012 python3.9[161140]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:58 np0005466012 python3.9[161292]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:50:58 np0005466012 python3.9[161370]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:50:59 np0005466012 python3.9[161522]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:00 np0005466012 python3.9[161674]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:00 np0005466012 python3.9[161752]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:01 np0005466012 python3.9[161904]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:01 np0005466012 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  2 07:51:01 np0005466012 python3.9[161982]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:51:02.099 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:51:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:51:02.100 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:51:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:51:02.100 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:51:02 np0005466012 python3.9[162134]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:02 np0005466012 systemd[1]: Reloading.
Oct  2 07:51:02 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:02 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:03 np0005466012 podman[162294]: 2025-10-02 11:51:03.583631194 +0000 UTC m=+0.090226732 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 07:51:03 np0005466012 python3.9[162339]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:04 np0005466012 python3.9[162426]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:04 np0005466012 python3.9[162578]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:05 np0005466012 python3.9[162656]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:06 np0005466012 python3.9[162808]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:06 np0005466012 systemd[1]: Reloading.
Oct  2 07:51:06 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:06 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:06 np0005466012 systemd[1]: Starting Create netns directory...
Oct  2 07:51:06 np0005466012 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:51:06 np0005466012 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:51:06 np0005466012 systemd[1]: Finished Create netns directory.
Oct  2 07:51:07 np0005466012 podman[162974]: 2025-10-02 11:51:07.349944028 +0000 UTC m=+0.080743121 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 07:51:07 np0005466012 python3.9[163017]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:08 np0005466012 python3.9[163173]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:08 np0005466012 python3.9[163296]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405867.7836561-785-190493011517450/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:10 np0005466012 python3.9[163448]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:10 np0005466012 python3.9[163600]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:11 np0005466012 python3.9[163723]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405870.3343287-860-247863318018253/.source.json _original_basename=.2rm088w1 follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:12 np0005466012 python3.9[163875]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:14 np0005466012 python3.9[164302]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct  2 07:51:15 np0005466012 python3.9[164454]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:51:16 np0005466012 python3.9[164606]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  2 07:51:18 np0005466012 python3[164784]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:51:18 np0005466012 podman[164820]: 2025-10-02 11:51:18.229503646 +0000 UTC m=+0.028800989 image pull 1b3fd7f2436e5c6f2e28c01b83721476c7b295789c77b3d63e30f49404389ea1 quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  2 07:51:18 np0005466012 podman[164820]: 2025-10-02 11:51:18.405794012 +0000 UTC m=+0.205091355 container create 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, config_id=iscsid)
Oct  2 07:51:18 np0005466012 python3[164784]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  2 07:51:19 np0005466012 python3.9[165010]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:20 np0005466012 python3.9[165164]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:20 np0005466012 python3.9[165240]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:21 np0005466012 python3.9[165391]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759405880.816756-1124-48069655282963/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:22 np0005466012 python3.9[165467]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:51:22 np0005466012 systemd[1]: Reloading.
Oct  2 07:51:22 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:22 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:23 np0005466012 python3.9[165578]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:23 np0005466012 systemd[1]: Reloading.
Oct  2 07:51:23 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:23 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:23 np0005466012 systemd[1]: Starting iscsid container...
Oct  2 07:51:23 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:51:23 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53b0d85047183a359632ca7e84bb6468c1ee9ad932b546f18066b70a862499ab/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:51:23 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53b0d85047183a359632ca7e84bb6468c1ee9ad932b546f18066b70a862499ab/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct  2 07:51:23 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53b0d85047183a359632ca7e84bb6468c1ee9ad932b546f18066b70a862499ab/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:51:23 np0005466012 systemd[1]: Started /usr/bin/podman healthcheck run 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf.
Oct  2 07:51:23 np0005466012 podman[165618]: 2025-10-02 11:51:23.818582698 +0000 UTC m=+0.318661463 container init 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:51:23 np0005466012 iscsid[165634]: + sudo -E kolla_set_configs
Oct  2 07:51:23 np0005466012 podman[165618]: 2025-10-02 11:51:23.851621304 +0000 UTC m=+0.351699949 container start 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid)
Oct  2 07:51:23 np0005466012 systemd[1]: Created slice User Slice of UID 0.
Oct  2 07:51:23 np0005466012 podman[165618]: iscsid
Oct  2 07:51:23 np0005466012 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  2 07:51:23 np0005466012 systemd[1]: Started iscsid container.
Oct  2 07:51:23 np0005466012 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  2 07:51:23 np0005466012 systemd[1]: Starting User Manager for UID 0...
Oct  2 07:51:23 np0005466012 podman[165641]: 2025-10-02 11:51:23.915798252 +0000 UTC m=+0.055633523 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 07:51:23 np0005466012 systemd[1]: 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf-28e202f16244a2f9.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:51:23 np0005466012 systemd[1]: 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf-28e202f16244a2f9.service: Failed with result 'exit-code'.
Oct  2 07:51:24 np0005466012 systemd[165660]: Queued start job for default target Main User Target.
Oct  2 07:51:24 np0005466012 systemd[165660]: Created slice User Application Slice.
Oct  2 07:51:24 np0005466012 systemd[165660]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  2 07:51:24 np0005466012 systemd[165660]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 07:51:24 np0005466012 systemd[165660]: Reached target Paths.
Oct  2 07:51:24 np0005466012 systemd[165660]: Reached target Timers.
Oct  2 07:51:24 np0005466012 systemd[165660]: Starting D-Bus User Message Bus Socket...
Oct  2 07:51:24 np0005466012 systemd[165660]: Starting Create User's Volatile Files and Directories...
Oct  2 07:51:24 np0005466012 systemd[165660]: Listening on D-Bus User Message Bus Socket.
Oct  2 07:51:24 np0005466012 systemd[165660]: Reached target Sockets.
Oct  2 07:51:24 np0005466012 systemd[165660]: Finished Create User's Volatile Files and Directories.
Oct  2 07:51:24 np0005466012 systemd[165660]: Reached target Basic System.
Oct  2 07:51:24 np0005466012 systemd[165660]: Reached target Main User Target.
Oct  2 07:51:24 np0005466012 systemd[165660]: Startup finished in 118ms.
Oct  2 07:51:24 np0005466012 systemd[1]: Started User Manager for UID 0.
Oct  2 07:51:24 np0005466012 systemd[1]: Started Session c3 of User root.
Oct  2 07:51:24 np0005466012 iscsid[165634]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:51:24 np0005466012 iscsid[165634]: INFO:__main__:Validating config file
Oct  2 07:51:24 np0005466012 iscsid[165634]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:51:24 np0005466012 iscsid[165634]: INFO:__main__:Writing out command to execute
Oct  2 07:51:24 np0005466012 systemd[1]: session-c3.scope: Deactivated successfully.
Oct  2 07:51:24 np0005466012 iscsid[165634]: ++ cat /run_command
Oct  2 07:51:24 np0005466012 iscsid[165634]: + CMD='/usr/sbin/iscsid -f'
Oct  2 07:51:24 np0005466012 iscsid[165634]: + ARGS=
Oct  2 07:51:24 np0005466012 iscsid[165634]: + sudo kolla_copy_cacerts
Oct  2 07:51:24 np0005466012 systemd[1]: Started Session c4 of User root.
Oct  2 07:51:24 np0005466012 systemd[1]: session-c4.scope: Deactivated successfully.
Oct  2 07:51:24 np0005466012 iscsid[165634]: + [[ ! -n '' ]]
Oct  2 07:51:24 np0005466012 iscsid[165634]: + . kolla_extend_start
Oct  2 07:51:24 np0005466012 iscsid[165634]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct  2 07:51:24 np0005466012 iscsid[165634]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct  2 07:51:24 np0005466012 iscsid[165634]: Running command: '/usr/sbin/iscsid -f'
Oct  2 07:51:24 np0005466012 iscsid[165634]: + umask 0022
Oct  2 07:51:24 np0005466012 iscsid[165634]: + exec /usr/sbin/iscsid -f
Oct  2 07:51:24 np0005466012 kernel: Loading iSCSI transport class v2.0-870.
Oct  2 07:51:24 np0005466012 python3.9[165840]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:25 np0005466012 python3.9[165992]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:26 np0005466012 python3.9[166144]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:51:26 np0005466012 network[166161]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:51:26 np0005466012 network[166162]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:51:26 np0005466012 network[166163]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:51:31 np0005466012 python3.9[166437]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  2 07:51:31 np0005466012 python3.9[166589]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct  2 07:51:32 np0005466012 python3.9[166745]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:33 np0005466012 python3.9[166868]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405892.2690742-1347-217933367477294/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:34 np0005466012 podman[166992]: 2025-10-02 11:51:34.100193703 +0000 UTC m=+0.076598224 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:51:34 np0005466012 python3.9[167038]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:34 np0005466012 systemd[1]: Stopping User Manager for UID 0...
Oct  2 07:51:34 np0005466012 systemd[165660]: Activating special unit Exit the Session...
Oct  2 07:51:34 np0005466012 systemd[165660]: Stopped target Main User Target.
Oct  2 07:51:34 np0005466012 systemd[165660]: Stopped target Basic System.
Oct  2 07:51:34 np0005466012 systemd[165660]: Stopped target Paths.
Oct  2 07:51:34 np0005466012 systemd[165660]: Stopped target Sockets.
Oct  2 07:51:34 np0005466012 systemd[165660]: Stopped target Timers.
Oct  2 07:51:34 np0005466012 systemd[165660]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 07:51:34 np0005466012 systemd[165660]: Closed D-Bus User Message Bus Socket.
Oct  2 07:51:34 np0005466012 systemd[165660]: Stopped Create User's Volatile Files and Directories.
Oct  2 07:51:34 np0005466012 systemd[165660]: Removed slice User Application Slice.
Oct  2 07:51:34 np0005466012 systemd[165660]: Reached target Shutdown.
Oct  2 07:51:34 np0005466012 systemd[165660]: Finished Exit the Session.
Oct  2 07:51:34 np0005466012 systemd[165660]: Reached target Exit the Session.
Oct  2 07:51:34 np0005466012 systemd[1]: user@0.service: Deactivated successfully.
Oct  2 07:51:34 np0005466012 systemd[1]: Stopped User Manager for UID 0.
Oct  2 07:51:34 np0005466012 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  2 07:51:34 np0005466012 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  2 07:51:34 np0005466012 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  2 07:51:34 np0005466012 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  2 07:51:34 np0005466012 systemd[1]: Removed slice User Slice of UID 0.
Oct  2 07:51:35 np0005466012 python3.9[167200]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:51:35 np0005466012 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  2 07:51:35 np0005466012 systemd[1]: Stopped Load Kernel Modules.
Oct  2 07:51:35 np0005466012 systemd[1]: Stopping Load Kernel Modules...
Oct  2 07:51:35 np0005466012 systemd[1]: Starting Load Kernel Modules...
Oct  2 07:51:35 np0005466012 systemd[1]: Finished Load Kernel Modules.
Oct  2 07:51:36 np0005466012 python3.9[167356]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:36 np0005466012 python3.9[167508]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:37 np0005466012 podman[167632]: 2025-10-02 11:51:37.573050683 +0000 UTC m=+0.056588649 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS)
Oct  2 07:51:37 np0005466012 python3.9[167681]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:38 np0005466012 python3.9[167833]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:39 np0005466012 python3.9[167956]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405897.9273255-1520-30821769723347/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:39 np0005466012 python3.9[168108]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:51:40 np0005466012 python3.9[168261]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:41 np0005466012 python3.9[168413]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:42 np0005466012 python3.9[168565]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:43 np0005466012 python3.9[168717]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:43 np0005466012 python3.9[168869]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:44 np0005466012 python3.9[169021]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:44 np0005466012 python3.9[169173]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:45 np0005466012 python3.9[169325]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:51:46 np0005466012 python3.9[169479]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:47 np0005466012 python3.9[169631]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:48 np0005466012 python3.9[169783]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:48 np0005466012 python3.9[169861]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:49 np0005466012 python3.9[170013]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:49 np0005466012 python3.9[170091]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:50 np0005466012 python3.9[170243]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:51 np0005466012 python3.9[170395]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:51 np0005466012 python3.9[170473]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:52 np0005466012 python3.9[170625]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:53 np0005466012 python3.9[170703]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:53 np0005466012 python3.9[170855]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:54 np0005466012 systemd[1]: Reloading.
Oct  2 07:51:54 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:54 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:54 np0005466012 podman[170857]: 2025-10-02 11:51:54.133482365 +0000 UTC m=+0.104683102 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:51:55 np0005466012 python3.9[171063]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:55 np0005466012 python3.9[171141]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:56 np0005466012 python3.9[171293]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:56 np0005466012 python3.9[171371]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:51:57 np0005466012 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct  2 07:51:57 np0005466012 python3.9[171524]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:51:57 np0005466012 systemd[1]: Reloading.
Oct  2 07:51:57 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:51:57 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:51:57 np0005466012 systemd[1]: Starting Create netns directory...
Oct  2 07:51:57 np0005466012 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  2 07:51:57 np0005466012 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  2 07:51:57 np0005466012 systemd[1]: Finished Create netns directory.
Oct  2 07:51:58 np0005466012 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  2 07:51:58 np0005466012 python3.9[171719]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:51:59 np0005466012 systemd[1]: virtqemud.service: Deactivated successfully.
Oct  2 07:51:59 np0005466012 python3.9[171872]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:51:59 np0005466012 python3.9[171995]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759405918.9932654-2141-187314840651825/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:00 np0005466012 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct  2 07:52:00 np0005466012 python3.9[172148]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:52:01 np0005466012 python3.9[172300]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:52:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:52:02.100 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:52:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:52:02.102 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:52:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:52:02.102 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:52:02 np0005466012 python3.9[172423]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405921.1712167-2216-100268848448975/.source.json _original_basename=.x4ad6hz0 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:03 np0005466012 python3.9[172575]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:04 np0005466012 podman[172822]: 2025-10-02 11:52:04.316780134 +0000 UTC m=+0.120982485 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2)
Oct  2 07:52:05 np0005466012 python3.9[173028]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct  2 07:52:06 np0005466012 python3.9[173180]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:52:06 np0005466012 python3.9[173332]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  2 07:52:08 np0005466012 podman[173435]: 2025-10-02 11:52:08.166961891 +0000 UTC m=+0.082447695 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 07:52:08 np0005466012 python3[173529]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:52:08 np0005466012 podman[173566]: 2025-10-02 11:52:08.813736287 +0000 UTC m=+0.067014779 container create 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 07:52:08 np0005466012 podman[173566]: 2025-10-02 11:52:08.787634964 +0000 UTC m=+0.040913476 image pull d8d739f82a6fecf9df690e49539b589e74665b54e36448657b874630717d5bd1 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  2 07:52:08 np0005466012 python3[173529]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  2 07:52:10 np0005466012 python3.9[173756]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:52:11 np0005466012 python3.9[173910]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:11 np0005466012 python3.9[173986]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:52:12 np0005466012 python3.9[174137]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759405932.0647795-2480-20815099755408/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:13 np0005466012 python3.9[174213]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:52:13 np0005466012 systemd[1]: Reloading.
Oct  2 07:52:13 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:52:13 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:52:14 np0005466012 python3.9[174324]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:14 np0005466012 systemd[1]: Reloading.
Oct  2 07:52:14 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:52:14 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:52:14 np0005466012 systemd[1]: Starting multipathd container...
Oct  2 07:52:14 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:52:14 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0656cecd62945c0c4230fcc9a722874d0ae0beaa560f59bb8ef73ed2ad6982ae/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  2 07:52:14 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0656cecd62945c0c4230fcc9a722874d0ae0beaa560f59bb8ef73ed2ad6982ae/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:52:14 np0005466012 systemd[1]: Started /usr/bin/podman healthcheck run 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4.
Oct  2 07:52:14 np0005466012 podman[174363]: 2025-10-02 11:52:14.812752989 +0000 UTC m=+0.138342085 container init 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 07:52:14 np0005466012 multipathd[174378]: + sudo -E kolla_set_configs
Oct  2 07:52:14 np0005466012 podman[174363]: 2025-10-02 11:52:14.844262513 +0000 UTC m=+0.169851589 container start 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:52:14 np0005466012 podman[174363]: multipathd
Oct  2 07:52:14 np0005466012 systemd[1]: Started multipathd container.
Oct  2 07:52:14 np0005466012 podman[174384]: 2025-10-02 11:52:14.91381795 +0000 UTC m=+0.059695995 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 07:52:14 np0005466012 systemd[1]: 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4-300c17aebe657228.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:52:14 np0005466012 systemd[1]: 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4-300c17aebe657228.service: Failed with result 'exit-code'.
Oct  2 07:52:14 np0005466012 multipathd[174378]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:52:14 np0005466012 multipathd[174378]: INFO:__main__:Validating config file
Oct  2 07:52:14 np0005466012 multipathd[174378]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:52:14 np0005466012 multipathd[174378]: INFO:__main__:Writing out command to execute
Oct  2 07:52:14 np0005466012 multipathd[174378]: ++ cat /run_command
Oct  2 07:52:14 np0005466012 multipathd[174378]: + CMD='/usr/sbin/multipathd -d'
Oct  2 07:52:14 np0005466012 multipathd[174378]: + ARGS=
Oct  2 07:52:14 np0005466012 multipathd[174378]: + sudo kolla_copy_cacerts
Oct  2 07:52:14 np0005466012 multipathd[174378]: + [[ ! -n '' ]]
Oct  2 07:52:14 np0005466012 multipathd[174378]: + . kolla_extend_start
Oct  2 07:52:14 np0005466012 multipathd[174378]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  2 07:52:14 np0005466012 multipathd[174378]: Running command: '/usr/sbin/multipathd -d'
Oct  2 07:52:14 np0005466012 multipathd[174378]: + umask 0022
Oct  2 07:52:14 np0005466012 multipathd[174378]: + exec /usr/sbin/multipathd -d
Oct  2 07:52:14 np0005466012 multipathd[174378]: 3928.667543 | --------start up--------
Oct  2 07:52:14 np0005466012 multipathd[174378]: 3928.667570 | read /etc/multipath.conf
Oct  2 07:52:14 np0005466012 multipathd[174378]: 3928.675362 | path checkers start up
Oct  2 07:52:15 np0005466012 python3.9[174568]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:52:16 np0005466012 python3.9[174722]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:52:17 np0005466012 python3.9[174887]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:52:17 np0005466012 systemd[1]: Stopping multipathd container...
Oct  2 07:52:17 np0005466012 multipathd[174378]: 3931.036535 | exit (signal)
Oct  2 07:52:17 np0005466012 multipathd[174378]: 3931.036956 | --------shut down-------
Oct  2 07:52:17 np0005466012 systemd[1]: libpod-6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4.scope: Deactivated successfully.
Oct  2 07:52:17 np0005466012 podman[174891]: 2025-10-02 11:52:17.402115134 +0000 UTC m=+0.081378107 container died 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:52:17 np0005466012 systemd[1]: 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4-300c17aebe657228.timer: Deactivated successfully.
Oct  2 07:52:17 np0005466012 systemd[1]: Stopped /usr/bin/podman healthcheck run 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4.
Oct  2 07:52:17 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4-userdata-shm.mount: Deactivated successfully.
Oct  2 07:52:17 np0005466012 systemd[1]: var-lib-containers-storage-overlay-0656cecd62945c0c4230fcc9a722874d0ae0beaa560f59bb8ef73ed2ad6982ae-merged.mount: Deactivated successfully.
Oct  2 07:52:17 np0005466012 podman[174891]: 2025-10-02 11:52:17.544600822 +0000 UTC m=+0.223863785 container cleanup 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:52:17 np0005466012 podman[174891]: multipathd
Oct  2 07:52:17 np0005466012 podman[174919]: multipathd
Oct  2 07:52:17 np0005466012 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct  2 07:52:17 np0005466012 systemd[1]: Stopped multipathd container.
Oct  2 07:52:17 np0005466012 systemd[1]: Starting multipathd container...
Oct  2 07:52:17 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:52:17 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0656cecd62945c0c4230fcc9a722874d0ae0beaa560f59bb8ef73ed2ad6982ae/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  2 07:52:17 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0656cecd62945c0c4230fcc9a722874d0ae0beaa560f59bb8ef73ed2ad6982ae/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:52:17 np0005466012 systemd[1]: Started /usr/bin/podman healthcheck run 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4.
Oct  2 07:52:17 np0005466012 podman[174932]: 2025-10-02 11:52:17.808934869 +0000 UTC m=+0.143327834 container init 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 07:52:17 np0005466012 multipathd[174949]: + sudo -E kolla_set_configs
Oct  2 07:52:17 np0005466012 podman[174932]: 2025-10-02 11:52:17.844747841 +0000 UTC m=+0.179140746 container start 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 07:52:17 np0005466012 podman[174932]: multipathd
Oct  2 07:52:17 np0005466012 systemd[1]: Started multipathd container.
Oct  2 07:52:17 np0005466012 multipathd[174949]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:52:17 np0005466012 multipathd[174949]: INFO:__main__:Validating config file
Oct  2 07:52:17 np0005466012 multipathd[174949]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:52:17 np0005466012 multipathd[174949]: INFO:__main__:Writing out command to execute
Oct  2 07:52:17 np0005466012 multipathd[174949]: ++ cat /run_command
Oct  2 07:52:17 np0005466012 multipathd[174949]: + CMD='/usr/sbin/multipathd -d'
Oct  2 07:52:17 np0005466012 multipathd[174949]: + ARGS=
Oct  2 07:52:17 np0005466012 multipathd[174949]: + sudo kolla_copy_cacerts
Oct  2 07:52:17 np0005466012 multipathd[174949]: Running command: '/usr/sbin/multipathd -d'
Oct  2 07:52:17 np0005466012 multipathd[174949]: + [[ ! -n '' ]]
Oct  2 07:52:17 np0005466012 multipathd[174949]: + . kolla_extend_start
Oct  2 07:52:17 np0005466012 multipathd[174949]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  2 07:52:17 np0005466012 multipathd[174949]: + umask 0022
Oct  2 07:52:17 np0005466012 multipathd[174949]: + exec /usr/sbin/multipathd -d
Oct  2 07:52:17 np0005466012 multipathd[174949]: 3931.612805 | --------start up--------
Oct  2 07:52:17 np0005466012 multipathd[174949]: 3931.612819 | read /etc/multipath.conf
Oct  2 07:52:17 np0005466012 podman[174956]: 2025-10-02 11:52:17.937888032 +0000 UTC m=+0.085183512 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 07:52:17 np0005466012 multipathd[174949]: 3931.619417 | path checkers start up
Oct  2 07:52:17 np0005466012 systemd[1]: 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4-5b7d79fd5e33544d.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:52:17 np0005466012 systemd[1]: 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4-5b7d79fd5e33544d.service: Failed with result 'exit-code'.
Oct  2 07:52:18 np0005466012 python3.9[175141]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:19 np0005466012 python3.9[175293]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  2 07:52:20 np0005466012 python3.9[175445]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct  2 07:52:20 np0005466012 kernel: Key type psk registered
Oct  2 07:52:21 np0005466012 python3.9[175608]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:52:22 np0005466012 python3.9[175731]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759405940.8389304-2720-203515306894758/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:23 np0005466012 python3.9[175883]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:24 np0005466012 python3.9[176035]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:52:24 np0005466012 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  2 07:52:24 np0005466012 systemd[1]: Stopped Load Kernel Modules.
Oct  2 07:52:24 np0005466012 systemd[1]: Stopping Load Kernel Modules...
Oct  2 07:52:24 np0005466012 systemd[1]: Starting Load Kernel Modules...
Oct  2 07:52:24 np0005466012 systemd[1]: Finished Load Kernel Modules.
Oct  2 07:52:24 np0005466012 podman[176163]: 2025-10-02 11:52:24.704436477 +0000 UTC m=+0.050831270 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:52:24 np0005466012 python3.9[176208]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  2 07:52:26 np0005466012 python3.9[176293]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  2 07:52:32 np0005466012 systemd[1]: Reloading.
Oct  2 07:52:32 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:52:32 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:52:32 np0005466012 systemd[1]: Reloading.
Oct  2 07:52:32 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:52:32 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:52:32 np0005466012 systemd-logind[827]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  2 07:52:32 np0005466012 systemd-logind[827]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  2 07:52:32 np0005466012 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  2 07:52:32 np0005466012 systemd[1]: Starting man-db-cache-update.service...
Oct  2 07:52:32 np0005466012 systemd[1]: Reloading.
Oct  2 07:52:33 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:52:33 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:52:33 np0005466012 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  2 07:52:34 np0005466012 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  2 07:52:34 np0005466012 systemd[1]: Finished man-db-cache-update.service.
Oct  2 07:52:34 np0005466012 systemd[1]: man-db-cache-update.service: Consumed 1.595s CPU time.
Oct  2 07:52:34 np0005466012 systemd[1]: run-r3ae250ea3292465995c6d6bceff3490f.service: Deactivated successfully.
Oct  2 07:52:34 np0005466012 podman[177619]: 2025-10-02 11:52:34.541740277 +0000 UTC m=+0.072338546 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:52:35 np0005466012 python3.9[177774]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:35 np0005466012 python3.9[177924]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:52:36 np0005466012 python3.9[178080]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:38 np0005466012 python3.9[178232]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:52:38 np0005466012 systemd[1]: Reloading.
Oct  2 07:52:38 np0005466012 podman[178234]: 2025-10-02 11:52:38.453391268 +0000 UTC m=+0.068206001 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:52:38 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:52:38 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:52:39 np0005466012 python3.9[178436]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:52:39 np0005466012 network[178453]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:52:39 np0005466012 network[178454]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:52:39 np0005466012 network[178455]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:52:44 np0005466012 python3.9[178732]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:45 np0005466012 python3.9[178885]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:45 np0005466012 python3.9[179038]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:46 np0005466012 python3.9[179191]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:47 np0005466012 python3.9[179344]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:48 np0005466012 python3.9[179497]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:48 np0005466012 podman[179498]: 2025-10-02 11:52:48.150324979 +0000 UTC m=+0.063113941 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:52:48 np0005466012 python3.9[179671]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:49 np0005466012 python3.9[179824]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:52:51 np0005466012 python3.9[179977]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:52 np0005466012 python3.9[180129]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:53 np0005466012 python3.9[180281]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:53 np0005466012 python3.9[180433]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:54 np0005466012 python3.9[180585]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:54 np0005466012 podman[180737]: 2025-10-02 11:52:54.849609328 +0000 UTC m=+0.069173258 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, tcib_managed=true, container_name=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 07:52:54 np0005466012 python3.9[180738]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:55 np0005466012 python3.9[180909]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:56 np0005466012 python3.9[181061]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:57 np0005466012 python3.9[181213]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:57 np0005466012 python3.9[181365]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:58 np0005466012 python3.9[181517]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:59 np0005466012 python3.9[181669]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:52:59 np0005466012 python3.9[181821]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:00 np0005466012 python3.9[181973]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:00 np0005466012 python3.9[182125]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:01 np0005466012 python3.9[182277]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:53:02.102 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:53:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:53:02.102 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:53:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:53:02.102 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:53:02 np0005466012 python3.9[182429]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:03 np0005466012 python3.9[182581]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:53:04 np0005466012 python3.9[182733]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:53:04 np0005466012 systemd[1]: Reloading.
Oct  2 07:53:05 np0005466012 podman[182735]: 2025-10-02 11:53:05.076152487 +0000 UTC m=+0.086762345 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 07:53:05 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:53:05 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:53:06 np0005466012 python3.9[182946]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:06 np0005466012 python3.9[183099]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:07 np0005466012 python3.9[183252]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:07 np0005466012 python3.9[183405]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:08 np0005466012 python3.9[183558]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:09 np0005466012 podman[183683]: 2025-10-02 11:53:09.048416029 +0000 UTC m=+0.057410833 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct  2 07:53:09 np0005466012 python3.9[183728]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:09 np0005466012 python3.9[183881]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:10 np0005466012 python3.9[184035]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:53:12 np0005466012 python3.9[184188]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:13 np0005466012 python3.9[184340]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:13 np0005466012 python3.9[184492]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:14 np0005466012 python3.9[184644]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:15 np0005466012 python3.9[184796]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:15 np0005466012 python3.9[184948]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:16 np0005466012 python3.9[185100]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:17 np0005466012 python3.9[185252]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:17 np0005466012 python3.9[185404]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:18 np0005466012 podman[185528]: 2025-10-02 11:53:18.366867537 +0000 UTC m=+0.049352270 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible)
Oct  2 07:53:18 np0005466012 python3.9[185575]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:19 np0005466012 python3.9[185727]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:19 np0005466012 python3.9[185879]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:24 np0005466012 podman[186003]: 2025-10-02 11:53:24.938404615 +0000 UTC m=+0.049794391 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 07:53:25 np0005466012 python3.9[186053]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct  2 07:53:25 np0005466012 python3.9[186206]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:53:27 np0005466012 python3.9[186364]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:53:28 np0005466012 systemd-logind[827]: New session 27 of user zuul.
Oct  2 07:53:28 np0005466012 systemd[1]: Started Session 27 of User zuul.
Oct  2 07:53:28 np0005466012 systemd[1]: session-27.scope: Deactivated successfully.
Oct  2 07:53:28 np0005466012 systemd-logind[827]: Session 27 logged out. Waiting for processes to exit.
Oct  2 07:53:28 np0005466012 systemd-logind[827]: Removed session 27.
Oct  2 07:53:29 np0005466012 python3.9[186550]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:29 np0005466012 python3.9[186671]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406008.6822705-4337-170588893537785/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:30 np0005466012 python3.9[186821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:30 np0005466012 python3.9[186897]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:31 np0005466012 python3.9[187047]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:32 np0005466012 python3.9[187168]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406011.0148444-4337-276506550733101/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:32 np0005466012 python3.9[187318]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:33 np0005466012 python3.9[187439]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406012.3554337-4337-188266317790645/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:34 np0005466012 python3.9[187589]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:34 np0005466012 python3.9[187710]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406013.4998026-4337-77612387327568/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:35 np0005466012 python3.9[187862]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:35 np0005466012 podman[187863]: 2025-10-02 11:53:35.576025345 +0000 UTC m=+0.077698954 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 07:53:36 np0005466012 python3.9[188041]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:37 np0005466012 python3.9[188193]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:37 np0005466012 python3.9[188345]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:38 np0005466012 python3.9[188468]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759406017.3908503-4617-247658921470096/.source _original_basename=._0xw2ere follow=False checksum=e1799ed2bd270957b22c075c7fd9ed31771edd6f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct  2 07:53:39 np0005466012 podman[188594]: 2025-10-02 11:53:39.131487577 +0000 UTC m=+0.049579985 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 07:53:39 np0005466012 python3.9[188634]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:40 np0005466012 python3.9[188788]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:40 np0005466012 python3.9[188909]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406019.5761144-4694-127805932889666/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=f022386746472553146d29f689b545df70fa8a60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:41 np0005466012 python3.9[189059]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:53:42 np0005466012 python3.9[189180]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406021.034984-4739-83914721830430/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:53:43 np0005466012 python3.9[189332]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct  2 07:53:44 np0005466012 python3.9[189484]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:53:44 np0005466012 python3[189639]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:53:45 np0005466012 podman[189676]: 2025-10-02 11:53:45.079410156 +0000 UTC m=+0.021736584 image pull e36f31143f26011980def9337d375f895bea59b742a3a2b372b996aa8ad58eba quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  2 07:53:45 np0005466012 podman[189676]: 2025-10-02 11:53:45.184040827 +0000 UTC m=+0.126367225 container create ed15976631a4e1e1255e0530f3fd5bec46b4af68aefe1ced91e7734517d1892f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251001)
Oct  2 07:53:45 np0005466012 python3[189639]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct  2 07:53:45 np0005466012 python3.9[189864]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:47 np0005466012 python3.9[190018]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct  2 07:53:47 np0005466012 python3.9[190170]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:53:48 np0005466012 podman[190295]: 2025-10-02 11:53:48.725855855 +0000 UTC m=+0.052085821 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 07:53:48 np0005466012 python3[190345]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:53:49 np0005466012 podman[190379]: 2025-10-02 11:53:49.159117621 +0000 UTC m=+0.051853575 container create 79d7261bc642b1d250fcfb7e93a6c598370b3b815ed0ab4728b185c0c058d356 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 07:53:49 np0005466012 podman[190379]: 2025-10-02 11:53:49.127745769 +0000 UTC m=+0.020481763 image pull e36f31143f26011980def9337d375f895bea59b742a3a2b372b996aa8ad58eba quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  2 07:53:49 np0005466012 python3[190345]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Oct  2 07:53:50 np0005466012 python3.9[190571]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:50 np0005466012 python3.9[190725]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:51 np0005466012 python3.9[190876]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759406030.9833887-5015-124985250895460/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:53:52 np0005466012 python3.9[190952]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:53:52 np0005466012 systemd[1]: Reloading.
Oct  2 07:53:52 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:53:52 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:53:53 np0005466012 python3.9[191065]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:53:53 np0005466012 systemd[1]: Reloading.
Oct  2 07:53:53 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:53:53 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:53:53 np0005466012 systemd[1]: Starting nova_compute container...
Oct  2 07:53:53 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:53:53 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9b11f4e86b9dda160a5c815dcf1aef6f5517c91e2b71b2f32c2bfe07361e06/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:53 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9b11f4e86b9dda160a5c815dcf1aef6f5517c91e2b71b2f32c2bfe07361e06/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:53 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9b11f4e86b9dda160a5c815dcf1aef6f5517c91e2b71b2f32c2bfe07361e06/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:53 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9b11f4e86b9dda160a5c815dcf1aef6f5517c91e2b71b2f32c2bfe07361e06/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:53 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9b11f4e86b9dda160a5c815dcf1aef6f5517c91e2b71b2f32c2bfe07361e06/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:53:53 np0005466012 podman[191105]: 2025-10-02 11:53:53.88083547 +0000 UTC m=+0.133878450 container init 79d7261bc642b1d250fcfb7e93a6c598370b3b815ed0ab4728b185c0c058d356 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  2 07:53:53 np0005466012 podman[191105]: 2025-10-02 11:53:53.886773014 +0000 UTC m=+0.139815974 container start 79d7261bc642b1d250fcfb7e93a6c598370b3b815ed0ab4728b185c0c058d356 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  2 07:53:53 np0005466012 nova_compute[191120]: + sudo -E kolla_set_configs
Oct  2 07:53:53 np0005466012 podman[191105]: nova_compute
Oct  2 07:53:53 np0005466012 systemd[1]: Started nova_compute container.
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Validating config file
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Copying service configuration files
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Deleting /etc/ceph
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Creating directory /etc/ceph
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Setting permission for /etc/ceph
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Writing out command to execute
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:53:53 np0005466012 nova_compute[191120]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  2 07:53:53 np0005466012 nova_compute[191120]: ++ cat /run_command
Oct  2 07:53:53 np0005466012 nova_compute[191120]: + CMD=nova-compute
Oct  2 07:53:53 np0005466012 nova_compute[191120]: + ARGS=
Oct  2 07:53:53 np0005466012 nova_compute[191120]: + sudo kolla_copy_cacerts
Oct  2 07:53:54 np0005466012 nova_compute[191120]: + [[ ! -n '' ]]
Oct  2 07:53:54 np0005466012 nova_compute[191120]: + . kolla_extend_start
Oct  2 07:53:54 np0005466012 nova_compute[191120]: Running command: 'nova-compute'
Oct  2 07:53:54 np0005466012 nova_compute[191120]: + echo 'Running command: '\''nova-compute'\'''
Oct  2 07:53:54 np0005466012 nova_compute[191120]: + umask 0022
Oct  2 07:53:54 np0005466012 nova_compute[191120]: + exec nova-compute
Oct  2 07:53:55 np0005466012 python3.9[191283]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:55 np0005466012 podman[191284]: 2025-10-02 11:53:55.165818374 +0000 UTC m=+0.083642248 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:53:55 np0005466012 nova_compute[191120]: 2025-10-02 11:53:55.949 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:53:55 np0005466012 nova_compute[191120]: 2025-10-02 11:53:55.949 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:53:55 np0005466012 nova_compute[191120]: 2025-10-02 11:53:55.949 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:53:55 np0005466012 nova_compute[191120]: 2025-10-02 11:53:55.949 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  2 07:53:56 np0005466012 python3.9[191457]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:56 np0005466012 nova_compute[191120]: 2025-10-02 11:53:56.083 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:53:56 np0005466012 nova_compute[191120]: 2025-10-02 11:53:56.095 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:53:56 np0005466012 python3.9[191609]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:53:57 np0005466012 nova_compute[191120]: 2025-10-02 11:53:57.985 2 INFO nova.virt.driver [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.112 2 INFO nova.compute.provider_config [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.273 2 DEBUG oslo_concurrency.lockutils [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.274 2 DEBUG oslo_concurrency.lockutils [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.274 2 DEBUG oslo_concurrency.lockutils [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.274 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.274 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.275 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.275 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.275 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.275 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.275 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.276 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.276 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.276 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.276 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.276 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.276 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.277 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.277 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.277 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.277 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.277 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.277 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.278 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.278 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.278 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.278 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.278 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.279 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.279 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.279 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.279 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.279 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.280 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.280 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.280 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.280 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.280 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.280 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.281 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.281 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.281 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.281 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.281 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.282 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.282 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.282 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.282 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.282 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.283 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.283 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.283 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.283 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.283 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.284 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.284 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.284 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.284 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.284 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.284 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.285 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.285 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.285 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.285 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.285 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.286 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.286 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.286 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.286 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.286 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.286 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.287 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.287 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.287 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.287 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.287 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.287 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.288 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.288 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.288 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.288 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.288 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.289 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.289 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.289 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.289 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.289 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.290 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.290 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.290 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.290 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.290 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.290 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.291 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.291 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.291 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.291 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.291 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.292 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.292 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.292 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.292 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.292 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.293 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.293 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.293 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.293 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.293 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.294 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.294 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.294 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.294 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.294 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.294 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.295 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.295 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.295 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.295 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.295 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.296 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.296 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.296 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.296 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.296 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.296 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.297 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.297 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.297 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.297 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.297 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.297 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.298 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.298 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.298 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.298 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.298 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.299 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.299 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.299 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.299 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.299 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.299 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.300 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.300 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.300 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.300 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.300 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.300 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.301 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.301 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.301 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.301 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.301 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.302 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.302 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.302 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.302 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.302 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.302 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.303 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.303 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.303 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.303 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.303 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.304 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.304 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.304 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.304 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.304 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.304 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.305 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.305 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.305 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.305 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.305 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.306 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.306 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.306 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.306 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.306 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.306 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.307 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.307 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.307 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.307 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.307 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.308 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.308 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.308 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.308 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.308 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.308 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.309 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.309 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.309 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.309 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.309 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.310 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.310 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.310 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.310 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.310 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.310 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.311 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.311 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.311 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.311 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.311 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.311 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.312 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.312 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.312 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.312 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.312 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.313 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.313 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.313 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.313 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.313 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.313 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.314 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.314 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.314 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.314 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.314 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.315 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.315 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.315 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.315 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.315 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.315 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.316 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.316 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.316 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.316 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.316 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.316 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.317 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.317 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.317 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.317 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.317 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.318 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.318 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.318 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.318 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.318 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.318 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.319 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.319 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.319 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.319 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.320 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.320 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.320 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.320 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.320 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.321 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.321 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.321 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.321 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.321 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.321 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.322 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.322 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.322 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.322 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.322 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.323 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.323 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.323 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.323 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.323 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.323 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.324 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.324 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.324 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.324 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.324 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.324 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.325 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.325 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.325 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.325 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.325 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.326 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.326 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.326 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.326 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.326 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.326 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.327 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.327 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.327 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.327 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.327 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.328 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.328 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.328 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.328 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.328 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.328 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.329 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.329 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.329 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.329 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.329 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.330 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.330 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.330 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.330 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.330 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.330 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.331 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.331 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.331 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.331 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.331 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.332 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.332 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.332 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.332 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.332 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.332 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.333 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.333 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.333 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.333 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.333 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.334 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.334 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.334 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.334 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.334 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.334 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.335 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.335 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.335 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.335 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.335 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.335 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.336 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.336 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.336 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.336 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.336 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.337 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.337 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.337 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.337 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.337 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.337 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.338 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.338 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.338 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.338 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.338 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.338 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.339 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.339 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.339 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.339 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.340 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.340 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.340 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.340 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.340 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.341 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.341 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.341 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.341 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.341 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.341 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.342 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.342 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.342 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.342 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.342 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.343 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.343 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.343 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.343 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.343 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.343 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.344 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.344 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.344 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.344 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.344 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.344 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.345 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.345 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.345 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.345 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.346 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.346 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.346 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.346 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.346 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.347 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.347 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.347 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.347 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.347 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.347 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.348 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.349 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.349 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.349 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.349 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.349 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.350 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.350 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.350 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.350 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.350 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.350 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.351 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.351 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.351 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.351 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.351 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.352 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.352 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.352 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.352 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.352 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.352 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.353 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.353 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.353 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.353 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.353 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.353 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.354 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.354 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.354 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.354 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.354 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.355 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.355 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.355 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.355 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.355 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.355 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.356 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.356 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.356 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.356 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.356 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.357 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.357 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.357 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.357 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.357 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.357 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.358 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.358 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.358 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.358 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.358 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.359 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.359 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.359 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.359 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.359 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.359 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.360 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.360 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.360 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.360 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.360 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.361 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.361 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.361 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.361 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.361 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.361 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.362 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.362 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.362 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.362 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.362 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.362 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.363 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.363 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.363 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.363 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.363 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.364 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.364 2 WARNING oslo_config.cfg [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  2 07:53:58 np0005466012 nova_compute[191120]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  2 07:53:58 np0005466012 nova_compute[191120]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  2 07:53:58 np0005466012 nova_compute[191120]: and ``live_migration_inbound_addr`` respectively.
Oct  2 07:53:58 np0005466012 nova_compute[191120]: ).  Its value may be silently ignored in the future.#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.364 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.364 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.365 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.365 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.365 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.365 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.365 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.366 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.366 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.366 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.366 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.366 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.366 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.367 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.367 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.367 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.367 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.367 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.368 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.368 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.368 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.368 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.368 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.368 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.369 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.369 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.369 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.369 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.369 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.370 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.370 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.370 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.370 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.370 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.371 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.371 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.371 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.371 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.371 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.372 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.372 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.372 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.372 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.372 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.372 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.373 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.373 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.373 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.373 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.374 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.374 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.374 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.374 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.374 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.375 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.375 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.375 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.375 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.375 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.375 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.376 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.376 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.376 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.376 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.376 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.376 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.377 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.377 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.377 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.377 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.377 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.377 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.378 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.378 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.378 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.378 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.378 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.379 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.379 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.379 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.379 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.379 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.379 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.380 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.380 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.380 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.380 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.380 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.381 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.381 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.381 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.381 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.381 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.381 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.382 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.382 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.382 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.382 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.382 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.383 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.383 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.383 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.383 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.383 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.383 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.383 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.384 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.384 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.384 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.384 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.384 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.384 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.385 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.385 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.385 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.385 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.385 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.385 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.386 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.386 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.386 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.386 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.386 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.387 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.388 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.389 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.389 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.389 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.389 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.389 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.390 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.391 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.392 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.393 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.395 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.395 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.395 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.396 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.396 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.396 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.396 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.396 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.396 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.397 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.397 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.397 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.397 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.397 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.397 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.398 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.398 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.398 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.398 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.398 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.399 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.399 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.399 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.399 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.399 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.399 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.400 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.400 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.400 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.400 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.400 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.401 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.401 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.401 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.401 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.401 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.401 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.401 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.402 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.402 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.402 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.402 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.402 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.402 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.403 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.403 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.403 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.403 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.403 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.403 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.404 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.404 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.404 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.404 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.404 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.404 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.405 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.405 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.405 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.405 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.405 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.405 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.405 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.406 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.406 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.406 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.406 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.406 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.406 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.406 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.407 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.407 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.407 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.407 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.407 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.407 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.407 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.408 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.408 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.408 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.408 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.408 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.408 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.409 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.409 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.409 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.409 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.409 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.409 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.409 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.410 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.410 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.410 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.410 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.410 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.410 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.410 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.411 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.411 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.411 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.411 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.411 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.411 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.411 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.412 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.412 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.412 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.412 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.412 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.412 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.412 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.412 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.413 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.413 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.413 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.413 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.413 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.413 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.413 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.414 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.414 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.414 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.414 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.414 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.414 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.415 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.415 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.415 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.415 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.415 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.415 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.415 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.416 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.416 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.416 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.416 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.416 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.417 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.417 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.417 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.417 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.417 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.417 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.417 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.418 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.418 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.418 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.418 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.418 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.419 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.419 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.419 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.419 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.419 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.419 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.419 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.420 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.420 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.420 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.420 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.420 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.421 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.421 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.421 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.421 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.421 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.421 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.422 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.422 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.422 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.422 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.422 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.423 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.423 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.423 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.423 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.423 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.423 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.424 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.424 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.424 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.424 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.424 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.424 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.425 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.425 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.425 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.425 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.425 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.426 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.426 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.426 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.426 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.426 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.426 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.427 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.427 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.427 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.427 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.427 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.428 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.428 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.428 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.428 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.428 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.428 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.429 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.429 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.429 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.429 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.429 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.429 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.430 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.430 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.430 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.430 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.430 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.430 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.430 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.431 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.431 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.431 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.431 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.431 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.431 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.432 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.432 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.432 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.432 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.432 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.432 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.433 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.433 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.433 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.433 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.433 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.433 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.434 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.434 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.434 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.434 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.434 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.434 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.434 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.435 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.435 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.435 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.435 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.435 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.435 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.435 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.436 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.436 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.436 2 DEBUG oslo_service.service [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.437 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.477 2 DEBUG nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.478 2 DEBUG nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.478 2 DEBUG nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.478 2 DEBUG nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Oct  2 07:53:58 np0005466012 systemd[1]: Starting libvirt QEMU daemon...
Oct  2 07:53:58 np0005466012 systemd[1]: Started libvirt QEMU daemon.
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.552 2 DEBUG nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f047c7d9f40> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.556 2 DEBUG nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f047c7d9f40> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.557 2 INFO nova.virt.libvirt.driver [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Connection event '1' reason 'None'
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.592 2 WARNING nova.virt.libvirt.driver [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Oct  2 07:53:58 np0005466012 nova_compute[191120]: 2025-10-02 11:53:58.593 2 DEBUG nova.virt.libvirt.volume.mount [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct  2 07:53:58 np0005466012 python3.9[191761]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  2 07:53:58 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.423 2 INFO nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Libvirt host capabilities <capabilities>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <host>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <uuid>ebf0c2d8-5045-4169-abec-cc6f6092e35d</uuid>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <cpu>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <arch>x86_64</arch>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model>EPYC-Rome-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <vendor>AMD</vendor>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <microcode version='16777317'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <signature family='23' model='49' stepping='0'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <maxphysaddr mode='emulate' bits='40'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='x2apic'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='tsc-deadline'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='osxsave'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='hypervisor'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='tsc_adjust'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='spec-ctrl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='stibp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='arch-capabilities'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='cmp_legacy'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='topoext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='virt-ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='lbrv'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='tsc-scale'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='vmcb-clean'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='pause-filter'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='pfthreshold'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='svme-addr-chk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='rdctl-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='skip-l1dfl-vmentry'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='mds-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature name='pschange-mc-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <pages unit='KiB' size='4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <pages unit='KiB' size='2048'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <pages unit='KiB' size='1048576'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </cpu>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <power_management>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <suspend_mem/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <suspend_disk/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <suspend_hybrid/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </power_management>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <iommu support='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <migration_features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <live/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <uri_transports>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <uri_transport>tcp</uri_transport>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <uri_transport>rdma</uri_transport>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </uri_transports>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </migration_features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <topology>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <cells num='1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <cell id='0'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:          <memory unit='KiB'>7864104</memory>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:          <pages unit='KiB' size='4'>1966026</pages>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:          <pages unit='KiB' size='2048'>0</pages>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:          <pages unit='KiB' size='1048576'>0</pages>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:          <distances>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:            <sibling id='0' value='10'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:          </distances>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:          <cpus num='8'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:          </cpus>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        </cell>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </cells>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </topology>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <cache>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </cache>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <secmodel>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model>selinux</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <doi>0</doi>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </secmodel>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <secmodel>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model>dac</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <doi>0</doi>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </secmodel>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </host>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <guest>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <os_type>hvm</os_type>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <arch name='i686'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <wordsize>32</wordsize>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <domain type='qemu'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <domain type='kvm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </arch>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <pae/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <nonpae/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <acpi default='on' toggle='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <apic default='on' toggle='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <cpuselection/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <deviceboot/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <disksnapshot default='on' toggle='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <externalSnapshot/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </guest>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <guest>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <os_type>hvm</os_type>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <arch name='x86_64'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <wordsize>64</wordsize>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <domain type='qemu'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <domain type='kvm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </arch>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <acpi default='on' toggle='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <apic default='on' toggle='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <cpuselection/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <deviceboot/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <disksnapshot default='on' toggle='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <externalSnapshot/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </guest>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 
Oct  2 07:53:59 np0005466012 nova_compute[191120]: </capabilities>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: #033[00m
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.429 2 DEBUG nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.451 2 DEBUG nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct  2 07:53:59 np0005466012 nova_compute[191120]: <domainCapabilities>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <domain>kvm</domain>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <arch>i686</arch>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <vcpu max='4096'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <iothreads supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <os supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <enum name='firmware'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <loader supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>rom</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>pflash</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='readonly'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>yes</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>no</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='secure'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>no</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </loader>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </os>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <cpu>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>on</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>off</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='maximumMigratable'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>on</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>off</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <vendor>AMD</vendor>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='succor'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='custom' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cooperlake'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='GraniteRapids'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10-128'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10-256'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10-512'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='KnightsMill'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SierraForest'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='athlon'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='athlon-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='core2duo'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='core2duo-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='coreduo'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='coreduo-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='n270'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='n270-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='phenom'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='phenom-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </cpu>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <memoryBacking supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <enum name='sourceType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>file</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>anonymous</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>memfd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </memoryBacking>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <devices>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <disk supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='diskDevice'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>disk</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>cdrom</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>floppy</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>lun</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='bus'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>fdc</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>scsi</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>usb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>sata</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </disk>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <graphics supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vnc</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>egl-headless</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>dbus</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </graphics>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <video supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='modelType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vga</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>cirrus</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>none</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>bochs</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>ramfb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </video>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <hostdev supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='mode'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>subsystem</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='startupPolicy'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>default</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>mandatory</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>requisite</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>optional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='subsysType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>usb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>pci</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>scsi</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='capsType'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='pciBackend'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </hostdev>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <rng supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>random</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>egd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>builtin</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </rng>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <filesystem supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='driverType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>path</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>handle</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtiofs</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </filesystem>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <tpm supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>tpm-tis</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>tpm-crb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>emulator</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>external</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendVersion'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>2.0</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </tpm>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <redirdev supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='bus'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>usb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </redirdev>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <channel supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>pty</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>unix</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </channel>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <crypto supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>qemu</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>builtin</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </crypto>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <interface supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>default</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>passt</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </interface>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <panic supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>isa</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>hyperv</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </panic>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </devices>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <gic supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <genid supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <backup supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <async-teardown supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <ps2 supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <sev supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <sgx supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <hyperv supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='features'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>relaxed</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vapic</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>spinlocks</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vpindex</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>runtime</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>synic</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>stimer</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>reset</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vendor_id</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>frequencies</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>reenlightenment</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>tlbflush</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>ipi</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>avic</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>emsr_bitmap</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>xmm_input</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </hyperv>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <launchSecurity supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: </domainCapabilities>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.456 2 DEBUG nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct  2 07:53:59 np0005466012 nova_compute[191120]: <domainCapabilities>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <domain>kvm</domain>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <arch>i686</arch>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <vcpu max='240'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <iothreads supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <os supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <enum name='firmware'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <loader supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>rom</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>pflash</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='readonly'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>yes</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>no</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='secure'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>no</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </loader>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </os>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <cpu>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>on</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>off</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='maximumMigratable'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>on</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>off</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <vendor>AMD</vendor>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='succor'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='custom' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cooperlake'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='GraniteRapids'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10-128'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10-256'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10-512'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='KnightsMill'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SierraForest'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='athlon'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='athlon-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='core2duo'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='core2duo-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='coreduo'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='coreduo-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='n270'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='n270-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='phenom'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='phenom-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </cpu>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <memoryBacking supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <enum name='sourceType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>file</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>anonymous</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>memfd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </memoryBacking>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <devices>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <disk supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='diskDevice'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>disk</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>cdrom</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>floppy</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>lun</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='bus'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>ide</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>fdc</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>scsi</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>usb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>sata</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </disk>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <graphics supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vnc</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>egl-headless</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>dbus</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </graphics>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <video supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='modelType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vga</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>cirrus</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>none</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>bochs</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>ramfb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </video>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <hostdev supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='mode'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>subsystem</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='startupPolicy'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>default</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>mandatory</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>requisite</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>optional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='subsysType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>usb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>pci</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>scsi</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='capsType'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='pciBackend'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </hostdev>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <rng supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>random</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>egd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>builtin</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </rng>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <filesystem supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='driverType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>path</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>handle</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtiofs</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </filesystem>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <tpm supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>tpm-tis</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>tpm-crb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>emulator</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>external</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendVersion'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>2.0</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </tpm>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <redirdev supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='bus'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>usb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </redirdev>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <channel supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>pty</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>unix</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </channel>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <crypto supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>qemu</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>builtin</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </crypto>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <interface supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>default</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>passt</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </interface>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <panic supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>isa</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>hyperv</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </panic>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </devices>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <gic supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <genid supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <backup supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <async-teardown supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <ps2 supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <sev supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <sgx supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <hyperv supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='features'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>relaxed</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vapic</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>spinlocks</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vpindex</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>runtime</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>synic</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>stimer</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>reset</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vendor_id</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>frequencies</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>reenlightenment</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>tlbflush</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>ipi</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>avic</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>emsr_bitmap</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>xmm_input</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </hyperv>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <launchSecurity supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: </domainCapabilities>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.488 2 DEBUG nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.493 2 DEBUG nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct  2 07:53:59 np0005466012 nova_compute[191120]: <domainCapabilities>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <domain>kvm</domain>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <arch>x86_64</arch>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <vcpu max='4096'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <iothreads supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <os supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <enum name='firmware'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>efi</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <loader supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>rom</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>pflash</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='readonly'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>yes</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>no</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='secure'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>yes</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>no</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </loader>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </os>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <cpu>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>on</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>off</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='maximumMigratable'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>on</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>off</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <vendor>AMD</vendor>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='succor'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='custom' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cooperlake'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='GraniteRapids'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10-128'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10-256'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10-512'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='KnightsMill'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SierraForest'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='athlon'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='athlon-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='core2duo'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='core2duo-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='coreduo'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='coreduo-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='n270'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='n270-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='phenom'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='phenom-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </cpu>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <memoryBacking supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <enum name='sourceType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>file</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>anonymous</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>memfd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </memoryBacking>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <devices>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <disk supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='diskDevice'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>disk</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>cdrom</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>floppy</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>lun</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='bus'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>fdc</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>scsi</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>usb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>sata</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </disk>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <graphics supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vnc</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>egl-headless</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>dbus</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </graphics>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <video supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='modelType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vga</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>cirrus</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>none</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>bochs</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>ramfb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </video>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <hostdev supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='mode'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>subsystem</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='startupPolicy'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>default</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>mandatory</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>requisite</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>optional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='subsysType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>usb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>pci</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>scsi</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='capsType'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='pciBackend'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </hostdev>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <rng supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>random</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>egd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>builtin</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </rng>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <filesystem supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='driverType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>path</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>handle</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtiofs</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </filesystem>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <tpm supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>tpm-tis</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>tpm-crb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>emulator</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>external</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendVersion'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>2.0</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </tpm>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <redirdev supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='bus'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>usb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </redirdev>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <channel supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>pty</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>unix</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </channel>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <crypto supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>qemu</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>builtin</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </crypto>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <interface supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>default</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>passt</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </interface>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <panic supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>isa</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>hyperv</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </panic>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </devices>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <gic supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <genid supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <backup supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <async-teardown supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <ps2 supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <sev supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <sgx supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <hyperv supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='features'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>relaxed</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vapic</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>spinlocks</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vpindex</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>runtime</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>synic</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>stimer</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>reset</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vendor_id</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>frequencies</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>reenlightenment</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>tlbflush</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>ipi</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>avic</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>emsr_bitmap</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>xmm_input</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </hyperv>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <launchSecurity supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: </domainCapabilities>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.554 2 DEBUG nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct  2 07:53:59 np0005466012 nova_compute[191120]: <domainCapabilities>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <domain>kvm</domain>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <arch>x86_64</arch>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <vcpu max='240'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <iothreads supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <os supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <enum name='firmware'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <loader supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>rom</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>pflash</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='readonly'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>yes</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>no</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='secure'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>no</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </loader>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </os>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <cpu>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>on</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>off</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='maximum' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='maximumMigratable'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>on</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>off</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='host-model' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <vendor>AMD</vendor>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='x2apic'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='stibp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='succor'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='ibrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='lbrv'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='mds-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='gds-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <mode name='custom' supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Broadwell-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cooperlake'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cooperlake-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Cooperlake-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Denverton-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Dhyana-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Genoa'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='auto-ibrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Milan'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amd-psfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='no-nested-data-bp'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='null-sel-clr-base'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='stibp-always-on'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='EPYC-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='GraniteRapids'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10-128'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10-256'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx10-512'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='prefetchiti'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Haswell-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='IvyBridge-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='KnightsMill'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='KnightsMill-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4fmaps'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-4vnniw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512er'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512pf'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fma4'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tbm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xop'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='amx-tile'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-bf16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-fp16'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bitalg'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vbmi2'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrc'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fzrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='la57'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='taa-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='tsx-ldtrk'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xfd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SierraForest'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='SierraForest-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ifma'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-ne-convert'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx-vnni-int8'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='bus-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cmpccxadd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fbsdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='fsrs'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ibrs-all'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mcdt-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pbrsb-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='psdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='serialize'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vaes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='vpclmulqdq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='hle'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='rtm'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512bw'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512cd'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512dq'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512f'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='avx512vl'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='invpcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pcid'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='pku'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='mpx'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v2'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v3'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='core-capability'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='split-lock-detect'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='Snowridge-v4'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='cldemote'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='erms'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='gfni'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdir64b'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='movdiri'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='xsaves'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='athlon'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='athlon-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='core2duo'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='core2duo-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='coreduo'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='coreduo-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='n270'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='n270-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='ss'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='phenom'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <blockers model='phenom-v1'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnow'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <feature name='3dnowext'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </blockers>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </mode>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </cpu>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <memoryBacking supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <enum name='sourceType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>file</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>anonymous</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <value>memfd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </memoryBacking>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <devices>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <disk supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='diskDevice'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>disk</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>cdrom</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>floppy</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>lun</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='bus'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>ide</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>fdc</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>scsi</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>usb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>sata</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </disk>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <graphics supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vnc</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>egl-headless</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>dbus</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </graphics>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <video supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='modelType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vga</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>cirrus</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>none</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>bochs</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>ramfb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </video>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <hostdev supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='mode'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>subsystem</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='startupPolicy'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>default</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>mandatory</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>requisite</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>optional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='subsysType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>usb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>pci</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>scsi</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='capsType'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='pciBackend'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </hostdev>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <rng supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtio-non-transitional</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>random</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>egd</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>builtin</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </rng>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <filesystem supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='driverType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>path</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>handle</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>virtiofs</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </filesystem>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <tpm supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>tpm-tis</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>tpm-crb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>emulator</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>external</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendVersion'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>2.0</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </tpm>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <redirdev supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='bus'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>usb</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </redirdev>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <channel supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>pty</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>unix</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </channel>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <crypto supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='type'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>qemu</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendModel'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>builtin</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </crypto>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <interface supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='backendType'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>default</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>passt</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </interface>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <panic supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='model'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>isa</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>hyperv</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </panic>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </devices>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <gic supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <vmcoreinfo supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <genid supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <backingStoreInput supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <backup supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <async-teardown supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <ps2 supported='yes'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <sev supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <sgx supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <hyperv supported='yes'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      <enum name='features'>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>relaxed</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vapic</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>spinlocks</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vpindex</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>runtime</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>synic</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>stimer</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>reset</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>vendor_id</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>frequencies</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>reenlightenment</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>tlbflush</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>ipi</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>avic</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>emsr_bitmap</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:        <value>xmm_input</value>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:      </enum>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    </hyperv>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:    <launchSecurity supported='no'/>
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  </features>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: </domainCapabilities>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.609 2 DEBUG nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.610 2 INFO nova.virt.libvirt.host [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Secure Boot support detected#033[00m
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.612 2 INFO nova.virt.libvirt.driver [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.612 2 INFO nova.virt.libvirt.driver [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.627 2 DEBUG nova.virt.libvirt.driver [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] cpu compare xml: <cpu match="exact">
Oct  2 07:53:59 np0005466012 nova_compute[191120]:  <model>Nehalem</model>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: </cpu>
Oct  2 07:53:59 np0005466012 nova_compute[191120]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.632 2 DEBUG nova.virt.libvirt.driver [None req-9c59075b-b32d-456d-bdab-897c1e76c1d5 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct  2 07:53:59 np0005466012 python3.9[192001]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:53:59 np0005466012 systemd[1]: Stopping nova_compute container...
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.887 2 DEBUG oslo_concurrency.lockutils [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.888 2 DEBUG oslo_concurrency.lockutils [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:53:59 np0005466012 nova_compute[191120]: 2025-10-02 11:53:59.888 2 DEBUG oslo_concurrency.lockutils [None req-28a882ad-ee49-48b1-b9db-22ca7e73220f - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:54:00 np0005466012 virtqemud[191783]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct  2 07:54:00 np0005466012 virtqemud[191783]: hostname: compute-1
Oct  2 07:54:00 np0005466012 virtqemud[191783]: End of file while reading data: Input/output error
Oct  2 07:54:00 np0005466012 systemd[1]: libpod-79d7261bc642b1d250fcfb7e93a6c598370b3b815ed0ab4728b185c0c058d356.scope: Deactivated successfully.
Oct  2 07:54:00 np0005466012 systemd[1]: libpod-79d7261bc642b1d250fcfb7e93a6c598370b3b815ed0ab4728b185c0c058d356.scope: Consumed 3.149s CPU time.
Oct  2 07:54:00 np0005466012 podman[192005]: 2025-10-02 11:54:00.400821874 +0000 UTC m=+0.557573998 container died 79d7261bc642b1d250fcfb7e93a6c598370b3b815ed0ab4728b185c0c058d356 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.license=GPLv2)
Oct  2 07:54:00 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79d7261bc642b1d250fcfb7e93a6c598370b3b815ed0ab4728b185c0c058d356-userdata-shm.mount: Deactivated successfully.
Oct  2 07:54:00 np0005466012 systemd[1]: var-lib-containers-storage-overlay-4f9b11f4e86b9dda160a5c815dcf1aef6f5517c91e2b71b2f32c2bfe07361e06-merged.mount: Deactivated successfully.
Oct  2 07:54:00 np0005466012 podman[192005]: 2025-10-02 11:54:00.450304675 +0000 UTC m=+0.607056799 container cleanup 79d7261bc642b1d250fcfb7e93a6c598370b3b815ed0ab4728b185c0c058d356 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3)
Oct  2 07:54:00 np0005466012 podman[192005]: nova_compute
Oct  2 07:54:00 np0005466012 podman[192035]: nova_compute
Oct  2 07:54:00 np0005466012 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct  2 07:54:00 np0005466012 systemd[1]: Stopped nova_compute container.
Oct  2 07:54:00 np0005466012 systemd[1]: Starting nova_compute container...
Oct  2 07:54:00 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:54:00 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9b11f4e86b9dda160a5c815dcf1aef6f5517c91e2b71b2f32c2bfe07361e06/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:00 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9b11f4e86b9dda160a5c815dcf1aef6f5517c91e2b71b2f32c2bfe07361e06/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:00 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9b11f4e86b9dda160a5c815dcf1aef6f5517c91e2b71b2f32c2bfe07361e06/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:00 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9b11f4e86b9dda160a5c815dcf1aef6f5517c91e2b71b2f32c2bfe07361e06/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:00 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9b11f4e86b9dda160a5c815dcf1aef6f5517c91e2b71b2f32c2bfe07361e06/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:00 np0005466012 podman[192048]: 2025-10-02 11:54:00.617249091 +0000 UTC m=+0.076477062 container init 79d7261bc642b1d250fcfb7e93a6c598370b3b815ed0ab4728b185c0c058d356 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3)
Oct  2 07:54:00 np0005466012 podman[192048]: 2025-10-02 11:54:00.626185323 +0000 UTC m=+0.085413294 container start 79d7261bc642b1d250fcfb7e93a6c598370b3b815ed0ab4728b185c0c058d356 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute)
Oct  2 07:54:00 np0005466012 podman[192048]: nova_compute
Oct  2 07:54:00 np0005466012 nova_compute[192063]: + sudo -E kolla_set_configs
Oct  2 07:54:00 np0005466012 systemd[1]: Started nova_compute container.
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Validating config file
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Copying service configuration files
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Deleting /etc/ceph
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Creating directory /etc/ceph
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Setting permission for /etc/ceph
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Writing out command to execute
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:54:00 np0005466012 nova_compute[192063]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  2 07:54:00 np0005466012 nova_compute[192063]: ++ cat /run_command
Oct  2 07:54:00 np0005466012 nova_compute[192063]: + CMD=nova-compute
Oct  2 07:54:00 np0005466012 nova_compute[192063]: + ARGS=
Oct  2 07:54:00 np0005466012 nova_compute[192063]: + sudo kolla_copy_cacerts
Oct  2 07:54:00 np0005466012 nova_compute[192063]: Running command: 'nova-compute'
Oct  2 07:54:00 np0005466012 nova_compute[192063]: + [[ ! -n '' ]]
Oct  2 07:54:00 np0005466012 nova_compute[192063]: + . kolla_extend_start
Oct  2 07:54:00 np0005466012 nova_compute[192063]: + echo 'Running command: '\''nova-compute'\'''
Oct  2 07:54:00 np0005466012 nova_compute[192063]: + umask 0022
Oct  2 07:54:00 np0005466012 nova_compute[192063]: + exec nova-compute
Oct  2 07:54:01 np0005466012 python3.9[192226]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  2 07:54:01 np0005466012 systemd[1]: Started libpod-conmon-ed15976631a4e1e1255e0530f3fd5bec46b4af68aefe1ced91e7734517d1892f.scope.
Oct  2 07:54:01 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:54:01 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa7ff24eb3a064a832179ee01e3807dd5728704bf8052b0794acc61e542bc924/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:01 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa7ff24eb3a064a832179ee01e3807dd5728704bf8052b0794acc61e542bc924/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:01 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa7ff24eb3a064a832179ee01e3807dd5728704bf8052b0794acc61e542bc924/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  2 07:54:01 np0005466012 podman[192252]: 2025-10-02 11:54:01.88001648 +0000 UTC m=+0.133892721 container init ed15976631a4e1e1255e0530f3fd5bec46b4af68aefe1ced91e7734517d1892f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:54:01 np0005466012 podman[192252]: 2025-10-02 11:54:01.892491053 +0000 UTC m=+0.146367284 container start ed15976631a4e1e1255e0530f3fd5bec46b4af68aefe1ced91e7734517d1892f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 07:54:01 np0005466012 python3.9[192226]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Applying nova statedir ownership
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct  2 07:54:01 np0005466012 nova_compute_init[192274]: INFO:nova_statedir:Nova statedir ownership complete
Oct  2 07:54:01 np0005466012 systemd[1]: libpod-ed15976631a4e1e1255e0530f3fd5bec46b4af68aefe1ced91e7734517d1892f.scope: Deactivated successfully.
Oct  2 07:54:02 np0005466012 podman[192288]: 2025-10-02 11:54:02.010077309 +0000 UTC m=+0.030953413 container died ed15976631a4e1e1255e0530f3fd5bec46b4af68aefe1ced91e7734517d1892f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Oct  2 07:54:02 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ed15976631a4e1e1255e0530f3fd5bec46b4af68aefe1ced91e7734517d1892f-userdata-shm.mount: Deactivated successfully.
Oct  2 07:54:02 np0005466012 systemd[1]: var-lib-containers-storage-overlay-fa7ff24eb3a064a832179ee01e3807dd5728704bf8052b0794acc61e542bc924-merged.mount: Deactivated successfully.
Oct  2 07:54:02 np0005466012 podman[192288]: 2025-10-02 11:54:02.047211831 +0000 UTC m=+0.068087915 container cleanup ed15976631a4e1e1255e0530f3fd5bec46b4af68aefe1ced91e7734517d1892f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3)
Oct  2 07:54:02 np0005466012 systemd[1]: libpod-conmon-ed15976631a4e1e1255e0530f3fd5bec46b4af68aefe1ced91e7734517d1892f.scope: Deactivated successfully.
Oct  2 07:54:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:54:02.103 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:54:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:54:02.104 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:54:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:54:02.105 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:54:02 np0005466012 nova_compute[192063]: 2025-10-02 11:54:02.658 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:54:02 np0005466012 nova_compute[192063]: 2025-10-02 11:54:02.658 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:54:02 np0005466012 nova_compute[192063]: 2025-10-02 11:54:02.658 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  2 07:54:02 np0005466012 nova_compute[192063]: 2025-10-02 11:54:02.659 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  2 07:54:02 np0005466012 nova_compute[192063]: 2025-10-02 11:54:02.782 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 07:54:02 np0005466012 nova_compute[192063]: 2025-10-02 11:54:02.812 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 07:54:02 np0005466012 systemd[1]: session-25.scope: Deactivated successfully.
Oct  2 07:54:02 np0005466012 systemd[1]: session-25.scope: Consumed 2min 15.644s CPU time.
Oct  2 07:54:02 np0005466012 systemd-logind[827]: Session 25 logged out. Waiting for processes to exit.
Oct  2 07:54:02 np0005466012 systemd-logind[827]: Removed session 25.
Oct  2 07:54:03 np0005466012 nova_compute[192063]: 2025-10-02 11:54:03.847 2 INFO nova.virt.driver [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct  2 07:54:03 np0005466012 nova_compute[192063]: 2025-10-02 11:54:03.938 2 INFO nova.compute.provider_config [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.310 2 DEBUG oslo_concurrency.lockutils [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.310 2 DEBUG oslo_concurrency.lockutils [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.311 2 DEBUG oslo_concurrency.lockutils [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.311 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.311 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.312 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.312 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.312 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.313 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.313 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.313 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.313 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.314 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.314 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.314 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.315 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.315 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.315 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.316 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.316 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.316 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.316 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.317 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.317 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.317 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.317 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.318 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.318 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.318 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.318 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.319 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.319 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.319 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.319 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.320 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.320 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.320 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.321 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.321 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.321 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.321 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.322 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.322 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.322 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.322 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.322 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.323 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.323 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.323 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.323 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.323 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.324 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.324 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.324 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.324 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.324 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.325 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.325 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.325 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.325 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.325 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.326 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.326 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.326 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.326 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.326 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.326 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.327 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.327 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.327 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.327 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.327 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.328 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.328 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.328 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.328 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.328 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.329 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.329 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.329 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.329 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.329 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.329 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.330 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.330 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.330 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.330 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.330 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.331 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.331 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.331 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.331 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.331 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.332 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.332 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.332 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.332 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.332 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.333 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.333 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.333 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.333 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.333 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.333 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.334 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.334 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.334 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.334 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.334 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.335 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.335 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.335 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.335 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.335 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.336 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.336 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.336 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.336 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.336 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.336 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.337 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.337 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.337 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.337 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.337 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.338 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.338 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.338 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.338 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.338 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.339 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.339 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.339 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.339 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.339 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.339 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.340 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.340 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.340 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.340 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.340 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.341 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.341 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.341 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.341 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.341 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.342 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.342 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.342 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.342 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.342 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.343 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.343 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.343 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.343 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.343 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.343 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.344 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.344 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.344 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.344 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.345 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.345 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.345 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.345 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.345 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.346 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.346 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.346 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.346 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.346 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.347 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.347 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.347 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.347 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.347 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.347 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.348 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.348 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.348 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.348 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.349 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.349 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.349 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.349 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.349 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.349 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.350 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.350 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.350 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.350 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.350 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.351 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.351 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.351 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.351 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.351 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.352 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.352 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.352 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.352 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.352 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.353 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.353 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.353 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.353 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.353 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.353 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.354 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.354 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.354 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.354 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.354 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.355 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.355 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.355 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.355 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.355 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.356 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.356 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.356 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.356 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.356 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.356 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.357 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.357 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.357 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.357 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.357 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.358 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.358 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.358 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.358 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.358 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.359 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.359 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.359 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.359 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.359 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.359 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.360 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.360 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.360 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.360 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.361 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.361 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.361 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.361 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.361 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.361 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.362 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.362 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.362 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.362 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.362 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.363 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.363 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.363 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.363 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.363 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.364 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.364 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.364 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.364 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.364 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.364 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.365 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.365 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.365 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.365 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.366 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.366 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.366 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.366 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.366 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.367 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.367 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.367 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.367 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.367 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.367 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.368 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.368 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.368 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.368 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.369 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.369 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.369 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.369 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.369 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.370 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.370 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.370 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.370 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.370 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.370 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.371 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.371 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.371 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.371 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.371 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.372 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.372 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.372 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.372 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.372 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.373 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.373 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.373 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.373 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.373 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.374 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.374 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.374 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.374 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.374 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.374 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.375 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.375 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.375 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.375 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.376 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.376 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.376 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.376 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.376 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.377 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.377 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.377 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.377 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.377 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.378 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.378 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.378 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.378 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.378 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.379 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.379 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.379 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.379 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.379 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.380 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.380 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.380 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.380 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.380 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.381 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.381 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.381 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.381 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.381 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.381 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.382 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.382 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.382 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.382 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.382 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.383 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.383 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.383 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.383 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.384 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.384 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.384 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.384 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.384 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.385 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.385 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.385 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.385 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.385 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.386 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.386 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.386 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.386 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.386 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.387 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.388 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.388 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.388 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.388 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.388 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.389 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.389 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.389 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.389 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.390 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.390 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.390 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.390 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.390 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.391 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.391 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.391 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.391 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.391 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.392 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.392 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.392 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.392 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.393 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.393 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.393 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.393 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.393 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.393 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.394 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.394 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.394 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.394 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.394 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.395 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.395 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.395 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.395 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.395 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.396 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.396 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.396 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.396 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.396 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.397 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.397 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.397 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.397 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.397 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.398 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.398 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.398 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.398 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.399 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.400 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.400 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.400 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.400 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.401 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.401 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.401 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.401 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.401 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.401 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.402 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.402 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.402 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.402 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.402 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.402 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.403 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.403 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.403 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.403 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.403 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.403 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.404 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.404 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.404 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.404 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.404 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.404 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.404 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.405 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.406 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.406 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.406 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.406 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.406 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.406 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.406 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.407 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.407 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.407 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.407 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.407 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.407 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.407 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.408 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.409 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.409 2 WARNING oslo_config.cfg [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  2 07:54:05 np0005466012 nova_compute[192063]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  2 07:54:05 np0005466012 nova_compute[192063]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  2 07:54:05 np0005466012 nova_compute[192063]: and ``live_migration_inbound_addr`` respectively.
Oct  2 07:54:05 np0005466012 nova_compute[192063]: ).  Its value may be silently ignored in the future.#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.409 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.409 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.409 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.410 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.410 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.410 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.410 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.410 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.410 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.410 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.411 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.412 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.412 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.412 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.412 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.412 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.412 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.413 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.414 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.414 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.414 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.414 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.414 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.414 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.415 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.415 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.415 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.415 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.415 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.416 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.416 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.416 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.416 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.416 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.416 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.417 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.418 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.418 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.418 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.418 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.418 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.418 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.419 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.420 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.420 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.420 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.420 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.420 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.421 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.421 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.421 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.421 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.421 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.422 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.422 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.422 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.422 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.422 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.422 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.423 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.423 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.423 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.423 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.423 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.424 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.424 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.424 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.424 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.424 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.425 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.425 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.425 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.425 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.425 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.425 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.426 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.426 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.426 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.426 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.427 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.427 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.427 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.427 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.427 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.427 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.427 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.428 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.428 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.428 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.428 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.428 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.428 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.429 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.429 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.429 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.429 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.429 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.429 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.429 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.430 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.430 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.430 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.430 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.430 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.430 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.431 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.431 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.431 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.431 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.431 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.431 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.431 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.432 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.432 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.432 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.432 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.432 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.433 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.433 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.433 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.433 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.433 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.433 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.434 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.434 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.434 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.434 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.434 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.435 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.435 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.435 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.435 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.435 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.436 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.437 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.437 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.437 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.437 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.437 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.437 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.438 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.438 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.438 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.438 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.438 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.438 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.438 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.439 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.439 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.439 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.439 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.439 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.439 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.440 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.440 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.440 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.440 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.440 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.440 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.441 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.441 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.441 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.441 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.441 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.441 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.442 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.442 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.442 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.442 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.443 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.444 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.444 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.444 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.444 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.444 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.444 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.445 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.446 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.447 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.448 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.449 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.450 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.450 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.450 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.450 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.450 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.450 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.451 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.451 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.451 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.451 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.451 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.451 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.452 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.452 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.452 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.452 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.452 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.452 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.453 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.454 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.455 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.456 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.456 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.456 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.456 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.456 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.457 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.457 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.457 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.457 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.458 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.458 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.458 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.458 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.458 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.459 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.459 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.459 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.459 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.459 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.459 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.460 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.460 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.460 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.460 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.460 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.460 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.461 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.462 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.462 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.462 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.462 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.462 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.462 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.463 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.463 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.463 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.463 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.463 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.463 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.464 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.464 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.464 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.464 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.464 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.464 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.465 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.465 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.465 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.465 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.466 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.466 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.466 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.466 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.466 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.466 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.466 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.467 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.467 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.467 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.467 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.467 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.467 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.467 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.468 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.468 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.468 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.468 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.468 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.468 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.468 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.469 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.469 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.469 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.469 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.469 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.469 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.470 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.470 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.470 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.470 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.470 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.470 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.470 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.471 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.471 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.471 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.471 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.471 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.471 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.471 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.472 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.472 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.472 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.472 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.472 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.473 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.473 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.473 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.473 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.473 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.473 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.473 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.474 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.474 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.474 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.474 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.474 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.474 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.475 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.475 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.475 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.475 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.475 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.475 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.475 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.476 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.476 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.476 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.476 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.476 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.476 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.477 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.477 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.477 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.477 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.477 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.477 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.477 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.478 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.478 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.478 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.478 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.478 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.478 2 DEBUG oslo_service.service [None req-9c80f0b6-c9f8-4077-9018-682b4ec53816 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.479 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.507 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.508 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.508 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.508 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.520 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f4c1653af40> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.523 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f4c1653af40> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.524 2 INFO nova.virt.libvirt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Connection event '1' reason 'None'
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.531 2 INFO nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Libvirt host capabilities <capabilities>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <host>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <uuid>ebf0c2d8-5045-4169-abec-cc6f6092e35d</uuid>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <cpu>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <arch>x86_64</arch>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model>EPYC-Rome-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <vendor>AMD</vendor>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <microcode version='16777317'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <signature family='23' model='49' stepping='0'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <maxphysaddr mode='emulate' bits='40'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='x2apic'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='tsc-deadline'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='osxsave'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='hypervisor'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='tsc_adjust'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='spec-ctrl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='stibp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='arch-capabilities'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='cmp_legacy'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='topoext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='virt-ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='lbrv'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='tsc-scale'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='vmcb-clean'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='pause-filter'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='pfthreshold'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='svme-addr-chk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='rdctl-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='skip-l1dfl-vmentry'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='mds-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature name='pschange-mc-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <pages unit='KiB' size='4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <pages unit='KiB' size='2048'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <pages unit='KiB' size='1048576'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </cpu>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <power_management>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <suspend_mem/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <suspend_disk/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <suspend_hybrid/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </power_management>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <iommu support='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <migration_features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <live/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <uri_transports>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <uri_transport>tcp</uri_transport>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <uri_transport>rdma</uri_transport>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </uri_transports>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </migration_features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <topology>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <cells num='1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <cell id='0'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:          <memory unit='KiB'>7864104</memory>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:          <pages unit='KiB' size='4'>1966026</pages>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:          <pages unit='KiB' size='2048'>0</pages>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:          <pages unit='KiB' size='1048576'>0</pages>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:          <distances>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:            <sibling id='0' value='10'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:          </distances>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:          <cpus num='8'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:          </cpus>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        </cell>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </cells>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </topology>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <cache>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </cache>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <secmodel>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model>selinux</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <doi>0</doi>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </secmodel>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <secmodel>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model>dac</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <doi>0</doi>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </secmodel>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </host>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <guest>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <os_type>hvm</os_type>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <arch name='i686'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <wordsize>32</wordsize>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <domain type='qemu'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <domain type='kvm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </arch>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <pae/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <nonpae/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <acpi default='on' toggle='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <apic default='on' toggle='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <cpuselection/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <deviceboot/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <disksnapshot default='on' toggle='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <externalSnapshot/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </guest>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <guest>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <os_type>hvm</os_type>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <arch name='x86_64'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <wordsize>64</wordsize>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <domain type='qemu'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <domain type='kvm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </arch>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <acpi default='on' toggle='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <apic default='on' toggle='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <cpuselection/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <deviceboot/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <disksnapshot default='on' toggle='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <externalSnapshot/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </guest>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 
Oct  2 07:54:05 np0005466012 nova_compute[192063]: </capabilities>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.537 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.541 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct  2 07:54:05 np0005466012 nova_compute[192063]: <domainCapabilities>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <domain>kvm</domain>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <arch>i686</arch>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <vcpu max='240'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <iothreads supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <os supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <enum name='firmware'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <loader supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>rom</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>pflash</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='readonly'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>yes</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>no</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='secure'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>no</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </loader>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </os>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <cpu>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>on</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>off</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='maximum' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='maximumMigratable'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>on</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>off</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='host-model' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <vendor>AMD</vendor>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='x2apic'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='stibp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='succor'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='ibrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='lbrv'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='mds-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='gds-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='custom' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cooperlake'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cooperlake-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cooperlake-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Dhyana-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Genoa'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Milan'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='GraniteRapids'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10-128'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10-256'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10-512'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='KnightsMill'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='KnightsMill-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SierraForest'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SierraForest-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='athlon'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='athlon-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='core2duo'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='core2duo-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='coreduo'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='coreduo-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='n270'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='n270-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='phenom'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='phenom-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <memoryBacking supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <enum name='sourceType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>file</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>anonymous</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>memfd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </memoryBacking>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <devices>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <disk supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='diskDevice'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>disk</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>cdrom</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>floppy</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>lun</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='bus'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>ide</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>fdc</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>scsi</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>usb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>sata</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </disk>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <graphics supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vnc</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>egl-headless</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>dbus</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <video supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='modelType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vga</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>cirrus</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>none</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>bochs</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>ramfb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </video>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <hostdev supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='mode'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>subsystem</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='startupPolicy'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>default</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>mandatory</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>requisite</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>optional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='subsysType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>usb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>pci</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>scsi</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='capsType'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='pciBackend'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </hostdev>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <rng supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>random</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>egd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>builtin</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </rng>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <filesystem supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='driverType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>path</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>handle</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtiofs</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </filesystem>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <tpm supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>tpm-tis</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>tpm-crb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>emulator</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>external</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendVersion'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>2.0</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </tpm>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <redirdev supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='bus'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>usb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </redirdev>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <channel supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>pty</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>unix</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </channel>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <crypto supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>qemu</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>builtin</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </crypto>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <interface supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>default</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>passt</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </interface>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <panic supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>isa</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>hyperv</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </panic>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </devices>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <gic supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <vmcoreinfo supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <genid supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <backingStoreInput supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <backup supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <async-teardown supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <ps2 supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <sev supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <sgx supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <hyperv supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='features'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>relaxed</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vapic</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>spinlocks</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vpindex</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>runtime</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>synic</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>stimer</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>reset</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vendor_id</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>frequencies</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>reenlightenment</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>tlbflush</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>ipi</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>avic</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>emsr_bitmap</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>xmm_input</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </hyperv>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <launchSecurity supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: </domainCapabilities>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.547 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct  2 07:54:05 np0005466012 nova_compute[192063]: <domainCapabilities>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <domain>kvm</domain>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <arch>i686</arch>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <vcpu max='4096'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <iothreads supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <os supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <enum name='firmware'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <loader supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>rom</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>pflash</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='readonly'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>yes</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>no</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='secure'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>no</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </loader>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </os>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <cpu>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>on</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>off</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='maximum' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='maximumMigratable'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>on</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>off</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='host-model' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <vendor>AMD</vendor>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='x2apic'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='stibp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='succor'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='ibrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='lbrv'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='mds-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='gds-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='custom' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cooperlake'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cooperlake-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cooperlake-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Dhyana-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Genoa'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Milan'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='GraniteRapids'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10-128'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10-256'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10-512'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='KnightsMill'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='KnightsMill-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SierraForest'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SierraForest-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='athlon'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='athlon-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='core2duo'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='core2duo-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='coreduo'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='coreduo-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='n270'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='n270-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='phenom'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='phenom-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <memoryBacking supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <enum name='sourceType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>file</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>anonymous</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>memfd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </memoryBacking>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <devices>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <disk supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='diskDevice'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>disk</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>cdrom</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>floppy</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>lun</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='bus'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>fdc</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>scsi</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>usb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>sata</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </disk>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <graphics supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vnc</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>egl-headless</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>dbus</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <video supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='modelType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vga</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>cirrus</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>none</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>bochs</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>ramfb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </video>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <hostdev supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='mode'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>subsystem</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='startupPolicy'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>default</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>mandatory</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>requisite</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>optional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='subsysType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>usb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>pci</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>scsi</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='capsType'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='pciBackend'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </hostdev>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <rng supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>random</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>egd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>builtin</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </rng>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <filesystem supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='driverType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>path</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>handle</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtiofs</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </filesystem>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <tpm supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>tpm-tis</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>tpm-crb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>emulator</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>external</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendVersion'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>2.0</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </tpm>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <redirdev supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='bus'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>usb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </redirdev>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <channel supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>pty</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>unix</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </channel>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <crypto supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>qemu</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>builtin</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </crypto>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <interface supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>default</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>passt</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </interface>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <panic supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>isa</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>hyperv</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </panic>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </devices>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <gic supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <vmcoreinfo supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <genid supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <backingStoreInput supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <backup supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <async-teardown supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <ps2 supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <sev supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <sgx supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <hyperv supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='features'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>relaxed</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vapic</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>spinlocks</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vpindex</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>runtime</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>synic</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>stimer</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>reset</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vendor_id</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>frequencies</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>reenlightenment</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>tlbflush</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>ipi</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>avic</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>emsr_bitmap</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>xmm_input</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </hyperv>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <launchSecurity supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: </domainCapabilities>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.637 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.640 2 WARNING nova.virt.libvirt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.641 2 DEBUG nova.virt.libvirt.volume.mount [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.648 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct  2 07:54:05 np0005466012 nova_compute[192063]: <domainCapabilities>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <domain>kvm</domain>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <arch>x86_64</arch>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <vcpu max='240'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <iothreads supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <os supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <enum name='firmware'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <loader supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>rom</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>pflash</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='readonly'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>yes</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>no</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='secure'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>no</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </loader>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </os>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <cpu>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>on</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>off</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='maximum' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='maximumMigratable'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>on</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>off</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='host-model' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <vendor>AMD</vendor>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='x2apic'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='stibp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='succor'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='ibrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='lbrv'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='mds-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='gds-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='custom' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cooperlake'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cooperlake-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cooperlake-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Dhyana-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Genoa'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Milan'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='GraniteRapids'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10-128'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10-256'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10-512'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='KnightsMill'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='KnightsMill-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SierraForest'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SierraForest-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='athlon'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='athlon-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='core2duo'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='core2duo-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='coreduo'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='coreduo-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='n270'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='n270-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='phenom'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='phenom-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <memoryBacking supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <enum name='sourceType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>file</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>anonymous</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>memfd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </memoryBacking>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <devices>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <disk supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='diskDevice'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>disk</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>cdrom</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>floppy</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>lun</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='bus'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>ide</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>fdc</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>scsi</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>usb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>sata</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </disk>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <graphics supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vnc</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>egl-headless</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>dbus</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <video supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='modelType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vga</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>cirrus</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>none</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>bochs</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>ramfb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </video>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <hostdev supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='mode'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>subsystem</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='startupPolicy'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>default</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>mandatory</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>requisite</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>optional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='subsysType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>usb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>pci</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>scsi</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='capsType'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='pciBackend'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </hostdev>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <rng supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>random</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>egd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>builtin</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </rng>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <filesystem supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='driverType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>path</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>handle</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtiofs</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </filesystem>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <tpm supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>tpm-tis</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>tpm-crb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>emulator</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>external</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendVersion'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>2.0</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </tpm>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <redirdev supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='bus'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>usb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </redirdev>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <channel supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>pty</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>unix</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </channel>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <crypto supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>qemu</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>builtin</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </crypto>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <interface supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>default</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>passt</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </interface>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <panic supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>isa</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>hyperv</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </panic>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </devices>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <gic supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <vmcoreinfo supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <genid supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <backingStoreInput supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <backup supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <async-teardown supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <ps2 supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <sev supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <sgx supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <hyperv supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='features'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>relaxed</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vapic</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>spinlocks</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vpindex</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>runtime</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>synic</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>stimer</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>reset</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vendor_id</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>frequencies</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>reenlightenment</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>tlbflush</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>ipi</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>avic</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>emsr_bitmap</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>xmm_input</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </hyperv>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <launchSecurity supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: </domainCapabilities>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.655 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct  2 07:54:05 np0005466012 nova_compute[192063]: <domainCapabilities>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <path>/usr/libexec/qemu-kvm</path>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <domain>kvm</domain>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <arch>x86_64</arch>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <vcpu max='4096'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <iothreads supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <os supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <enum name='firmware'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>efi</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <loader supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>rom</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>pflash</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='readonly'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>yes</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>no</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='secure'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>yes</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>no</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </loader>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </os>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <cpu>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='host-passthrough' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='hostPassthroughMigratable'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>on</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>off</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='maximum' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='maximumMigratable'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>on</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>off</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='host-model' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <vendor>AMD</vendor>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='x2apic'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='tsc-deadline'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='hypervisor'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='tsc_adjust'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='spec-ctrl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='stibp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='arch-capabilities'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='cmp_legacy'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='overflow-recov'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='succor'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='ibrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='amd-ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='virt-ssbd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='lbrv'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='tsc-scale'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='vmcb-clean'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='flushbyasid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='pause-filter'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='pfthreshold'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='svme-addr-chk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='rdctl-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='mds-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='pschange-mc-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='gds-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='require' name='rfds-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <feature policy='disable' name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <mode name='custom' supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Broadwell-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cascadelake-Server-v5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cooperlake'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cooperlake-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Cooperlake-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Denverton-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Dhyana-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Genoa'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Genoa-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='auto-ibrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Milan'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Milan-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Milan-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amd-psfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='no-nested-data-bp'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='null-sel-clr-base'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='stibp-always-on'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-Rome-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='EPYC-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='GraniteRapids'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='GraniteRapids-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='GraniteRapids-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10-128'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10-256'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx10-512'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='prefetchiti'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Haswell-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-noTSX'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v6'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Icelake-Server-v7'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='IvyBridge-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='KnightsMill'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='KnightsMill-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4fmaps'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-4vnniw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512er'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512pf'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G4-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Opteron_G5-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fma4'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tbm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xop'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SapphireRapids-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='amx-tile'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-bf16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-fp16'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512-vpopcntdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bitalg'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vbmi2'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrc'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fzrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='la57'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='taa-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='tsx-ldtrk'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xfd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SierraForest'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='SierraForest-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ifma'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-ne-convert'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx-vnni-int8'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='bus-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cmpccxadd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fbsdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='fsrs'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ibrs-all'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mcdt-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pbrsb-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='psdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='sbdr-ssdp-no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='serialize'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vaes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='vpclmulqdq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Client-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='hle'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='rtm'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Skylake-Server-v5'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512bw'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512cd'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512dq'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512f'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='avx512vl'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='invpcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pcid'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='pku'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='mpx'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v2'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v3'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='core-capability'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='split-lock-detect'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='Snowridge-v4'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='cldemote'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='erms'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='gfni'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdir64b'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='movdiri'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='xsaves'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='athlon'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='athlon-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='core2duo'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='core2duo-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='coreduo'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='coreduo-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='n270'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='n270-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='ss'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='phenom'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <blockers model='phenom-v1'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnow'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <feature name='3dnowext'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </blockers>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </mode>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <memoryBacking supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <enum name='sourceType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>file</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>anonymous</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <value>memfd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </memoryBacking>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <devices>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <disk supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='diskDevice'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>disk</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>cdrom</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>floppy</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>lun</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='bus'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>fdc</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>scsi</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>usb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>sata</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </disk>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <graphics supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vnc</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>egl-headless</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>dbus</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <video supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='modelType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vga</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>cirrus</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>none</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>bochs</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>ramfb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </video>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <hostdev supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='mode'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>subsystem</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='startupPolicy'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>default</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>mandatory</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>requisite</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>optional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='subsysType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>usb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>pci</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>scsi</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='capsType'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='pciBackend'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </hostdev>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <rng supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtio-non-transitional</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>random</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>egd</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>builtin</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </rng>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <filesystem supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='driverType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>path</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>handle</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>virtiofs</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </filesystem>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <tpm supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>tpm-tis</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>tpm-crb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>emulator</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>external</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendVersion'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>2.0</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </tpm>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <redirdev supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='bus'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>usb</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </redirdev>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <channel supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>pty</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>unix</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </channel>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <crypto supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='type'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>qemu</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendModel'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>builtin</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </crypto>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <interface supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='backendType'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>default</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>passt</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </interface>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <panic supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='model'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>isa</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>hyperv</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </panic>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </devices>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <gic supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <vmcoreinfo supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <genid supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <backingStoreInput supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <backup supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <async-teardown supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <ps2 supported='yes'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <sev supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <sgx supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <hyperv supported='yes'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      <enum name='features'>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>relaxed</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vapic</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>spinlocks</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vpindex</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>runtime</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>synic</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>stimer</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>reset</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>vendor_id</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>frequencies</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>reenlightenment</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>tlbflush</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>ipi</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>avic</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>emsr_bitmap</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:        <value>xmm_input</value>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:      </enum>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    </hyperv>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:    <launchSecurity supported='no'/>
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  </features>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: </domainCapabilities>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.703 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.704 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.704 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.704 2 INFO nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Secure Boot support detected#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.706 2 INFO nova.virt.libvirt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.706 2 INFO nova.virt.libvirt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.717 2 DEBUG nova.virt.libvirt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] cpu compare xml: <cpu match="exact">
Oct  2 07:54:05 np0005466012 nova_compute[192063]:  <model>Nehalem</model>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: </cpu>
Oct  2 07:54:05 np0005466012 nova_compute[192063]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.720 2 DEBUG nova.virt.libvirt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.779 2 INFO nova.virt.node [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Determined node identity ddb6f967-9a8a-4554-9b44-b99536054f9c from /var/lib/nova/compute_id#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.818 2 WARNING nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Compute nodes ['ddb6f967-9a8a-4554-9b44-b99536054f9c'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Oct  2 07:54:05 np0005466012 nova_compute[192063]: 2025-10-02 11:54:05.893 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Oct  2 07:54:06 np0005466012 podman[192364]: 2025-10-02 11:54:06.168051191 +0000 UTC m=+0.085496655 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001)
Oct  2 07:54:06 np0005466012 nova_compute[192063]: 2025-10-02 11:54:06.715 2 WARNING nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Oct  2 07:54:06 np0005466012 nova_compute[192063]: 2025-10-02 11:54:06.716 2 DEBUG oslo_concurrency.lockutils [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:54:06 np0005466012 nova_compute[192063]: 2025-10-02 11:54:06.716 2 DEBUG oslo_concurrency.lockutils [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:54:06 np0005466012 nova_compute[192063]: 2025-10-02 11:54:06.717 2 DEBUG oslo_concurrency.lockutils [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:54:06 np0005466012 nova_compute[192063]: 2025-10-02 11:54:06.717 2 DEBUG nova.compute.resource_tracker [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:54:06 np0005466012 systemd[1]: Starting libvirt nodedev daemon...
Oct  2 07:54:06 np0005466012 systemd[1]: Started libvirt nodedev daemon.
Oct  2 07:54:07 np0005466012 nova_compute[192063]: 2025-10-02 11:54:07.001 2 WARNING nova.virt.libvirt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:54:07 np0005466012 nova_compute[192063]: 2025-10-02 11:54:07.001 2 DEBUG nova.compute.resource_tracker [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6226MB free_disk=73.66924667358398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:54:07 np0005466012 nova_compute[192063]: 2025-10-02 11:54:07.002 2 DEBUG oslo_concurrency.lockutils [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:54:07 np0005466012 nova_compute[192063]: 2025-10-02 11:54:07.002 2 DEBUG oslo_concurrency.lockutils [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:54:08 np0005466012 nova_compute[192063]: 2025-10-02 11:54:08.747 2 WARNING nova.compute.resource_tracker [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] No compute node record for compute-1.ctlplane.example.com:ddb6f967-9a8a-4554-9b44-b99536054f9c: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host ddb6f967-9a8a-4554-9b44-b99536054f9c could not be found.#033[00m
Oct  2 07:54:09 np0005466012 nova_compute[192063]: 2025-10-02 11:54:09.029 2 INFO nova.compute.resource_tracker [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: ddb6f967-9a8a-4554-9b44-b99536054f9c#033[00m
Oct  2 07:54:09 np0005466012 nova_compute[192063]: 2025-10-02 11:54:09.224 2 DEBUG nova.compute.resource_tracker [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:54:09 np0005466012 nova_compute[192063]: 2025-10-02 11:54:09.224 2 DEBUG nova.compute.resource_tracker [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:54:09 np0005466012 nova_compute[192063]: 2025-10-02 11:54:09.823 2 INFO nova.scheduler.client.report [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [req-db6fc4e0-7a3e-4318-9ea9-af83c908d2eb] Created resource provider record via placement API for resource provider with UUID ddb6f967-9a8a-4554-9b44-b99536054f9c and name compute-1.ctlplane.example.com.#033[00m
Oct  2 07:54:09 np0005466012 nova_compute[192063]: 2025-10-02 11:54:09.852 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct  2 07:54:09 np0005466012 nova_compute[192063]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Oct  2 07:54:09 np0005466012 nova_compute[192063]: 2025-10-02 11:54:09.853 2 INFO nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] kernel doesn't support AMD SEV#033[00m
Oct  2 07:54:09 np0005466012 nova_compute[192063]: 2025-10-02 11:54:09.853 2 DEBUG nova.compute.provider_tree [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 07:54:09 np0005466012 nova_compute[192063]: 2025-10-02 11:54:09.854 2 DEBUG nova.virt.libvirt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 07:54:09 np0005466012 nova_compute[192063]: 2025-10-02 11:54:09.856 2 DEBUG nova.virt.libvirt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Libvirt baseline CPU <cpu>
Oct  2 07:54:09 np0005466012 nova_compute[192063]:  <arch>x86_64</arch>
Oct  2 07:54:09 np0005466012 nova_compute[192063]:  <model>Nehalem</model>
Oct  2 07:54:09 np0005466012 nova_compute[192063]:  <vendor>AMD</vendor>
Oct  2 07:54:09 np0005466012 nova_compute[192063]:  <topology sockets="8" cores="1" threads="1"/>
Oct  2 07:54:09 np0005466012 nova_compute[192063]: </cpu>
Oct  2 07:54:09 np0005466012 nova_compute[192063]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Oct  2 07:54:09 np0005466012 nova_compute[192063]: 2025-10-02 11:54:09.997 2 DEBUG nova.scheduler.client.report [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Updated inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  2 07:54:09 np0005466012 nova_compute[192063]: 2025-10-02 11:54:09.997 2 DEBUG nova.compute.provider_tree [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Updating resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  2 07:54:09 np0005466012 nova_compute[192063]: 2025-10-02 11:54:09.998 2 DEBUG nova.compute.provider_tree [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 07:54:10 np0005466012 podman[192415]: 2025-10-02 11:54:10.128756933 +0000 UTC m=+0.048118267 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 07:54:10 np0005466012 nova_compute[192063]: 2025-10-02 11:54:10.172 2 DEBUG nova.compute.provider_tree [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Updating resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  2 07:54:10 np0005466012 nova_compute[192063]: 2025-10-02 11:54:10.224 2 DEBUG nova.compute.resource_tracker [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:54:10 np0005466012 nova_compute[192063]: 2025-10-02 11:54:10.224 2 DEBUG oslo_concurrency.lockutils [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:54:10 np0005466012 nova_compute[192063]: 2025-10-02 11:54:10.224 2 DEBUG nova.service [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Oct  2 07:54:10 np0005466012 nova_compute[192063]: 2025-10-02 11:54:10.364 2 DEBUG nova.service [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Oct  2 07:54:10 np0005466012 nova_compute[192063]: 2025-10-02 11:54:10.365 2 DEBUG nova.servicegroup.drivers.db [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Oct  2 07:54:10 np0005466012 systemd-logind[827]: New session 28 of user zuul.
Oct  2 07:54:10 np0005466012 systemd[1]: Started Session 28 of User zuul.
Oct  2 07:54:12 np0005466012 python3.9[192587]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  2 07:54:13 np0005466012 python3.9[192743]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:54:13 np0005466012 systemd[1]: Reloading.
Oct  2 07:54:14 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:54:14 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:54:15 np0005466012 python3.9[192928]: ansible-ansible.builtin.service_facts Invoked
Oct  2 07:54:15 np0005466012 network[192945]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  2 07:54:15 np0005466012 network[192946]: 'network-scripts' will be removed from distribution in near future.
Oct  2 07:54:15 np0005466012 network[192947]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  2 07:54:19 np0005466012 podman[193196]: 2025-10-02 11:54:19.009037131 +0000 UTC m=+0.069680387 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:54:19 np0005466012 python3.9[193243]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:54:20 np0005466012 python3.9[193399]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:20 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:54:20 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 07:54:21 np0005466012 python3.9[193552]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:22 np0005466012 python3.9[193704]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:54:23 np0005466012 python3.9[193856]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:54:24 np0005466012 python3.9[194008]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:54:24 np0005466012 systemd[1]: Reloading.
Oct  2 07:54:24 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:54:24 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:54:25 np0005466012 python3.9[194195]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:54:25 np0005466012 podman[194197]: 2025-10-02 11:54:25.380677152 +0000 UTC m=+0.085874057 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 07:54:26 np0005466012 python3.9[194369]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:54:26 np0005466012 python3.9[194519]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:54:27 np0005466012 python3.9[194671]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:28 np0005466012 python3.9[194792]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406067.163052-364-19010757993588/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:54:29 np0005466012 python3.9[194944]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Oct  2 07:54:30 np0005466012 python3.9[195096]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Oct  2 07:54:31 np0005466012 python3.9[195249]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  2 07:54:31 np0005466012 nova_compute[192063]: 2025-10-02 11:54:31.366 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:31 np0005466012 nova_compute[192063]: 2025-10-02 11:54:31.388 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:54:32 np0005466012 python3.9[195407]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  2 07:54:35 np0005466012 auditd[707]: Audit daemon rotating log files
Oct  2 07:54:35 np0005466012 python3.9[195565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:35 np0005466012 python3.9[195686]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759406074.918737-568-52121581513386/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:36 np0005466012 podman[195810]: 2025-10-02 11:54:36.347260719 +0000 UTC m=+0.075634576 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible)
Oct  2 07:54:36 np0005466012 python3.9[195851]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:37 np0005466012 python3.9[195983]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759406076.0649552-568-197672645814203/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:37 np0005466012 python3.9[196133]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:38 np0005466012 python3.9[196254]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759406077.2706275-568-177345959590885/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:39 np0005466012 python3.9[196404]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:54:39 np0005466012 python3.9[196556]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:54:40 np0005466012 podman[196682]: 2025-10-02 11:54:40.279067067 +0000 UTC m=+0.064419223 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:54:40 np0005466012 python3.9[196726]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:41 np0005466012 python3.9[196849]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406079.9365227-745-186076261157220/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:41 np0005466012 python3.9[196999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:42 np0005466012 python3.9[197075]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:42 np0005466012 python3.9[197225]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:43 np0005466012 python3.9[197346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406082.3711545-745-128276611023126/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:44 np0005466012 python3.9[197496]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:44 np0005466012 python3.9[197617]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406083.5640402-745-5242709772009/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:45 np0005466012 python3.9[197767]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:46 np0005466012 python3.9[197888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406084.7398534-745-164066337146069/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:46 np0005466012 python3.9[198038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:47 np0005466012 python3.9[198159]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406086.2189064-745-49196677037623/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:47 np0005466012 python3.9[198309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:48 np0005466012 python3.9[198430]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406087.4816873-745-189530959327773/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:49 np0005466012 podman[198580]: 2025-10-02 11:54:49.187654829 +0000 UTC m=+0.094263526 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd)
Oct  2 07:54:49 np0005466012 python3.9[198581]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:49 np0005466012 python3.9[198721]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406088.7575085-745-534449648452/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:50 np0005466012 python3.9[198871]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:51 np0005466012 python3.9[198992]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406089.9906478-745-145541659740627/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:51 np0005466012 python3.9[199142]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:52 np0005466012 python3.9[199263]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406091.193886-745-21641144741070/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:53 np0005466012 python3.9[199413]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:53 np0005466012 python3.9[199534]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406092.4948814-745-98870280409265/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:54 np0005466012 python3.9[199684]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:55 np0005466012 python3.9[199760]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:55 np0005466012 podman[199884]: 2025-10-02 11:54:55.761971409 +0000 UTC m=+0.056766311 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:54:55 np0005466012 python3.9[199923]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:56 np0005466012 python3.9[200006]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:57 np0005466012 python3.9[200156]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:54:57 np0005466012 python3.9[200232]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:58 np0005466012 python3.9[200384]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:59 np0005466012 python3.9[200536]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:54:59 np0005466012 python3.9[200688]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:55:00 np0005466012 python3.9[200840]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:55:00 np0005466012 systemd[1]: Reloading.
Oct  2 07:55:00 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:00 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:01 np0005466012 systemd[1]: Listening on Podman API Socket.
Oct  2 07:55:01 np0005466012 python3.9[201031]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:55:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:55:02.104 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:55:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:55:02.104 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:55:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:55:02.105 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:55:02 np0005466012 python3.9[201154]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406101.462693-1411-5229457305785/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:55:02 np0005466012 python3.9[201230]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.824 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.824 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.824 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.836 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.837 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.837 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.838 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.838 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.838 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.838 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.839 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.839 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.862 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.862 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.862 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:55:02 np0005466012 nova_compute[192063]: 2025-10-02 11:55:02.862 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:55:03 np0005466012 nova_compute[192063]: 2025-10-02 11:55:03.030 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:55:03 np0005466012 nova_compute[192063]: 2025-10-02 11:55:03.031 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6120MB free_disk=73.66997146606445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:55:03 np0005466012 nova_compute[192063]: 2025-10-02 11:55:03.032 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:55:03 np0005466012 nova_compute[192063]: 2025-10-02 11:55:03.032 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:55:03 np0005466012 nova_compute[192063]: 2025-10-02 11:55:03.104 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:55:03 np0005466012 nova_compute[192063]: 2025-10-02 11:55:03.104 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:55:03 np0005466012 nova_compute[192063]: 2025-10-02 11:55:03.176 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:55:03 np0005466012 nova_compute[192063]: 2025-10-02 11:55:03.195 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:55:03 np0005466012 nova_compute[192063]: 2025-10-02 11:55:03.199 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:55:03 np0005466012 nova_compute[192063]: 2025-10-02 11:55:03.200 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:55:03 np0005466012 python3.9[201353]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406101.462693-1411-5229457305785/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:55:04 np0005466012 python3.9[201505]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Oct  2 07:55:05 np0005466012 python3.9[201657]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:55:06 np0005466012 podman[201781]: 2025-10-02 11:55:06.720528125 +0000 UTC m=+0.084958687 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:55:06 np0005466012 python3[201828]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:55:07 np0005466012 podman[201871]: 2025-10-02 11:55:07.122875555 +0000 UTC m=+0.024078664 image pull 5f0622bc7c13827171d93b3baf72157e23d24d44579ad79fe3a89ad88180a4bb quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Oct  2 07:55:07 np0005466012 podman[201871]: 2025-10-02 11:55:07.646805857 +0000 UTC m=+0.548008916 container create 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 07:55:07 np0005466012 python3[201828]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Oct  2 07:55:08 np0005466012 python3.9[202061]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:55:09 np0005466012 python3.9[202215]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:10 np0005466012 python3.9[202366]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759406109.3357043-1603-162476883272179/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:10 np0005466012 podman[202414]: 2025-10-02 11:55:10.638578591 +0000 UTC m=+0.050252474 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:55:10 np0005466012 python3.9[202462]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:55:10 np0005466012 systemd[1]: Reloading.
Oct  2 07:55:11 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:11 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:11 np0005466012 python3.9[202574]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:55:11 np0005466012 systemd[1]: Reloading.
Oct  2 07:55:12 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:12 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:12 np0005466012 systemd[1]: Starting ceilometer_agent_compute container...
Oct  2 07:55:12 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:55:12 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12f7cd1e7f688b28b9d1551dbf9e595d87dd3af3310f4a01f3cc1016a02d063e/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:12 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12f7cd1e7f688b28b9d1551dbf9e595d87dd3af3310f4a01f3cc1016a02d063e/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:12 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12f7cd1e7f688b28b9d1551dbf9e595d87dd3af3310f4a01f3cc1016a02d063e/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:12 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12f7cd1e7f688b28b9d1551dbf9e595d87dd3af3310f4a01f3cc1016a02d063e/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:12 np0005466012 systemd[1]: Started /usr/bin/podman healthcheck run 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0.
Oct  2 07:55:13 np0005466012 podman[202614]: 2025-10-02 11:55:13.093108856 +0000 UTC m=+0.569691485 container init 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: + sudo -E kolla_set_configs
Oct  2 07:55:13 np0005466012 podman[202614]: 2025-10-02 11:55:13.118266732 +0000 UTC m=+0.594849341 container start 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible)
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: sudo: unable to send audit message: Operation not permitted
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Validating config file
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Copying service configuration files
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: INFO:__main__:Writing out command to execute
Oct  2 07:55:13 np0005466012 podman[202614]: ceilometer_agent_compute
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: ++ cat /run_command
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: + ARGS=
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: + sudo kolla_copy_cacerts
Oct  2 07:55:13 np0005466012 systemd[1]: Started ceilometer_agent_compute container.
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: sudo: unable to send audit message: Operation not permitted
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: + [[ ! -n '' ]]
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: + . kolla_extend_start
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: + umask 0022
Oct  2 07:55:13 np0005466012 ceilometer_agent_compute[202629]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct  2 07:55:13 np0005466012 podman[202635]: 2025-10-02 11:55:13.262362911 +0000 UTC m=+0.137038690 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 07:55:13 np0005466012 systemd[1]: 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0-38671f9def629e19.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:55:13 np0005466012 systemd[1]: 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0-38671f9def629e19.service: Failed with result 'exit-code'.
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.102 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.103 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.104 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.104 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.105 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.105 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.105 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.105 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.106 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.106 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.106 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.106 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.107 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.107 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.107 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.107 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.108 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.108 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.108 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.109 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.109 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.109 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.109 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.110 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.110 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.110 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.110 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.110 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.111 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.111 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.111 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.111 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.111 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.112 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.112 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.112 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.112 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.112 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.113 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.113 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.113 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.113 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.114 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.114 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.114 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.114 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.115 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.115 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.115 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.115 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.115 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.116 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.116 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.116 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.116 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.117 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.117 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.117 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.117 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.117 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.118 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.118 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.118 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.118 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.118 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.119 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.119 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.119 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.119 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.120 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.120 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.120 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.120 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.120 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.121 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.121 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.121 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.121 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.121 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.122 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.122 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.122 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.122 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.122 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.123 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.123 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.123 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.123 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.123 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.124 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.124 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.124 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.124 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.124 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.125 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.125 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.125 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.125 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.126 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.126 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.126 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.126 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.127 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.127 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.127 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.128 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.128 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.128 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.128 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.129 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.129 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.129 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.129 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.130 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.130 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.130 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.130 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.131 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.131 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.131 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.131 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.132 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.132 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.132 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.133 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.133 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.133 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.134 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.134 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.134 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.134 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.135 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.135 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.135 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.136 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.136 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.136 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.137 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.137 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.137 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.137 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.137 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.137 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.138 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.138 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.138 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.138 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.138 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.139 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.139 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.139 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.139 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.139 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.139 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.140 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.140 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.140 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.140 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.140 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.141 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.141 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.141 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.141 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.141 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.141 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.162 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.164 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.166 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct  2 07:55:14 np0005466012 python3.9[202811]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.284 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  2 07:55:14 np0005466012 systemd[1]: Stopping ceilometer_agent_compute container...
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.366 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.368 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.368 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.368 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.368 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.368 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.368 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.369 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.369 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.369 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.369 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.369 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.369 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.369 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.369 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.370 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.370 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.370 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.370 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.370 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.370 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.370 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.370 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.370 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.371 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.371 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.371 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.371 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.371 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.371 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.371 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.371 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.371 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.371 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.371 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.371 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.371 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.372 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.372 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.372 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.372 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.372 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.372 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.372 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.372 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.372 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.372 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.373 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.373 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.373 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.373 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.373 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.373 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.373 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.373 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.373 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.373 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.374 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.374 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.374 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.374 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.374 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.374 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.374 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.374 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.374 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.374 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.375 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.375 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.375 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.375 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.375 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.375 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.375 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.375 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.375 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.375 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.375 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.376 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.376 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.376 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.376 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.376 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.376 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.376 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.376 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.376 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.376 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.377 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.377 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.377 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.377 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.377 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.377 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.377 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.377 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.377 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.377 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.378 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.378 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.378 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.378 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.378 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.378 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.378 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.378 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.378 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.378 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.378 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.379 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.379 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.379 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.379 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.379 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.379 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.379 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.379 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.379 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.379 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.380 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.380 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.380 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.380 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.380 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.380 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.380 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.380 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.380 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.380 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.380 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.381 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.381 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.381 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.381 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.381 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.381 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.381 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.381 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.381 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.381 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.381 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.381 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.382 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.382 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.382 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.382 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.382 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.382 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.382 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.382 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.382 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.382 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.382 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.382 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.383 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.383 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.383 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.383 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.383 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.383 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.383 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.383 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.383 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.383 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.383 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.384 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.384 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.384 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.384 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.384 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.384 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.384 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.384 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.384 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.384 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.384 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.385 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.385 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.385 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.385 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.385 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.385 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.387 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.387 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.387 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.387 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.387 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.387 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.387 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.387 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.387 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.387 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.387 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.387 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.388 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.388 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.388 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.388 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.388 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.388 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.388 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.388 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.391 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.395 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.430 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.531 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.532 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.533 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Oct  2 07:55:14 np0005466012 virtqemud[191783]: End of file while reading data: Input/output error
Oct  2 07:55:14 np0005466012 ceilometer_agent_compute[202629]: 2025-10-02 11:55:14.551 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Oct  2 07:55:14 np0005466012 virtqemud[191783]: End of file while reading data: Input/output error
Oct  2 07:55:14 np0005466012 systemd[1]: libpod-6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0.scope: Deactivated successfully.
Oct  2 07:55:14 np0005466012 systemd[1]: libpod-6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0.scope: Consumed 1.473s CPU time.
Oct  2 07:55:14 np0005466012 podman[202818]: 2025-10-02 11:55:14.719550671 +0000 UTC m=+0.397365679 container died 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Oct  2 07:55:14 np0005466012 systemd[1]: 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0-38671f9def629e19.timer: Deactivated successfully.
Oct  2 07:55:14 np0005466012 systemd[1]: Stopped /usr/bin/podman healthcheck run 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0.
Oct  2 07:55:15 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0-userdata-shm.mount: Deactivated successfully.
Oct  2 07:55:15 np0005466012 systemd[1]: var-lib-containers-storage-overlay-12f7cd1e7f688b28b9d1551dbf9e595d87dd3af3310f4a01f3cc1016a02d063e-merged.mount: Deactivated successfully.
Oct  2 07:55:15 np0005466012 podman[202818]: 2025-10-02 11:55:15.249247453 +0000 UTC m=+0.927062501 container cleanup 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm)
Oct  2 07:55:15 np0005466012 podman[202818]: ceilometer_agent_compute
Oct  2 07:55:15 np0005466012 podman[202850]: ceilometer_agent_compute
Oct  2 07:55:15 np0005466012 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Oct  2 07:55:15 np0005466012 systemd[1]: Stopped ceilometer_agent_compute container.
Oct  2 07:55:15 np0005466012 systemd[1]: Starting ceilometer_agent_compute container...
Oct  2 07:55:15 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:55:15 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12f7cd1e7f688b28b9d1551dbf9e595d87dd3af3310f4a01f3cc1016a02d063e/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:15 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12f7cd1e7f688b28b9d1551dbf9e595d87dd3af3310f4a01f3cc1016a02d063e/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:15 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12f7cd1e7f688b28b9d1551dbf9e595d87dd3af3310f4a01f3cc1016a02d063e/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:15 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12f7cd1e7f688b28b9d1551dbf9e595d87dd3af3310f4a01f3cc1016a02d063e/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:15 np0005466012 systemd[1]: Started /usr/bin/podman healthcheck run 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0.
Oct  2 07:55:15 np0005466012 podman[202863]: 2025-10-02 11:55:15.753382448 +0000 UTC m=+0.397921472 container init 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: + sudo -E kolla_set_configs
Oct  2 07:55:15 np0005466012 podman[202863]: 2025-10-02 11:55:15.781283867 +0000 UTC m=+0.425822861 container start 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: sudo: unable to send audit message: Operation not permitted
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Validating config file
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Copying service configuration files
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: INFO:__main__:Writing out command to execute
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: ++ cat /run_command
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: + ARGS=
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: + sudo kolla_copy_cacerts
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: sudo: unable to send audit message: Operation not permitted
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: + [[ ! -n '' ]]
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: + . kolla_extend_start
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: + umask 0022
Oct  2 07:55:15 np0005466012 ceilometer_agent_compute[202878]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct  2 07:55:15 np0005466012 podman[202863]: ceilometer_agent_compute
Oct  2 07:55:15 np0005466012 systemd[1]: Started ceilometer_agent_compute container.
Oct  2 07:55:16 np0005466012 podman[202886]: 2025-10-02 11:55:16.043345015 +0000 UTC m=+0.249319152 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, container_name=ceilometer_agent_compute, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Oct  2 07:55:16 np0005466012 systemd[1]: 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0-2891faf48a2811de.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:55:16 np0005466012 systemd[1]: 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0-2891faf48a2811de.service: Failed with result 'exit-code'.
Oct  2 07:55:16 np0005466012 python3.9[203062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.722 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.722 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.722 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.722 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.723 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.723 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.723 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.723 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.723 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.723 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.723 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.723 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.724 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.724 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.724 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.724 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.724 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.724 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.724 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.724 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.724 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.724 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.724 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.725 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.725 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.725 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.725 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.725 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.725 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.725 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.725 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.725 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.725 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.725 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.725 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.725 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.726 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.726 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.726 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.726 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.726 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.726 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.726 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.726 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.726 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.726 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.726 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.727 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.727 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.727 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.727 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.727 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.727 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.727 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.727 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.727 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.727 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.727 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.727 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.728 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.728 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.728 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.728 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.728 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.728 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.728 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.728 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.728 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.728 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.728 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.729 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.729 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.729 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.729 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.729 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.729 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.729 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.729 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.729 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.729 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.729 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.729 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.730 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.730 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.730 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.730 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.730 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.730 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.730 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.730 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.730 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.730 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.730 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.731 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.731 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.731 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.731 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.731 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.731 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.731 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.731 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.731 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.731 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.731 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.731 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.732 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.732 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.732 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.732 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.732 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.732 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.732 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.732 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.732 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.732 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.732 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.733 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.733 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.733 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.733 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.733 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.733 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.733 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.733 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.733 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.733 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.733 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.734 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.734 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.734 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.734 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.734 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.734 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.734 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.734 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.734 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.734 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.734 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.735 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.735 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.735 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.735 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.735 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.735 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.735 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.735 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.735 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.735 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.735 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.735 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.736 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.736 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.736 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.736 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.736 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.736 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.736 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.736 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.736 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.736 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.736 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.736 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.737 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.737 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.737 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.754 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.755 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.756 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.767 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.885 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.885 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.885 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.885 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.886 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.886 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.886 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.886 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.886 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.886 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.886 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.886 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.887 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.887 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.887 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.887 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.887 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.887 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.887 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.887 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.888 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.888 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.888 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.888 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.888 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.888 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.888 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.888 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.888 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.888 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.888 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.889 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.889 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.889 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.889 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.889 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.889 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.889 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.889 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.889 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.889 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.890 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.890 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.890 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.890 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.890 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.890 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.890 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.890 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.890 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.890 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.890 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.891 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.891 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.891 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.891 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.891 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.891 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.891 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.891 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.891 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.891 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.891 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.892 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.892 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.892 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.892 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.892 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.892 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.892 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.892 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.892 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.892 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.893 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.893 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.893 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.893 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.893 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.893 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.893 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.893 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.893 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.894 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.894 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.894 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.894 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.894 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.894 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.894 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.894 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.894 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.894 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.895 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.895 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.895 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.895 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.895 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.895 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.895 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.895 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.895 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.895 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.895 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.896 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.896 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.896 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.896 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.896 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.896 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.896 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.896 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.896 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.896 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.897 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.897 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.897 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.897 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.897 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.897 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.897 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.897 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.897 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.897 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.897 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.898 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.898 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.898 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.898 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.898 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.898 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.898 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.898 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.898 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.899 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.899 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.899 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.899 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.899 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.899 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.899 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.899 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.899 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.900 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.900 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.900 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.900 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.900 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.900 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.900 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.900 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.900 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.901 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.901 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.901 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.901 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.901 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.901 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.901 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.901 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.901 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.901 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.901 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.902 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.902 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.902 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.902 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.902 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.902 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.902 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.902 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.902 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.902 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.903 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.903 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.903 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.903 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.903 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.903 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.903 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.903 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.903 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.904 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.904 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.904 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.904 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.904 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.904 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.904 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.904 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.904 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.904 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.904 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.905 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.905 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.905 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.905 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.905 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.905 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.905 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.905 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.905 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.905 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.905 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.906 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.906 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.906 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.906 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.906 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.906 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.906 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.906 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.906 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.906 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.906 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.906 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.907 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.907 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.907 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.907 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.908 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.913 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:55:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:55:17 np0005466012 python3.9[203191]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406116.2298133-1699-161281699765650/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:55:18 np0005466012 python3.9[203343]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Oct  2 07:55:19 np0005466012 python3.9[203495]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:55:20 np0005466012 podman[203595]: 2025-10-02 11:55:20.154465384 +0000 UTC m=+0.061451987 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 07:55:20 np0005466012 python3[203666]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:55:20 np0005466012 podman[203701]: 2025-10-02 11:55:20.6395722 +0000 UTC m=+0.023441750 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct  2 07:55:21 np0005466012 podman[203701]: 2025-10-02 11:55:21.437679592 +0000 UTC m=+0.821549122 container create a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 07:55:21 np0005466012 python3[203666]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Oct  2 07:55:22 np0005466012 python3.9[203891]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:55:23 np0005466012 python3.9[204045]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:23 np0005466012 python3.9[204196]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759406123.1386697-1858-228468053684794/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:24 np0005466012 python3.9[204272]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:55:24 np0005466012 systemd[1]: Reloading.
Oct  2 07:55:24 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:24 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:25 np0005466012 python3.9[204382]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:55:25 np0005466012 systemd[1]: Reloading.
Oct  2 07:55:25 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:25 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:25 np0005466012 systemd[1]: Starting node_exporter container...
Oct  2 07:55:26 np0005466012 podman[204420]: 2025-10-02 11:55:26.08141387 +0000 UTC m=+0.167367484 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:55:26 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:55:26 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee6a06a401e13b0115ae05c78dd2ef79fec613cd6a62533fe23d9630c129df4c/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:26 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee6a06a401e13b0115ae05c78dd2ef79fec613cd6a62533fe23d9630c129df4c/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:26 np0005466012 systemd[1]: Started /usr/bin/podman healthcheck run a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798.
Oct  2 07:55:26 np0005466012 podman[204422]: 2025-10-02 11:55:26.173650859 +0000 UTC m=+0.260202237 container init a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.186Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.187Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.187Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.187Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.187Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.187Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=arp
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=bcache
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=bonding
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=btrfs
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=conntrack
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=cpu
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=diskstats
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=edac
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=filefd
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=filesystem
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=infiniband
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=ipvs
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=loadavg
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=mdadm
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=meminfo
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=netclass
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=netdev
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=netstat
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=nfs
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=nfsd
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=nvme
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=schedstat
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=sockstat
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=softnet
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=systemd
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=tapestats
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=vmstat
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=xfs
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.188Z caller=node_exporter.go:117 level=info collector=zfs
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.189Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct  2 07:55:26 np0005466012 node_exporter[204455]: ts=2025-10-02T11:55:26.189Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct  2 07:55:26 np0005466012 podman[204422]: 2025-10-02 11:55:26.202298005 +0000 UTC m=+0.288849343 container start a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 07:55:26 np0005466012 podman[204422]: node_exporter
Oct  2 07:55:26 np0005466012 systemd[1]: Started node_exporter container.
Oct  2 07:55:26 np0005466012 podman[204465]: 2025-10-02 11:55:26.47264149 +0000 UTC m=+0.258249224 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 07:55:27 np0005466012 python3.9[204641]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:55:27 np0005466012 systemd[1]: Stopping node_exporter container...
Oct  2 07:55:27 np0005466012 systemd[1]: libpod-a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798.scope: Deactivated successfully.
Oct  2 07:55:27 np0005466012 podman[204645]: 2025-10-02 11:55:27.288525163 +0000 UTC m=+0.041204110 container died a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 07:55:27 np0005466012 systemd[1]: a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798-14e3f5c2f5409b3f.timer: Deactivated successfully.
Oct  2 07:55:27 np0005466012 systemd[1]: Stopped /usr/bin/podman healthcheck run a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798.
Oct  2 07:55:27 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798-userdata-shm.mount: Deactivated successfully.
Oct  2 07:55:27 np0005466012 systemd[1]: var-lib-containers-storage-overlay-ee6a06a401e13b0115ae05c78dd2ef79fec613cd6a62533fe23d9630c129df4c-merged.mount: Deactivated successfully.
Oct  2 07:55:27 np0005466012 podman[204645]: 2025-10-02 11:55:27.325765622 +0000 UTC m=+0.078444579 container cleanup a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 07:55:27 np0005466012 podman[204645]: node_exporter
Oct  2 07:55:27 np0005466012 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  2 07:55:27 np0005466012 podman[204672]: node_exporter
Oct  2 07:55:27 np0005466012 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Oct  2 07:55:27 np0005466012 systemd[1]: Stopped node_exporter container.
Oct  2 07:55:27 np0005466012 systemd[1]: Starting node_exporter container...
Oct  2 07:55:27 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:55:27 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee6a06a401e13b0115ae05c78dd2ef79fec613cd6a62533fe23d9630c129df4c/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:27 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee6a06a401e13b0115ae05c78dd2ef79fec613cd6a62533fe23d9630c129df4c/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:27 np0005466012 systemd[1]: Started /usr/bin/podman healthcheck run a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798.
Oct  2 07:55:27 np0005466012 podman[204685]: 2025-10-02 11:55:27.513206629 +0000 UTC m=+0.098566194 container init a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.525Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.525Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.525Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.525Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.525Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.525Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.525Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=arp
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=bcache
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=bonding
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=btrfs
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=conntrack
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=cpu
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=diskstats
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=edac
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=filefd
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=filesystem
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=infiniband
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=ipvs
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=loadavg
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=mdadm
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=meminfo
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=netclass
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=netdev
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=netstat
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=nfs
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=nfsd
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=nvme
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=schedstat
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=sockstat
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=softnet
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=systemd
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=tapestats
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=vmstat
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=xfs
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.526Z caller=node_exporter.go:117 level=info collector=zfs
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.527Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct  2 07:55:27 np0005466012 node_exporter[204700]: ts=2025-10-02T11:55:27.527Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct  2 07:55:27 np0005466012 podman[204685]: 2025-10-02 11:55:27.547778495 +0000 UTC m=+0.133138050 container start a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 07:55:27 np0005466012 podman[204685]: node_exporter
Oct  2 07:55:27 np0005466012 systemd[1]: Started node_exporter container.
Oct  2 07:55:27 np0005466012 podman[204709]: 2025-10-02 11:55:27.611550637 +0000 UTC m=+0.054549750 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 07:55:28 np0005466012 python3.9[204884]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:55:28 np0005466012 python3.9[205007]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406127.7982879-1954-22053400453742/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:55:29 np0005466012 python3.9[205159]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Oct  2 07:55:30 np0005466012 python3.9[205311]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:55:31 np0005466012 python3[205463]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:55:32 np0005466012 podman[205476]: 2025-10-02 11:55:32.964495753 +0000 UTC m=+1.220880373 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct  2 07:55:33 np0005466012 podman[205574]: 2025-10-02 11:55:33.17403664 +0000 UTC m=+0.093898142 container create cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 07:55:33 np0005466012 podman[205574]: 2025-10-02 11:55:33.103589287 +0000 UTC m=+0.023450809 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct  2 07:55:33 np0005466012 python3[205463]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Oct  2 07:55:34 np0005466012 python3.9[205764]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:55:34 np0005466012 python3.9[205918]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:35 np0005466012 python3.9[206069]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759406134.8086739-2113-69937810090242/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:36 np0005466012 python3.9[206145]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:55:36 np0005466012 systemd[1]: Reloading.
Oct  2 07:55:36 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:36 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:36 np0005466012 python3.9[206256]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:55:37 np0005466012 systemd[1]: Reloading.
Oct  2 07:55:37 np0005466012 podman[206258]: 2025-10-02 11:55:37.097743875 +0000 UTC m=+0.092940058 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Oct  2 07:55:37 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:37 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:37 np0005466012 systemd[1]: Starting podman_exporter container...
Oct  2 07:55:37 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:55:37 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8ea2503542718004d3603654447868b32b9c7b723342dde9fce41ddf5630b60/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:37 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8ea2503542718004d3603654447868b32b9c7b723342dde9fce41ddf5630b60/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:37 np0005466012 systemd[1]: Started /usr/bin/podman healthcheck run cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36.
Oct  2 07:55:37 np0005466012 podman[206321]: 2025-10-02 11:55:37.469395872 +0000 UTC m=+0.121529329 container init cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 07:55:37 np0005466012 podman_exporter[206336]: ts=2025-10-02T11:55:37.483Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct  2 07:55:37 np0005466012 podman_exporter[206336]: ts=2025-10-02T11:55:37.484Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct  2 07:55:37 np0005466012 podman_exporter[206336]: ts=2025-10-02T11:55:37.484Z caller=handler.go:94 level=info msg="enabled collectors"
Oct  2 07:55:37 np0005466012 podman_exporter[206336]: ts=2025-10-02T11:55:37.484Z caller=handler.go:105 level=info collector=container
Oct  2 07:55:37 np0005466012 podman[206321]: 2025-10-02 11:55:37.503321283 +0000 UTC m=+0.155454780 container start cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 07:55:37 np0005466012 podman[206321]: podman_exporter
Oct  2 07:55:37 np0005466012 systemd[1]: Starting Podman API Service...
Oct  2 07:55:37 np0005466012 systemd[1]: Started Podman API Service.
Oct  2 07:55:37 np0005466012 systemd[1]: Started podman_exporter container.
Oct  2 07:55:37 np0005466012 podman[206348]: time="2025-10-02T11:55:37Z" level=info msg="/usr/bin/podman filtering at log level info"
Oct  2 07:55:37 np0005466012 podman[206348]: time="2025-10-02T11:55:37Z" level=info msg="Setting parallel job count to 25"
Oct  2 07:55:37 np0005466012 podman[206348]: time="2025-10-02T11:55:37Z" level=info msg="Using sqlite as database backend"
Oct  2 07:55:37 np0005466012 podman[206348]: time="2025-10-02T11:55:37Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Oct  2 07:55:37 np0005466012 podman[206348]: time="2025-10-02T11:55:37Z" level=info msg="Using systemd socket activation to determine API endpoint"
Oct  2 07:55:37 np0005466012 podman[206348]: time="2025-10-02T11:55:37Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Oct  2 07:55:37 np0005466012 podman[206348]: @ - - [02/Oct/2025:11:55:37 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct  2 07:55:37 np0005466012 podman[206348]: time="2025-10-02T11:55:37Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  2 07:55:37 np0005466012 podman[206345]: 2025-10-02 11:55:37.570388204 +0000 UTC m=+0.055238737 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 07:55:37 np0005466012 systemd[1]: cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36-263b4924cde7be98.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:55:37 np0005466012 systemd[1]: cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36-263b4924cde7be98.service: Failed with result 'exit-code'.
Oct  2 07:55:37 np0005466012 podman[206348]: @ - - [02/Oct/2025:11:55:37 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22059 "" "Go-http-client/1.1"
Oct  2 07:55:37 np0005466012 podman_exporter[206336]: ts=2025-10-02T11:55:37.587Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct  2 07:55:37 np0005466012 podman_exporter[206336]: ts=2025-10-02T11:55:37.587Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct  2 07:55:37 np0005466012 podman_exporter[206336]: ts=2025-10-02T11:55:37.588Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct  2 07:55:38 np0005466012 python3.9[206535]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:55:38 np0005466012 systemd[1]: Stopping podman_exporter container...
Oct  2 07:55:38 np0005466012 podman[206348]: @ - - [02/Oct/2025:11:55:37 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Oct  2 07:55:38 np0005466012 systemd[1]: libpod-cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36.scope: Deactivated successfully.
Oct  2 07:55:38 np0005466012 podman[206539]: 2025-10-02 11:55:38.533007381 +0000 UTC m=+0.060328470 container died cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 07:55:38 np0005466012 systemd[1]: cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36-263b4924cde7be98.timer: Deactivated successfully.
Oct  2 07:55:38 np0005466012 systemd[1]: Stopped /usr/bin/podman healthcheck run cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36.
Oct  2 07:55:38 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36-userdata-shm.mount: Deactivated successfully.
Oct  2 07:55:38 np0005466012 systemd[1]: var-lib-containers-storage-overlay-b8ea2503542718004d3603654447868b32b9c7b723342dde9fce41ddf5630b60-merged.mount: Deactivated successfully.
Oct  2 07:55:39 np0005466012 podman[206539]: 2025-10-02 11:55:39.004659715 +0000 UTC m=+0.531980814 container cleanup cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 07:55:39 np0005466012 podman[206539]: podman_exporter
Oct  2 07:55:39 np0005466012 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  2 07:55:39 np0005466012 podman[206568]: podman_exporter
Oct  2 07:55:39 np0005466012 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Oct  2 07:55:39 np0005466012 systemd[1]: Stopped podman_exporter container.
Oct  2 07:55:39 np0005466012 systemd[1]: Starting podman_exporter container...
Oct  2 07:55:39 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:55:39 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8ea2503542718004d3603654447868b32b9c7b723342dde9fce41ddf5630b60/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:39 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8ea2503542718004d3603654447868b32b9c7b723342dde9fce41ddf5630b60/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:39 np0005466012 systemd[1]: Started /usr/bin/podman healthcheck run cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36.
Oct  2 07:55:39 np0005466012 podman[206581]: 2025-10-02 11:55:39.232432162 +0000 UTC m=+0.133178940 container init cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 07:55:39 np0005466012 podman_exporter[206596]: ts=2025-10-02T11:55:39.249Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct  2 07:55:39 np0005466012 podman_exporter[206596]: ts=2025-10-02T11:55:39.249Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct  2 07:55:39 np0005466012 podman_exporter[206596]: ts=2025-10-02T11:55:39.249Z caller=handler.go:94 level=info msg="enabled collectors"
Oct  2 07:55:39 np0005466012 podman[206348]: @ - - [02/Oct/2025:11:55:39 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct  2 07:55:39 np0005466012 podman_exporter[206596]: ts=2025-10-02T11:55:39.249Z caller=handler.go:105 level=info collector=container
Oct  2 07:55:39 np0005466012 podman[206348]: time="2025-10-02T11:55:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  2 07:55:39 np0005466012 podman[206581]: 2025-10-02 11:55:39.257440037 +0000 UTC m=+0.158186765 container start cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 07:55:39 np0005466012 podman[206581]: podman_exporter
Oct  2 07:55:39 np0005466012 systemd[1]: Started podman_exporter container.
Oct  2 07:55:39 np0005466012 podman[206348]: @ - - [02/Oct/2025:11:55:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22061 "" "Go-http-client/1.1"
Oct  2 07:55:39 np0005466012 podman_exporter[206596]: ts=2025-10-02T11:55:39.277Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct  2 07:55:39 np0005466012 podman_exporter[206596]: ts=2025-10-02T11:55:39.277Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct  2 07:55:39 np0005466012 podman_exporter[206596]: ts=2025-10-02T11:55:39.281Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct  2 07:55:39 np0005466012 podman[206606]: 2025-10-02 11:55:39.326293433 +0000 UTC m=+0.059858639 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 07:55:39 np0005466012 python3.9[206785]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:55:40 np0005466012 python3.9[206908]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759406139.497533-2209-58437111645689/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  2 07:55:41 np0005466012 podman[207008]: 2025-10-02 11:55:41.175729182 +0000 UTC m=+0.088316786 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 07:55:41 np0005466012 python3.9[207080]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Oct  2 07:55:42 np0005466012 python3.9[207232]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  2 07:55:43 np0005466012 python3[207384]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  2 07:55:45 np0005466012 podman[207397]: 2025-10-02 11:55:45.616803858 +0000 UTC m=+2.417470686 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  2 07:55:45 np0005466012 podman[207495]: 2025-10-02 11:55:45.736019051 +0000 UTC m=+0.042004106 container create a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, name=ubi9-minimal, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git)
Oct  2 07:55:45 np0005466012 podman[207495]: 2025-10-02 11:55:45.712737639 +0000 UTC m=+0.018722694 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  2 07:55:45 np0005466012 python3[207384]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  2 07:55:46 np0005466012 podman[207559]: 2025-10-02 11:55:46.135942572 +0000 UTC m=+0.052611644 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Oct  2 07:55:46 np0005466012 systemd[1]: 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0-2891faf48a2811de.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:55:46 np0005466012 systemd[1]: 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0-2891faf48a2811de.service: Failed with result 'exit-code'.
Oct  2 07:55:46 np0005466012 python3.9[207703]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:55:47 np0005466012 python3.9[207857]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:48 np0005466012 python3.9[208008]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759406147.4789712-2368-93992496422329/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:55:48 np0005466012 python3.9[208084]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  2 07:55:48 np0005466012 systemd[1]: Reloading.
Oct  2 07:55:48 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:48 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:49 np0005466012 python3.9[208194]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  2 07:55:49 np0005466012 systemd[1]: Reloading.
Oct  2 07:55:49 np0005466012 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  2 07:55:49 np0005466012 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  2 07:55:49 np0005466012 systemd[1]: Starting openstack_network_exporter container...
Oct  2 07:55:49 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:55:49 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0d61fdb110c64d1bc078f298b300aa4e169bafb626233e85f2e7133eaa811ea/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:49 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0d61fdb110c64d1bc078f298b300aa4e169bafb626233e85f2e7133eaa811ea/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:49 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0d61fdb110c64d1bc078f298b300aa4e169bafb626233e85f2e7133eaa811ea/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:49 np0005466012 systemd[1]: Started /usr/bin/podman healthcheck run a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410.
Oct  2 07:55:49 np0005466012 podman[208234]: 2025-10-02 11:55:49.95929157 +0000 UTC m=+0.138366326 container init a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64)
Oct  2 07:55:49 np0005466012 openstack_network_exporter[208249]: INFO    11:55:49 main.go:48: registering *bridge.Collector
Oct  2 07:55:49 np0005466012 openstack_network_exporter[208249]: INFO    11:55:49 main.go:48: registering *coverage.Collector
Oct  2 07:55:49 np0005466012 openstack_network_exporter[208249]: INFO    11:55:49 main.go:48: registering *datapath.Collector
Oct  2 07:55:49 np0005466012 openstack_network_exporter[208249]: INFO    11:55:49 main.go:48: registering *iface.Collector
Oct  2 07:55:49 np0005466012 openstack_network_exporter[208249]: INFO    11:55:49 main.go:48: registering *memory.Collector
Oct  2 07:55:49 np0005466012 openstack_network_exporter[208249]: INFO    11:55:49 main.go:48: registering *ovnnorthd.Collector
Oct  2 07:55:49 np0005466012 openstack_network_exporter[208249]: INFO    11:55:49 main.go:48: registering *ovn.Collector
Oct  2 07:55:49 np0005466012 openstack_network_exporter[208249]: INFO    11:55:49 main.go:48: registering *ovsdbserver.Collector
Oct  2 07:55:49 np0005466012 openstack_network_exporter[208249]: INFO    11:55:49 main.go:48: registering *pmd_perf.Collector
Oct  2 07:55:49 np0005466012 openstack_network_exporter[208249]: INFO    11:55:49 main.go:48: registering *pmd_rxq.Collector
Oct  2 07:55:49 np0005466012 openstack_network_exporter[208249]: INFO    11:55:49 main.go:48: registering *vswitch.Collector
Oct  2 07:55:49 np0005466012 openstack_network_exporter[208249]: NOTICE  11:55:49 main.go:76: listening on https://:9105/metrics
Oct  2 07:55:49 np0005466012 podman[208234]: 2025-10-02 11:55:49.991836008 +0000 UTC m=+0.170910764 container start a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, version=9.6, architecture=x86_64)
Oct  2 07:55:49 np0005466012 podman[208234]: openstack_network_exporter
Oct  2 07:55:50 np0005466012 systemd[1]: Started openstack_network_exporter container.
Oct  2 07:55:50 np0005466012 podman[208259]: 2025-10-02 11:55:50.082812608 +0000 UTC m=+0.082353333 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 07:55:50 np0005466012 podman[208406]: 2025-10-02 11:55:50.510069889 +0000 UTC m=+0.051421354 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct  2 07:55:50 np0005466012 python3.9[208454]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  2 07:55:50 np0005466012 systemd[1]: Stopping openstack_network_exporter container...
Oct  2 07:55:50 np0005466012 systemd[1]: libpod-a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410.scope: Deactivated successfully.
Oct  2 07:55:50 np0005466012 podman[208458]: 2025-10-02 11:55:50.914411655 +0000 UTC m=+0.043012710 container died a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, managed_by=edpm_ansible)
Oct  2 07:55:50 np0005466012 systemd[1]: a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410-59611440963193e9.timer: Deactivated successfully.
Oct  2 07:55:50 np0005466012 systemd[1]: Stopped /usr/bin/podman healthcheck run a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410.
Oct  2 07:55:50 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410-userdata-shm.mount: Deactivated successfully.
Oct  2 07:55:50 np0005466012 systemd[1]: var-lib-containers-storage-overlay-b0d61fdb110c64d1bc078f298b300aa4e169bafb626233e85f2e7133eaa811ea-merged.mount: Deactivated successfully.
Oct  2 07:55:52 np0005466012 podman[208458]: 2025-10-02 11:55:52.265239019 +0000 UTC m=+1.393840074 container cleanup a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Oct  2 07:55:52 np0005466012 podman[208458]: openstack_network_exporter
Oct  2 07:55:52 np0005466012 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  2 07:55:52 np0005466012 podman[208486]: openstack_network_exporter
Oct  2 07:55:52 np0005466012 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Oct  2 07:55:52 np0005466012 systemd[1]: Stopped openstack_network_exporter container.
Oct  2 07:55:52 np0005466012 systemd[1]: Starting openstack_network_exporter container...
Oct  2 07:55:52 np0005466012 systemd[1]: Started libcrun container.
Oct  2 07:55:52 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0d61fdb110c64d1bc078f298b300aa4e169bafb626233e85f2e7133eaa811ea/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:52 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0d61fdb110c64d1bc078f298b300aa4e169bafb626233e85f2e7133eaa811ea/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:52 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0d61fdb110c64d1bc078f298b300aa4e169bafb626233e85f2e7133eaa811ea/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  2 07:55:52 np0005466012 systemd[1]: Started /usr/bin/podman healthcheck run a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410.
Oct  2 07:55:52 np0005466012 podman[208499]: 2025-10-02 11:55:52.479189222 +0000 UTC m=+0.127655408 container init a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, 
url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible)
Oct  2 07:55:52 np0005466012 openstack_network_exporter[208515]: INFO    11:55:52 main.go:48: registering *bridge.Collector
Oct  2 07:55:52 np0005466012 openstack_network_exporter[208515]: INFO    11:55:52 main.go:48: registering *coverage.Collector
Oct  2 07:55:52 np0005466012 openstack_network_exporter[208515]: INFO    11:55:52 main.go:48: registering *datapath.Collector
Oct  2 07:55:52 np0005466012 openstack_network_exporter[208515]: INFO    11:55:52 main.go:48: registering *iface.Collector
Oct  2 07:55:52 np0005466012 openstack_network_exporter[208515]: INFO    11:55:52 main.go:48: registering *memory.Collector
Oct  2 07:55:52 np0005466012 openstack_network_exporter[208515]: INFO    11:55:52 main.go:48: registering *ovnnorthd.Collector
Oct  2 07:55:52 np0005466012 openstack_network_exporter[208515]: INFO    11:55:52 main.go:48: registering *ovn.Collector
Oct  2 07:55:52 np0005466012 openstack_network_exporter[208515]: INFO    11:55:52 main.go:48: registering *ovsdbserver.Collector
Oct  2 07:55:52 np0005466012 openstack_network_exporter[208515]: INFO    11:55:52 main.go:48: registering *pmd_perf.Collector
Oct  2 07:55:52 np0005466012 openstack_network_exporter[208515]: INFO    11:55:52 main.go:48: registering *pmd_rxq.Collector
Oct  2 07:55:52 np0005466012 openstack_network_exporter[208515]: INFO    11:55:52 main.go:48: registering *vswitch.Collector
Oct  2 07:55:52 np0005466012 openstack_network_exporter[208515]: NOTICE  11:55:52 main.go:76: listening on https://:9105/metrics
Oct  2 07:55:52 np0005466012 podman[208499]: 2025-10-02 11:55:52.508190263 +0000 UTC m=+0.156656429 container start a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public)
Oct  2 07:55:52 np0005466012 podman[208499]: openstack_network_exporter
Oct  2 07:55:52 np0005466012 systemd[1]: Started openstack_network_exporter container.
Oct  2 07:55:52 np0005466012 podman[208525]: 2025-10-02 11:55:52.569499556 +0000 UTC m=+0.052139071 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, distribution-scope=public, release=1755695350, config_id=edpm, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 07:55:53 np0005466012 python3.9[208697]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  2 07:55:57 np0005466012 podman[208722]: 2025-10-02 11:55:57.141712143 +0000 UTC m=+0.062540184 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 07:55:58 np0005466012 podman[208742]: 2025-10-02 11:55:58.139852469 +0000 UTC m=+0.053340181 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 07:56:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:56:02.105 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:56:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:56:02.105 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:56:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:56:02.105 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:56:03 np0005466012 nova_compute[192063]: 2025-10-02 11:56:03.193 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:03 np0005466012 nova_compute[192063]: 2025-10-02 11:56:03.194 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:03 np0005466012 nova_compute[192063]: 2025-10-02 11:56:03.212 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:03 np0005466012 nova_compute[192063]: 2025-10-02 11:56:03.212 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:03 np0005466012 nova_compute[192063]: 2025-10-02 11:56:03.212 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:03 np0005466012 nova_compute[192063]: 2025-10-02 11:56:03.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.835 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.835 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.835 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.836 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.836 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.857 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.857 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.858 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.858 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.994 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.995 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5910MB free_disk=73.5044059753418GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.995 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:56:04 np0005466012 nova_compute[192063]: 2025-10-02 11:56:04.995 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:56:05 np0005466012 nova_compute[192063]: 2025-10-02 11:56:05.068 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:56:05 np0005466012 nova_compute[192063]: 2025-10-02 11:56:05.069 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:56:05 np0005466012 nova_compute[192063]: 2025-10-02 11:56:05.100 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:56:05 np0005466012 nova_compute[192063]: 2025-10-02 11:56:05.114 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:56:05 np0005466012 nova_compute[192063]: 2025-10-02 11:56:05.115 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:56:05 np0005466012 nova_compute[192063]: 2025-10-02 11:56:05.115 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:56:08 np0005466012 podman[208767]: 2025-10-02 11:56:08.166116627 +0000 UTC m=+0.079339490 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 07:56:10 np0005466012 podman[208793]: 2025-10-02 11:56:10.134454072 +0000 UTC m=+0.054660782 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 07:56:12 np0005466012 podman[208817]: 2025-10-02 11:56:12.143496222 +0000 UTC m=+0.062887032 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:56:17 np0005466012 podman[208837]: 2025-10-02 11:56:17.124466083 +0000 UTC m=+0.045228355 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, 
org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true)
Oct  2 07:56:17 np0005466012 systemd[1]: 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0-2891faf48a2811de.service: Main process exited, code=exited, status=1/FAILURE
Oct  2 07:56:17 np0005466012 systemd[1]: 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0-2891faf48a2811de.service: Failed with result 'exit-code'.
Oct  2 07:56:21 np0005466012 podman[208856]: 2025-10-02 11:56:21.143364771 +0000 UTC m=+0.065099505 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:56:23 np0005466012 podman[208877]: 2025-10-02 11:56:23.135777838 +0000 UTC m=+0.057832349 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal)
Oct  2 07:56:26 np0005466012 python3.9[209026]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Oct  2 07:56:27 np0005466012 python3.9[209191]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:27 np0005466012 systemd[1]: Started libpod-conmon-fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502.scope.
Oct  2 07:56:27 np0005466012 podman[209192]: 2025-10-02 11:56:27.126491844 +0000 UTC m=+0.080250641 container exec fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  2 07:56:27 np0005466012 podman[209192]: 2025-10-02 11:56:27.158211831 +0000 UTC m=+0.111970608 container exec_died fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 07:56:27 np0005466012 systemd[1]: libpod-conmon-fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502.scope: Deactivated successfully.
Oct  2 07:56:27 np0005466012 podman[209223]: 2025-10-02 11:56:27.269659526 +0000 UTC m=+0.058675160 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 07:56:27 np0005466012 python3.9[209395]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:27 np0005466012 systemd[1]: Started libpod-conmon-fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502.scope.
Oct  2 07:56:27 np0005466012 podman[209396]: 2025-10-02 11:56:27.91246233 +0000 UTC m=+0.073728075 container exec fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2)
Oct  2 07:56:27 np0005466012 podman[209396]: 2025-10-02 11:56:27.946078873 +0000 UTC m=+0.107344608 container exec_died fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 07:56:27 np0005466012 systemd[1]: libpod-conmon-fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502.scope: Deactivated successfully.
Oct  2 07:56:28 np0005466012 podman[209551]: 2025-10-02 11:56:28.429552673 +0000 UTC m=+0.060959056 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 07:56:28 np0005466012 python3.9[209596]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:56:29 np0005466012 python3.9[209755]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Oct  2 07:56:30 np0005466012 python3.9[209920]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:30 np0005466012 systemd[1]: Started libpod-conmon-9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf.scope.
Oct  2 07:56:30 np0005466012 podman[209921]: 2025-10-02 11:56:30.354909048 +0000 UTC m=+0.086554254 container exec 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 07:56:30 np0005466012 podman[209921]: 2025-10-02 11:56:30.385745853 +0000 UTC m=+0.117391039 container exec_died 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:56:30 np0005466012 systemd[1]: libpod-conmon-9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf.scope: Deactivated successfully.
Oct  2 07:56:31 np0005466012 python3.9[210103]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:31 np0005466012 systemd[1]: Started libpod-conmon-9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf.scope.
Oct  2 07:56:31 np0005466012 podman[210104]: 2025-10-02 11:56:31.130926132 +0000 UTC m=+0.077107185 container exec 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 07:56:31 np0005466012 podman[210104]: 2025-10-02 11:56:31.164046823 +0000 UTC m=+0.110227836 container exec_died 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 07:56:31 np0005466012 systemd[1]: libpod-conmon-9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf.scope: Deactivated successfully.
Oct  2 07:56:31 np0005466012 python3.9[210289]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:56:32 np0005466012 python3.9[210441]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Oct  2 07:56:33 np0005466012 python3.9[210607]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:33 np0005466012 systemd[1]: Started libpod-conmon-600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf.scope.
Oct  2 07:56:33 np0005466012 podman[210608]: 2025-10-02 11:56:33.473509474 +0000 UTC m=+0.175728726 container exec 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:56:33 np0005466012 podman[210628]: 2025-10-02 11:56:33.672978789 +0000 UTC m=+0.185526080 container exec_died 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS)
Oct  2 07:56:33 np0005466012 podman[210608]: 2025-10-02 11:56:33.69561698 +0000 UTC m=+0.397836152 container exec_died 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 07:56:33 np0005466012 systemd[1]: libpod-conmon-600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf.scope: Deactivated successfully.
Oct  2 07:56:34 np0005466012 python3.9[210793]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:34 np0005466012 systemd[1]: Started libpod-conmon-600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf.scope.
Oct  2 07:56:35 np0005466012 podman[210794]: 2025-10-02 11:56:35.017239394 +0000 UTC m=+0.362206991 container exec 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Oct  2 07:56:35 np0005466012 podman[210814]: 2025-10-02 11:56:35.09897822 +0000 UTC m=+0.065982412 container exec_died 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:56:35 np0005466012 podman[210794]: 2025-10-02 11:56:35.185159728 +0000 UTC m=+0.530127305 container exec_died 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  2 07:56:35 np0005466012 systemd[1]: libpod-conmon-600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf.scope: Deactivated successfully.
Oct  2 07:56:36 np0005466012 python3.9[210978]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:56:37 np0005466012 python3.9[211130]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Oct  2 07:56:37 np0005466012 python3.9[211295]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:38 np0005466012 systemd[1]: Started libpod-conmon-6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4.scope.
Oct  2 07:56:38 np0005466012 podman[211296]: 2025-10-02 11:56:38.143403709 +0000 UTC m=+0.275247103 container exec 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2)
Oct  2 07:56:38 np0005466012 podman[211296]: 2025-10-02 11:56:38.332141234 +0000 UTC m=+0.463984588 container exec_died 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 07:56:38 np0005466012 systemd[1]: libpod-conmon-6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4.scope: Deactivated successfully.
Oct  2 07:56:38 np0005466012 podman[211327]: 2025-10-02 11:56:38.584140048 +0000 UTC m=+0.113747763 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller)
Oct  2 07:56:39 np0005466012 python3.9[211504]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:39 np0005466012 systemd[1]: Started libpod-conmon-6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4.scope.
Oct  2 07:56:39 np0005466012 podman[211505]: 2025-10-02 11:56:39.234125337 +0000 UTC m=+0.114096512 container exec 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 07:56:39 np0005466012 podman[211505]: 2025-10-02 11:56:39.266581214 +0000 UTC m=+0.146552359 container exec_died 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:56:39 np0005466012 systemd[1]: libpod-conmon-6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4.scope: Deactivated successfully.
Oct  2 07:56:40 np0005466012 python3.9[211688]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:56:40 np0005466012 podman[211812]: 2025-10-02 11:56:40.522532033 +0000 UTC m=+0.059820505 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 07:56:40 np0005466012 python3.9[211864]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Oct  2 07:56:41 np0005466012 python3.9[212029]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:41 np0005466012 systemd[1]: Started libpod-conmon-6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0.scope.
Oct  2 07:56:41 np0005466012 podman[212030]: 2025-10-02 11:56:41.56873193 +0000 UTC m=+0.080498634 container exec 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 07:56:41 np0005466012 podman[212030]: 2025-10-02 11:56:41.604138446 +0000 UTC m=+0.115905100 container exec_died 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:56:41 np0005466012 systemd[1]: libpod-conmon-6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0.scope: Deactivated successfully.
Oct  2 07:56:42 np0005466012 python3.9[212212]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:42 np0005466012 systemd[1]: Started libpod-conmon-6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0.scope.
Oct  2 07:56:42 np0005466012 podman[212213]: 2025-10-02 11:56:42.404802534 +0000 UTC m=+0.073600888 container exec 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm)
Oct  2 07:56:42 np0005466012 podman[212213]: 2025-10-02 11:56:42.433361845 +0000 UTC m=+0.102160179 container exec_died 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 07:56:42 np0005466012 systemd[1]: libpod-conmon-6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0.scope: Deactivated successfully.
Oct  2 07:56:42 np0005466012 podman[212229]: 2025-10-02 11:56:42.465821171 +0000 UTC m=+0.062710704 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 07:56:43 np0005466012 python3.9[212412]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:56:43 np0005466012 python3.9[212564]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Oct  2 07:56:44 np0005466012 python3.9[212729]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:44 np0005466012 systemd[1]: Started libpod-conmon-a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798.scope.
Oct  2 07:56:44 np0005466012 podman[212730]: 2025-10-02 11:56:44.566636143 +0000 UTC m=+0.067727941 container exec a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 07:56:44 np0005466012 podman[212730]: 2025-10-02 11:56:44.596659083 +0000 UTC m=+0.097750871 container exec_died a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 07:56:44 np0005466012 systemd[1]: libpod-conmon-a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798.scope: Deactivated successfully.
Oct  2 07:56:45 np0005466012 python3.9[212912]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:45 np0005466012 systemd[1]: Started libpod-conmon-a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798.scope.
Oct  2 07:56:45 np0005466012 podman[212913]: 2025-10-02 11:56:45.312283384 +0000 UTC m=+0.062692214 container exec a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 07:56:45 np0005466012 podman[212913]: 2025-10-02 11:56:45.342356406 +0000 UTC m=+0.092765246 container exec_died a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 07:56:45 np0005466012 systemd[1]: libpod-conmon-a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798.scope: Deactivated successfully.
Oct  2 07:56:45 np0005466012 python3.9[213097]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:56:46 np0005466012 python3.9[213249]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Oct  2 07:56:47 np0005466012 podman[213386]: 2025-10-02 11:56:47.25182463 +0000 UTC m=+0.066402994 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:56:47 np0005466012 python3.9[213434]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:47 np0005466012 systemd[1]: Started libpod-conmon-cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36.scope.
Oct  2 07:56:47 np0005466012 podman[213435]: 2025-10-02 11:56:47.529679862 +0000 UTC m=+0.079374264 container exec cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 07:56:47 np0005466012 podman[213435]: 2025-10-02 11:56:47.558659805 +0000 UTC m=+0.108354187 container exec_died cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 07:56:47 np0005466012 systemd[1]: libpod-conmon-cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36.scope: Deactivated successfully.
Oct  2 07:56:48 np0005466012 python3.9[213616]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:48 np0005466012 systemd[1]: Started libpod-conmon-cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36.scope.
Oct  2 07:56:48 np0005466012 podman[213617]: 2025-10-02 11:56:48.291677825 +0000 UTC m=+0.089015234 container exec cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 07:56:48 np0005466012 podman[213617]: 2025-10-02 11:56:48.32332065 +0000 UTC m=+0.120658079 container exec_died cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 07:56:48 np0005466012 systemd[1]: libpod-conmon-cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36.scope: Deactivated successfully.
Oct  2 07:56:48 np0005466012 python3.9[213799]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:56:49 np0005466012 python3.9[213951]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Oct  2 07:56:50 np0005466012 python3.9[214117]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:50 np0005466012 systemd[1]: Started libpod-conmon-a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410.scope.
Oct  2 07:56:50 np0005466012 podman[214118]: 2025-10-02 11:56:50.410475252 +0000 UTC m=+0.077185125 container exec a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Oct  2 07:56:50 np0005466012 podman[214118]: 2025-10-02 11:56:50.441266914 +0000 UTC m=+0.107976797 container exec_died a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  2 07:56:50 np0005466012 systemd[1]: libpod-conmon-a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410.scope: Deactivated successfully.
Oct  2 07:56:51 np0005466012 python3.9[214302]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  2 07:56:51 np0005466012 systemd[1]: Started libpod-conmon-a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410.scope.
Oct  2 07:56:51 np0005466012 podman[214303]: 2025-10-02 11:56:51.378514918 +0000 UTC m=+0.172346454 container exec a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 07:56:51 np0005466012 podman[214323]: 2025-10-02 11:56:51.476875554 +0000 UTC m=+0.085825068 container exec_died a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, config_id=edpm)
Oct  2 07:56:51 np0005466012 podman[214303]: 2025-10-02 11:56:51.51747219 +0000 UTC m=+0.311303726 container exec_died a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git)
Oct  2 07:56:51 np0005466012 systemd[1]: libpod-conmon-a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410.scope: Deactivated successfully.
Oct  2 07:56:51 np0005466012 podman[214319]: 2025-10-02 11:56:51.621808507 +0000 UTC m=+0.229576169 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 07:56:52 np0005466012 python3.9[214507]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:56:54 np0005466012 podman[214532]: 2025-10-02 11:56:54.133576655 +0000 UTC m=+0.053499447 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  2 07:56:58 np0005466012 podman[214553]: 2025-10-02 11:56:58.136408739 +0000 UTC m=+0.050549996 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:56:59 np0005466012 podman[214573]: 2025-10-02 11:56:59.139943015 +0000 UTC m=+0.055850849 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 07:57:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:57:02.106 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:57:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:57:02.106 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:57:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:57:02.106 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:57:03 np0005466012 nova_compute[192063]: 2025-10-02 11:57:03.111 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:03 np0005466012 nova_compute[192063]: 2025-10-02 11:57:03.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:03 np0005466012 nova_compute[192063]: 2025-10-02 11:57:03.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:04 np0005466012 nova_compute[192063]: 2025-10-02 11:57:04.824 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:04 np0005466012 nova_compute[192063]: 2025-10-02 11:57:04.824 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:04 np0005466012 nova_compute[192063]: 2025-10-02 11:57:04.824 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:04 np0005466012 nova_compute[192063]: 2025-10-02 11:57:04.825 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:57:04 np0005466012 nova_compute[192063]: 2025-10-02 11:57:04.825 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:04 np0005466012 nova_compute[192063]: 2025-10-02 11:57:04.848 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:57:04 np0005466012 nova_compute[192063]: 2025-10-02 11:57:04.849 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:57:04 np0005466012 nova_compute[192063]: 2025-10-02 11:57:04.849 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:57:04 np0005466012 nova_compute[192063]: 2025-10-02 11:57:04.850 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:57:05 np0005466012 nova_compute[192063]: 2025-10-02 11:57:05.005 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:57:05 np0005466012 nova_compute[192063]: 2025-10-02 11:57:05.006 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5980MB free_disk=73.5024528503418GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:57:05 np0005466012 nova_compute[192063]: 2025-10-02 11:57:05.006 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:57:05 np0005466012 nova_compute[192063]: 2025-10-02 11:57:05.006 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:57:05 np0005466012 nova_compute[192063]: 2025-10-02 11:57:05.073 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:57:05 np0005466012 nova_compute[192063]: 2025-10-02 11:57:05.073 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:57:05 np0005466012 nova_compute[192063]: 2025-10-02 11:57:05.099 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:57:05 np0005466012 nova_compute[192063]: 2025-10-02 11:57:05.114 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:57:05 np0005466012 nova_compute[192063]: 2025-10-02 11:57:05.115 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:57:05 np0005466012 nova_compute[192063]: 2025-10-02 11:57:05.115 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:57:06 np0005466012 nova_compute[192063]: 2025-10-02 11:57:06.113 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:06 np0005466012 nova_compute[192063]: 2025-10-02 11:57:06.113 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:57:06 np0005466012 nova_compute[192063]: 2025-10-02 11:57:06.114 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:57:06 np0005466012 nova_compute[192063]: 2025-10-02 11:57:06.138 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:57:06 np0005466012 nova_compute[192063]: 2025-10-02 11:57:06.138 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:57:09 np0005466012 podman[214597]: 2025-10-02 11:57:09.153448733 +0000 UTC m=+0.071871640 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Oct  2 07:57:11 np0005466012 podman[214623]: 2025-10-02 11:57:11.125684323 +0000 UTC m=+0.046751032 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 07:57:13 np0005466012 podman[214648]: 2025-10-02 11:57:13.128180369 +0000 UTC m=+0.047056531 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:57:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:57:17 np0005466012 podman[214766]: 2025-10-02 11:57:17.799534944 +0000 UTC m=+0.057038442 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 07:57:18 np0005466012 python3.9[214815]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:18 np0005466012 python3.9[214967]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:19 np0005466012 python3.9[215090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759406238.3088198-3310-222295706660359/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:20 np0005466012 python3.9[215242]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:21 np0005466012 python3.9[215394]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:21 np0005466012 podman[215444]: 2025-10-02 11:57:21.942973123 +0000 UTC m=+0.082407016 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:57:22 np0005466012 python3.9[215490]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:22 np0005466012 python3.9[215643]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:23 np0005466012 python3.9[215721]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.lypebgr0 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:24 np0005466012 python3.9[215873]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:24 np0005466012 podman[215923]: 2025-10-02 11:57:24.758436561 +0000 UTC m=+0.045591053 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 07:57:24 np0005466012 python3.9[215970]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:25 np0005466012 python3.9[216125]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:57:26 np0005466012 python3[216278]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  2 07:57:27 np0005466012 python3.9[216430]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:28 np0005466012 podman[216508]: 2025-10-02 11:57:28.266495637 +0000 UTC m=+0.080543016 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  2 07:57:28 np0005466012 python3.9[216509]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:29 np0005466012 python3.9[216679]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:29 np0005466012 podman[216729]: 2025-10-02 11:57:29.666964499 +0000 UTC m=+0.116531098 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 07:57:29 np0005466012 python3.9[216776]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:30 np0005466012 python3.9[216933]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:31 np0005466012 python3.9[217011]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:32 np0005466012 python3.9[217163]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:32 np0005466012 python3.9[217241]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:33 np0005466012 python3.9[217393]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  2 07:57:34 np0005466012 python3.9[217518]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759406252.8208091-3685-256546887744142/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:34 np0005466012 python3.9[217670]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:35 np0005466012 python3.9[217822]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:57:36 np0005466012 python3.9[217977]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:37 np0005466012 python3.9[218129]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:57:38 np0005466012 python3.9[218282]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  2 07:57:38 np0005466012 systemd[1]: packagekit.service: Deactivated successfully.
Oct  2 07:57:38 np0005466012 python3.9[218436]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  2 07:57:39 np0005466012 podman[218563]: 2025-10-02 11:57:39.791543126 +0000 UTC m=+0.082930118 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 07:57:39 np0005466012 python3.9[218610]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  2 07:57:40 np0005466012 systemd[1]: session-28.scope: Deactivated successfully.
Oct  2 07:57:40 np0005466012 systemd[1]: session-28.scope: Consumed 1min 41.694s CPU time.
Oct  2 07:57:40 np0005466012 systemd-logind[827]: Session 28 logged out. Waiting for processes to exit.
Oct  2 07:57:40 np0005466012 systemd-logind[827]: Removed session 28.
Oct  2 07:57:42 np0005466012 podman[218641]: 2025-10-02 11:57:42.13310092 +0000 UTC m=+0.050904776 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 07:57:44 np0005466012 podman[218666]: 2025-10-02 11:57:44.166334418 +0000 UTC m=+0.077852311 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 07:57:48 np0005466012 podman[218686]: 2025-10-02 11:57:48.180967711 +0000 UTC m=+0.103674254 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:57:52 np0005466012 podman[218707]: 2025-10-02 11:57:52.173850911 +0000 UTC m=+0.081610911 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 07:57:55 np0005466012 podman[218728]: 2025-10-02 11:57:55.181129715 +0000 UTC m=+0.082773793 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Oct  2 07:57:59 np0005466012 podman[218750]: 2025-10-02 11:57:59.181521146 +0000 UTC m=+0.086769171 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 07:58:00 np0005466012 podman[218770]: 2025-10-02 11:58:00.14578743 +0000 UTC m=+0.064873782 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 07:58:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:58:02.106 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:58:02.107 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:58:02.107 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:03 np0005466012 nova_compute[192063]: 2025-10-02 11:58:03.844 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:04 np0005466012 nova_compute[192063]: 2025-10-02 11:58:04.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:05 np0005466012 nova_compute[192063]: 2025-10-02 11:58:05.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:05 np0005466012 nova_compute[192063]: 2025-10-02 11:58:05.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:05 np0005466012 nova_compute[192063]: 2025-10-02 11:58:05.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:05 np0005466012 nova_compute[192063]: 2025-10-02 11:58:05.824 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:58:06 np0005466012 nova_compute[192063]: 2025-10-02 11:58:06.819 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:06 np0005466012 nova_compute[192063]: 2025-10-02 11:58:06.838 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:06 np0005466012 nova_compute[192063]: 2025-10-02 11:58:06.839 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:58:06 np0005466012 nova_compute[192063]: 2025-10-02 11:58:06.839 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:58:06 np0005466012 nova_compute[192063]: 2025-10-02 11:58:06.851 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:58:06 np0005466012 nova_compute[192063]: 2025-10-02 11:58:06.851 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:06 np0005466012 nova_compute[192063]: 2025-10-02 11:58:06.851 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:06 np0005466012 nova_compute[192063]: 2025-10-02 11:58:06.908 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:06 np0005466012 nova_compute[192063]: 2025-10-02 11:58:06.909 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:06 np0005466012 nova_compute[192063]: 2025-10-02 11:58:06.909 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:06 np0005466012 nova_compute[192063]: 2025-10-02 11:58:06.909 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:58:07 np0005466012 nova_compute[192063]: 2025-10-02 11:58:07.069 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:58:07 np0005466012 nova_compute[192063]: 2025-10-02 11:58:07.070 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6019MB free_disk=73.5024299621582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:58:07 np0005466012 nova_compute[192063]: 2025-10-02 11:58:07.070 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:58:07 np0005466012 nova_compute[192063]: 2025-10-02 11:58:07.070 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:58:07 np0005466012 nova_compute[192063]: 2025-10-02 11:58:07.134 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:58:07 np0005466012 nova_compute[192063]: 2025-10-02 11:58:07.134 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:58:07 np0005466012 nova_compute[192063]: 2025-10-02 11:58:07.201 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:58:07 np0005466012 nova_compute[192063]: 2025-10-02 11:58:07.225 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:58:07 np0005466012 nova_compute[192063]: 2025-10-02 11:58:07.228 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:58:07 np0005466012 nova_compute[192063]: 2025-10-02 11:58:07.228 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:58:08 np0005466012 nova_compute[192063]: 2025-10-02 11:58:08.201 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:58:10 np0005466012 podman[218795]: 2025-10-02 11:58:10.159734731 +0000 UTC m=+0.078552029 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:58:13 np0005466012 podman[218822]: 2025-10-02 11:58:13.143674099 +0000 UTC m=+0.062135590 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 07:58:15 np0005466012 podman[218846]: 2025-10-02 11:58:15.15508008 +0000 UTC m=+0.067737250 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, 
tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:58:19 np0005466012 podman[218867]: 2025-10-02 11:58:19.166091435 +0000 UTC m=+0.069273210 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 07:58:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:58:19.890 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 07:58:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:58:19.890 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 07:58:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:58:19.891 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:58:23 np0005466012 podman[218887]: 2025-10-02 11:58:23.132616618 +0000 UTC m=+0.055280414 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3)
Oct  2 07:58:26 np0005466012 podman[218907]: 2025-10-02 11:58:26.2024105 +0000 UTC m=+0.114379791 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, 
container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, config_id=edpm)
Oct  2 07:58:30 np0005466012 podman[218929]: 2025-10-02 11:58:30.140046437 +0000 UTC m=+0.060608738 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 07:58:31 np0005466012 podman[218949]: 2025-10-02 11:58:31.139997459 +0000 UTC m=+0.060840505 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 07:58:41 np0005466012 podman[218975]: 2025-10-02 11:58:41.213734963 +0000 UTC m=+0.124838356 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:58:44 np0005466012 podman[219002]: 2025-10-02 11:58:44.164801981 +0000 UTC m=+0.072464478 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 07:58:46 np0005466012 podman[219026]: 2025-10-02 11:58:46.144395108 +0000 UTC m=+0.061680068 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 07:58:50 np0005466012 podman[219045]: 2025-10-02 11:58:50.13700622 +0000 UTC m=+0.054710770 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm)
Oct  2 07:58:54 np0005466012 podman[219067]: 2025-10-02 11:58:54.199889871 +0000 UTC m=+0.111704493 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 07:58:57 np0005466012 podman[219087]: 2025-10-02 11:58:57.147030164 +0000 UTC m=+0.064587146 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  2 07:59:01 np0005466012 podman[219108]: 2025-10-02 11:59:01.175966772 +0000 UTC m=+0.077077961 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  2 07:59:01 np0005466012 podman[219130]: 2025-10-02 11:59:01.286992726 +0000 UTC m=+0.077632427 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 07:59:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:59:02.107 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:59:02.108 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:59:02.108 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:02 np0005466012 nova_compute[192063]: 2025-10-02 11:59:02.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:02 np0005466012 nova_compute[192063]: 2025-10-02 11:59:02.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 07:59:02 np0005466012 nova_compute[192063]: 2025-10-02 11:59:02.913 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 07:59:02 np0005466012 nova_compute[192063]: 2025-10-02 11:59:02.915 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:02 np0005466012 nova_compute[192063]: 2025-10-02 11:59:02.915 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 07:59:02 np0005466012 nova_compute[192063]: 2025-10-02 11:59:02.930 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:04 np0005466012 nova_compute[192063]: 2025-10-02 11:59:04.939 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:04 np0005466012 nova_compute[192063]: 2025-10-02 11:59:04.940 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:05 np0005466012 nova_compute[192063]: 2025-10-02 11:59:05.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:05 np0005466012 nova_compute[192063]: 2025-10-02 11:59:05.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:05 np0005466012 nova_compute[192063]: 2025-10-02 11:59:05.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:05 np0005466012 nova_compute[192063]: 2025-10-02 11:59:05.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 07:59:06 np0005466012 nova_compute[192063]: 2025-10-02 11:59:06.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:06 np0005466012 nova_compute[192063]: 2025-10-02 11:59:06.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 07:59:06 np0005466012 nova_compute[192063]: 2025-10-02 11:59:06.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.063 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.064 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.065 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.100 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.101 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.101 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.102 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.288 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.289 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6056MB free_disk=73.50253677368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.289 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.289 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.359 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.359 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.380 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.393 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.395 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 07:59:07 np0005466012 nova_compute[192063]: 2025-10-02 11:59:07.395 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 07:59:10 np0005466012 nova_compute[192063]: 2025-10-02 11:59:10.153 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 07:59:12 np0005466012 podman[219154]: 2025-10-02 11:59:12.155625462 +0000 UTC m=+0.072992282 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 07:59:15 np0005466012 podman[219181]: 2025-10-02 11:59:15.193107492 +0000 UTC m=+0.099572726 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 11:59:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 07:59:17 np0005466012 podman[219205]: 2025-10-02 11:59:17.16938615 +0000 UTC m=+0.075716265 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:59:21 np0005466012 podman[219225]: 2025-10-02 11:59:21.160732519 +0000 UTC m=+0.074948224 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  2 07:59:25 np0005466012 podman[219245]: 2025-10-02 11:59:25.158432828 +0000 UTC m=+0.072530910 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 07:59:28 np0005466012 podman[219264]: 2025-10-02 11:59:28.139648727 +0000 UTC m=+0.062373617 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6)
Oct  2 07:59:32 np0005466012 podman[219285]: 2025-10-02 11:59:32.145791232 +0000 UTC m=+0.058692357 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:59:32 np0005466012 podman[219286]: 2025-10-02 11:59:32.152465951 +0000 UTC m=+0.058351008 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 07:59:43 np0005466012 podman[219333]: 2025-10-02 11:59:43.203071759 +0000 UTC m=+0.121854606 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller)
Oct  2 07:59:46 np0005466012 podman[219361]: 2025-10-02 11:59:46.159280015 +0000 UTC m=+0.070769476 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 07:59:48 np0005466012 podman[219386]: 2025-10-02 11:59:48.145914 +0000 UTC m=+0.062913028 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 07:59:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:59:51.469 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 07:59:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:59:51.471 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 07:59:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 11:59:51.472 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 07:59:52 np0005466012 podman[219408]: 2025-10-02 11:59:52.151092186 +0000 UTC m=+0.069234926 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 07:59:56 np0005466012 podman[219429]: 2025-10-02 11:59:56.168436464 +0000 UTC m=+0.082419676 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001)
Oct  2 07:59:59 np0005466012 podman[219452]: 2025-10-02 11:59:59.165335871 +0000 UTC m=+0.081960484 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc.)
Oct  2 08:00:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:02.108 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:02.110 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:02.110 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:03 np0005466012 podman[219475]: 2025-10-02 12:00:03.144732753 +0000 UTC m=+0.055523963 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:00:03 np0005466012 podman[219474]: 2025-10-02 12:00:03.164101016 +0000 UTC m=+0.071138616 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid)
Oct  2 08:00:05 np0005466012 nova_compute[192063]: 2025-10-02 12:00:05.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:05 np0005466012 nova_compute[192063]: 2025-10-02 12:00:05.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:06 np0005466012 nova_compute[192063]: 2025-10-02 12:00:06.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:06 np0005466012 nova_compute[192063]: 2025-10-02 12:00:06.851 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:06 np0005466012 nova_compute[192063]: 2025-10-02 12:00:06.851 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:06 np0005466012 nova_compute[192063]: 2025-10-02 12:00:06.852 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:00:07 np0005466012 nova_compute[192063]: 2025-10-02 12:00:07.828 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:07 np0005466012 nova_compute[192063]: 2025-10-02 12:00:07.828 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:00:07 np0005466012 nova_compute[192063]: 2025-10-02 12:00:07.829 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:00:07 np0005466012 nova_compute[192063]: 2025-10-02 12:00:07.843 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:00:07 np0005466012 nova_compute[192063]: 2025-10-02 12:00:07.843 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:07 np0005466012 nova_compute[192063]: 2025-10-02 12:00:07.843 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:08 np0005466012 nova_compute[192063]: 2025-10-02 12:00:08.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:08 np0005466012 nova_compute[192063]: 2025-10-02 12:00:08.844 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:08 np0005466012 nova_compute[192063]: 2025-10-02 12:00:08.845 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:08 np0005466012 nova_compute[192063]: 2025-10-02 12:00:08.845 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:08 np0005466012 nova_compute[192063]: 2025-10-02 12:00:08.845 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:00:08 np0005466012 nova_compute[192063]: 2025-10-02 12:00:08.980 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:00:08 np0005466012 nova_compute[192063]: 2025-10-02 12:00:08.981 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6067MB free_disk=73.50265121459961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:00:08 np0005466012 nova_compute[192063]: 2025-10-02 12:00:08.981 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:08 np0005466012 nova_compute[192063]: 2025-10-02 12:00:08.981 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:09 np0005466012 nova_compute[192063]: 2025-10-02 12:00:09.209 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:00:09 np0005466012 nova_compute[192063]: 2025-10-02 12:00:09.209 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:00:09 np0005466012 nova_compute[192063]: 2025-10-02 12:00:09.333 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:00:09 np0005466012 nova_compute[192063]: 2025-10-02 12:00:09.361 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:00:09 np0005466012 nova_compute[192063]: 2025-10-02 12:00:09.362 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:00:09 np0005466012 nova_compute[192063]: 2025-10-02 12:00:09.514 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:00:09 np0005466012 nova_compute[192063]: 2025-10-02 12:00:09.545 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:00:09 np0005466012 nova_compute[192063]: 2025-10-02 12:00:09.567 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:00:09 np0005466012 nova_compute[192063]: 2025-10-02 12:00:09.633 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:00:09 np0005466012 nova_compute[192063]: 2025-10-02 12:00:09.635 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:00:09 np0005466012 nova_compute[192063]: 2025-10-02 12:00:09.636 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:10 np0005466012 nova_compute[192063]: 2025-10-02 12:00:10.635 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.103 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "04751313-a42e-4119-953b-08c932c35ae6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.103 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.120 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.274 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.275 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.296 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.296 2 INFO nova.compute.claims [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.494 2 DEBUG nova.compute.provider_tree [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.574 2 DEBUG nova.scheduler.client.report [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.641 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.641 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.877 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.877 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.905 2 INFO nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:00:11 np0005466012 nova_compute[192063]: 2025-10-02 12:00:11.928 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:00:12 np0005466012 nova_compute[192063]: 2025-10-02 12:00:12.135 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:00:12 np0005466012 nova_compute[192063]: 2025-10-02 12:00:12.136 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:00:12 np0005466012 nova_compute[192063]: 2025-10-02 12:00:12.137 2 INFO nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Creating image(s)#033[00m
Oct  2 08:00:12 np0005466012 nova_compute[192063]: 2025-10-02 12:00:12.138 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "/var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:12 np0005466012 nova_compute[192063]: 2025-10-02 12:00:12.138 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "/var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:12 np0005466012 nova_compute[192063]: 2025-10-02 12:00:12.139 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "/var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:12 np0005466012 nova_compute[192063]: 2025-10-02 12:00:12.139 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:12 np0005466012 nova_compute[192063]: 2025-10-02 12:00:12.140 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:14 np0005466012 podman[219519]: 2025-10-02 12:00:14.159561401 +0000 UTC m=+0.081611564 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:00:14 np0005466012 nova_compute[192063]: 2025-10-02 12:00:14.271 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Automatically allocating a network for project 23de7e9a877e477cb52ac4d4c1410e0d. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460#033[00m
Oct  2 08:00:14 np0005466012 nova_compute[192063]: 2025-10-02 12:00:14.829 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:14 np0005466012 nova_compute[192063]: 2025-10-02 12:00:14.898 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.part --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:14 np0005466012 nova_compute[192063]: 2025-10-02 12:00:14.901 2 DEBUG nova.virt.images [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] cf60d86d-f1d5-4be4-976e-7488dbdcf0b2 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 08:00:14 np0005466012 nova_compute[192063]: 2025-10-02 12:00:14.903 2 DEBUG nova.privsep.utils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:00:14 np0005466012 nova_compute[192063]: 2025-10-02 12:00:14.903 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.part /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:15 np0005466012 nova_compute[192063]: 2025-10-02 12:00:15.132 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.part /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.converted" returned: 0 in 0.229s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:15 np0005466012 nova_compute[192063]: 2025-10-02 12:00:15.138 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:15 np0005466012 nova_compute[192063]: 2025-10-02 12:00:15.188 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955.converted --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:15 np0005466012 nova_compute[192063]: 2025-10-02 12:00:15.189 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:15 np0005466012 nova_compute[192063]: 2025-10-02 12:00:15.201 2 INFO oslo.privsep.daemon [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpwhlfl2wr/privsep.sock']#033[00m
Oct  2 08:00:15 np0005466012 nova_compute[192063]: 2025-10-02 12:00:15.910 2 INFO oslo.privsep.daemon [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 08:00:15 np0005466012 nova_compute[192063]: 2025-10-02 12:00:15.778 56 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 08:00:15 np0005466012 nova_compute[192063]: 2025-10-02 12:00:15.781 56 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 08:00:15 np0005466012 nova_compute[192063]: 2025-10-02 12:00:15.783 56 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  2 08:00:15 np0005466012 nova_compute[192063]: 2025-10-02 12:00:15.783 56 INFO oslo.privsep.daemon [-] privsep daemon running as pid 56#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.015 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.064 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.066 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.067 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.081 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.135 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.136 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.166 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.168 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.169 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.221 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.223 2 DEBUG nova.virt.disk.api [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Checking if we can resize image /var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.223 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.290 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.292 2 DEBUG nova.virt.disk.api [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Cannot resize image /var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.293 2 DEBUG nova.objects.instance [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lazy-loading 'migration_context' on Instance uuid 04751313-a42e-4119-953b-08c932c35ae6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.318 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.318 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Ensure instance console log exists: /var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.319 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.320 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:16 np0005466012 nova_compute[192063]: 2025-10-02 12:00:16.320 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:17 np0005466012 podman[219582]: 2025-10-02 12:00:17.128598669 +0000 UTC m=+0.051578308 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:00:19 np0005466012 podman[219606]: 2025-10-02 12:00:19.168056783 +0000 UTC m=+0.079175909 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS)
Oct  2 08:00:23 np0005466012 podman[219628]: 2025-10-02 12:00:23.130370023 +0000 UTC m=+0.049289497 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:00:27 np0005466012 nova_compute[192063]: 2025-10-02 12:00:27.010 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Automatically allocated network: {'id': '0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'name': 'auto_allocated_network', 'tenant_id': '23de7e9a877e477cb52ac4d4c1410e0d', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['6a2058e4-dc89-48d3-88fc-bc95dba8da8b', 'd2e1858b-8344-4341-91f5-cb724ceffc0a'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-10-02T12:00:14Z', 'updated_at': '2025-10-02T12:00:26Z', 'revision_number': 4, 'project_id': '23de7e9a877e477cb52ac4d4c1410e0d'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478#033[00m
Oct  2 08:00:27 np0005466012 nova_compute[192063]: 2025-10-02 12:00:27.022 2 WARNING oslo_policy.policy [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  2 08:00:27 np0005466012 nova_compute[192063]: 2025-10-02 12:00:27.022 2 WARNING oslo_policy.policy [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  2 08:00:27 np0005466012 nova_compute[192063]: 2025-10-02 12:00:27.025 2 DEBUG nova.policy [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4e1cdf41d58b4774b94da988b9e8db73', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '23de7e9a877e477cb52ac4d4c1410e0d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:00:27 np0005466012 podman[219650]: 2025-10-02 12:00:27.162658967 +0000 UTC m=+0.085545029 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:00:28 np0005466012 nova_compute[192063]: 2025-10-02 12:00:28.517 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Successfully created port: 681b75fc-f419-495c-8cac-c7fccc8e2170 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:00:30 np0005466012 podman[219671]: 2025-10-02 12:00:30.140983421 +0000 UTC m=+0.059356825 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter)
Oct  2 08:00:31 np0005466012 nova_compute[192063]: 2025-10-02 12:00:31.004 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Successfully updated port: 681b75fc-f419-495c-8cac-c7fccc8e2170 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:00:31 np0005466012 nova_compute[192063]: 2025-10-02 12:00:31.033 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "refresh_cache-04751313-a42e-4119-953b-08c932c35ae6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:00:31 np0005466012 nova_compute[192063]: 2025-10-02 12:00:31.033 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquired lock "refresh_cache-04751313-a42e-4119-953b-08c932c35ae6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:00:31 np0005466012 nova_compute[192063]: 2025-10-02 12:00:31.033 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:00:31 np0005466012 nova_compute[192063]: 2025-10-02 12:00:31.655 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:00:31 np0005466012 nova_compute[192063]: 2025-10-02 12:00:31.718 2 DEBUG nova.compute.manager [req-0f08213a-fbab-4a5a-8c35-b43501be4c3e req-e31ab34a-4462-441b-926d-32c5139752a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Received event network-changed-681b75fc-f419-495c-8cac-c7fccc8e2170 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:00:31 np0005466012 nova_compute[192063]: 2025-10-02 12:00:31.719 2 DEBUG nova.compute.manager [req-0f08213a-fbab-4a5a-8c35-b43501be4c3e req-e31ab34a-4462-441b-926d-32c5139752a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Refreshing instance network info cache due to event network-changed-681b75fc-f419-495c-8cac-c7fccc8e2170. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:00:31 np0005466012 nova_compute[192063]: 2025-10-02 12:00:31.719 2 DEBUG oslo_concurrency.lockutils [req-0f08213a-fbab-4a5a-8c35-b43501be4c3e req-e31ab34a-4462-441b-926d-32c5139752a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-04751313-a42e-4119-953b-08c932c35ae6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:00:34 np0005466012 podman[219692]: 2025-10-02 12:00:34.189996039 +0000 UTC m=+0.094389943 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 08:00:34 np0005466012 podman[219693]: 2025-10-02 12:00:34.223044415 +0000 UTC m=+0.123007072 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.337 2 DEBUG nova.network.neutron [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Updating instance_info_cache with network_info: [{"id": "681b75fc-f419-495c-8cac-c7fccc8e2170", "address": "fa:16:3e:ce:2a:db", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1b8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap681b75fc-f4", "ovs_interfaceid": "681b75fc-f419-495c-8cac-c7fccc8e2170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.372 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Releasing lock "refresh_cache-04751313-a42e-4119-953b-08c932c35ae6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.372 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Instance network_info: |[{"id": "681b75fc-f419-495c-8cac-c7fccc8e2170", "address": "fa:16:3e:ce:2a:db", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1b8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap681b75fc-f4", "ovs_interfaceid": "681b75fc-f419-495c-8cac-c7fccc8e2170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.372 2 DEBUG oslo_concurrency.lockutils [req-0f08213a-fbab-4a5a-8c35-b43501be4c3e req-e31ab34a-4462-441b-926d-32c5139752a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-04751313-a42e-4119-953b-08c932c35ae6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.372 2 DEBUG nova.network.neutron [req-0f08213a-fbab-4a5a-8c35-b43501be4c3e req-e31ab34a-4462-441b-926d-32c5139752a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Refreshing network info cache for port 681b75fc-f419-495c-8cac-c7fccc8e2170 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.375 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Start _get_guest_xml network_info=[{"id": "681b75fc-f419-495c-8cac-c7fccc8e2170", "address": "fa:16:3e:ce:2a:db", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1b8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap681b75fc-f4", "ovs_interfaceid": "681b75fc-f419-495c-8cac-c7fccc8e2170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.379 2 WARNING nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.384 2 DEBUG nova.virt.libvirt.host [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.384 2 DEBUG nova.virt.libvirt.host [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.389 2 DEBUG nova.virt.libvirt.host [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.390 2 DEBUG nova.virt.libvirt.host [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.391 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.391 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.391 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.391 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.392 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.392 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.392 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.392 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.392 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.392 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.393 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.393 2 DEBUG nova.virt.hardware [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.396 2 DEBUG nova.privsep.utils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.397 2 DEBUG nova.virt.libvirt.vif [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-549213814-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-549213814-1',id=2,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='23de7e9a877e477cb52ac4d4c1410e0d',ramdisk_id='',reservation_id='r-2zs6ym1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-1436985778',owner_user_name='tempest-AutoAllocateNetworkTest-1436985778-project-member'},tags=Tag
List,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:00:11Z,user_data=None,user_id='4e1cdf41d58b4774b94da988b9e8db73',uuid=04751313-a42e-4119-953b-08c932c35ae6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "681b75fc-f419-495c-8cac-c7fccc8e2170", "address": "fa:16:3e:ce:2a:db", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1b8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap681b75fc-f4", "ovs_interfaceid": "681b75fc-f419-495c-8cac-c7fccc8e2170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.398 2 DEBUG nova.network.os_vif_util [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Converting VIF {"id": "681b75fc-f419-495c-8cac-c7fccc8e2170", "address": "fa:16:3e:ce:2a:db", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1b8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap681b75fc-f4", "ovs_interfaceid": "681b75fc-f419-495c-8cac-c7fccc8e2170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.398 2 DEBUG nova.network.os_vif_util [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:2a:db,bridge_name='br-int',has_traffic_filtering=True,id=681b75fc-f419-495c-8cac-c7fccc8e2170,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap681b75fc-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.400 2 DEBUG nova.objects.instance [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lazy-loading 'pci_devices' on Instance uuid 04751313-a42e-4119-953b-08c932c35ae6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.417 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  <uuid>04751313-a42e-4119-953b-08c932c35ae6</uuid>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  <name>instance-00000002</name>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <nova:name>tempest-tempest.common.compute-instance-549213814-1</nova:name>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:00:34</nova:creationTime>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:00:34 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:        <nova:user uuid="4e1cdf41d58b4774b94da988b9e8db73">tempest-AutoAllocateNetworkTest-1436985778-project-member</nova:user>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:        <nova:project uuid="23de7e9a877e477cb52ac4d4c1410e0d">tempest-AutoAllocateNetworkTest-1436985778</nova:project>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:        <nova:port uuid="681b75fc-f419-495c-8cac-c7fccc8e2170">
Oct  2 08:00:34 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.1.0.18" ipVersion="4"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="fdfe:381f:8400::1b8" ipVersion="6"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <entry name="serial">04751313-a42e-4119-953b-08c932c35ae6</entry>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <entry name="uuid">04751313-a42e-4119-953b-08c932c35ae6</entry>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk.config"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:ce:2a:db"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <target dev="tap681b75fc-f4"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/console.log" append="off"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:00:34 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:00:34 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:00:34 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:00:34 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.419 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Preparing to wait for external event network-vif-plugged-681b75fc-f419-495c-8cac-c7fccc8e2170 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.419 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "04751313-a42e-4119-953b-08c932c35ae6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.419 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.419 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.420 2 DEBUG nova.virt.libvirt.vif [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-549213814-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-549213814-1',id=2,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='23de7e9a877e477cb52ac4d4c1410e0d',ramdisk_id='',reservation_id='r-2zs6ym1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-1436985778',owner_user_name='tempest-AutoAllocateNetworkTest-1436985778-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:00:11Z,user_data=None,user_id='4e1cdf41d58b4774b94da988b9e8db73',uuid=04751313-a42e-4119-953b-08c932c35ae6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "681b75fc-f419-495c-8cac-c7fccc8e2170", "address": "fa:16:3e:ce:2a:db", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1b8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap681b75fc-f4", "ovs_interfaceid": "681b75fc-f419-495c-8cac-c7fccc8e2170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.420 2 DEBUG nova.network.os_vif_util [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Converting VIF {"id": "681b75fc-f419-495c-8cac-c7fccc8e2170", "address": "fa:16:3e:ce:2a:db", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1b8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap681b75fc-f4", "ovs_interfaceid": "681b75fc-f419-495c-8cac-c7fccc8e2170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.420 2 DEBUG nova.network.os_vif_util [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:2a:db,bridge_name='br-int',has_traffic_filtering=True,id=681b75fc-f419-495c-8cac-c7fccc8e2170,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap681b75fc-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.421 2 DEBUG os_vif [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:2a:db,bridge_name='br-int',has_traffic_filtering=True,id=681b75fc-f419-495c-8cac-c7fccc8e2170,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap681b75fc-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.450 2 DEBUG ovsdbapp.backend.ovs_idl [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.451 2 DEBUG ovsdbapp.backend.ovs_idl [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.451 2 DEBUG ovsdbapp.backend.ovs_idl [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.465 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.465 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:00:34 np0005466012 nova_compute[192063]: 2025-10-02 12:00:34.466 2 INFO oslo.privsep.daemon [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpg_sygack/privsep.sock']#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.109 2 INFO oslo.privsep.daemon [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.005 77 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.011 77 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.015 77 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.015 77 INFO oslo.privsep.daemon [-] privsep daemon running as pid 77#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.410 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap681b75fc-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.411 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap681b75fc-f4, col_values=(('external_ids', {'iface-id': '681b75fc-f419-495c-8cac-c7fccc8e2170', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:2a:db', 'vm-uuid': '04751313-a42e-4119-953b-08c932c35ae6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:35 np0005466012 NetworkManager[51207]: <info>  [1759406435.4154] manager: (tap681b75fc-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.425 2 INFO os_vif [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:2a:db,bridge_name='br-int',has_traffic_filtering=True,id=681b75fc-f419-495c-8cac-c7fccc8e2170,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap681b75fc-f4')#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.476 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.477 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.477 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] No VIF found with MAC fa:16:3e:ce:2a:db, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:00:35 np0005466012 nova_compute[192063]: 2025-10-02 12:00:35.478 2 INFO nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Using config drive#033[00m
Oct  2 08:00:36 np0005466012 nova_compute[192063]: 2025-10-02 12:00:36.457 2 INFO nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Creating config drive at /var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk.config#033[00m
Oct  2 08:00:36 np0005466012 nova_compute[192063]: 2025-10-02 12:00:36.461 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf3_6lozo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:36 np0005466012 nova_compute[192063]: 2025-10-02 12:00:36.587 2 DEBUG oslo_concurrency.processutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf3_6lozo" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:36 np0005466012 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct  2 08:00:36 np0005466012 kernel: tap681b75fc-f4: entered promiscuous mode
Oct  2 08:00:36 np0005466012 NetworkManager[51207]: <info>  [1759406436.6865] manager: (tap681b75fc-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Oct  2 08:00:36 np0005466012 nova_compute[192063]: 2025-10-02 12:00:36.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:00:36Z|00027|binding|INFO|Claiming lport 681b75fc-f419-495c-8cac-c7fccc8e2170 for this chassis.
Oct  2 08:00:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:00:36Z|00028|binding|INFO|681b75fc-f419-495c-8cac-c7fccc8e2170: Claiming fa:16:3e:ce:2a:db 10.1.0.18 fdfe:381f:8400::1b8
Oct  2 08:00:36 np0005466012 nova_compute[192063]: 2025-10-02 12:00:36.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:36 np0005466012 systemd-udevd[219766]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:00:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:36.717 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:2a:db 10.1.0.18 fdfe:381f:8400::1b8'], port_security=['fa:16:3e:ce:2a:db 10.1.0.18 fdfe:381f:8400::1b8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.18/26 fdfe:381f:8400::1b8/64', 'neutron:device_id': '04751313-a42e-4119-953b-08c932c35ae6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23de7e9a877e477cb52ac4d4c1410e0d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6166ab66-e763-4e6f-ba6d-1725486f45f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cab3463-7636-46ad-b75d-f72d7d1739eb, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=681b75fc-f419-495c-8cac-c7fccc8e2170) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:00:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:36.719 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 681b75fc-f419-495c-8cac-c7fccc8e2170 in datapath 0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 bound to our chassis#033[00m
Oct  2 08:00:36 np0005466012 NetworkManager[51207]: <info>  [1759406436.7266] device (tap681b75fc-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:00:36 np0005466012 NetworkManager[51207]: <info>  [1759406436.7279] device (tap681b75fc-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:00:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:36.723 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6#033[00m
Oct  2 08:00:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:36.725 103246 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpld7pxkvg/privsep.sock']#033[00m
Oct  2 08:00:36 np0005466012 systemd-machined[152114]: New machine qemu-1-instance-00000002.
Oct  2 08:00:36 np0005466012 nova_compute[192063]: 2025-10-02 12:00:36.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:36 np0005466012 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Oct  2 08:00:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:00:36Z|00029|binding|INFO|Setting lport 681b75fc-f419-495c-8cac-c7fccc8e2170 ovn-installed in OVS
Oct  2 08:00:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:00:36Z|00030|binding|INFO|Setting lport 681b75fc-f419-495c-8cac-c7fccc8e2170 up in Southbound
Oct  2 08:00:36 np0005466012 nova_compute[192063]: 2025-10-02 12:00:36.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.351 2 DEBUG nova.compute.manager [req-9d224de1-66ce-4a35-b50b-a482bd844d4c req-7f206070-9126-470a-8f87-86c4d92b51d9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Received event network-vif-plugged-681b75fc-f419-495c-8cac-c7fccc8e2170 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.351 2 DEBUG oslo_concurrency.lockutils [req-9d224de1-66ce-4a35-b50b-a482bd844d4c req-7f206070-9126-470a-8f87-86c4d92b51d9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "04751313-a42e-4119-953b-08c932c35ae6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.352 2 DEBUG oslo_concurrency.lockutils [req-9d224de1-66ce-4a35-b50b-a482bd844d4c req-7f206070-9126-470a-8f87-86c4d92b51d9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.352 2 DEBUG oslo_concurrency.lockutils [req-9d224de1-66ce-4a35-b50b-a482bd844d4c req-7f206070-9126-470a-8f87-86c4d92b51d9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.353 2 DEBUG nova.compute.manager [req-9d224de1-66ce-4a35-b50b-a482bd844d4c req-7f206070-9126-470a-8f87-86c4d92b51d9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Processing event network-vif-plugged-681b75fc-f419-495c-8cac-c7fccc8e2170 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:00:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:37.430 103246 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 08:00:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:37.431 103246 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpld7pxkvg/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  2 08:00:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:37.324 219792 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 08:00:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:37.328 219792 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 08:00:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:37.330 219792 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Oct  2 08:00:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:37.330 219792 INFO oslo.privsep.daemon [-] privsep daemon running as pid 219792#033[00m
Oct  2 08:00:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:37.435 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a7b6d8b1-7d15-48f3-b375-664eef641161]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.554 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.555 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406437.5538335, 04751313-a42e-4119-953b-08c932c35ae6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.556 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 04751313-a42e-4119-953b-08c932c35ae6] VM Started (Lifecycle Event)#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.559 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.569 2 INFO nova.virt.libvirt.driver [-] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Instance spawned successfully.#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.570 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.584 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.589 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.601 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.602 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.603 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.603 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.604 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.605 2 DEBUG nova.virt.libvirt.driver [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.635 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 04751313-a42e-4119-953b-08c932c35ae6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.636 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406437.554977, 04751313-a42e-4119-953b-08c932c35ae6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.637 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 04751313-a42e-4119-953b-08c932c35ae6] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.677 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.682 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406437.5676737, 04751313-a42e-4119-953b-08c932c35ae6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.682 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 04751313-a42e-4119-953b-08c932c35ae6] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.711 2 INFO nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Took 25.58 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.712 2 DEBUG nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.715 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.723 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.825 2 INFO nova.compute.manager [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Took 26.64 seconds to build instance.#033[00m
Oct  2 08:00:37 np0005466012 nova_compute[192063]: 2025-10-02 12:00:37.854 2 DEBUG oslo_concurrency.lockutils [None req-35f6bb1e-0275-4338-8408-4659dad7719f 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:37.920 219792 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:37.920 219792 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:37.920 219792 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:38 np0005466012 nova_compute[192063]: 2025-10-02 12:00:38.100 2 DEBUG nova.network.neutron [req-0f08213a-fbab-4a5a-8c35-b43501be4c3e req-e31ab34a-4462-441b-926d-32c5139752a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Updated VIF entry in instance network info cache for port 681b75fc-f419-495c-8cac-c7fccc8e2170. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:00:38 np0005466012 nova_compute[192063]: 2025-10-02 12:00:38.100 2 DEBUG nova.network.neutron [req-0f08213a-fbab-4a5a-8c35-b43501be4c3e req-e31ab34a-4462-441b-926d-32c5139752a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Updating instance_info_cache with network_info: [{"id": "681b75fc-f419-495c-8cac-c7fccc8e2170", "address": "fa:16:3e:ce:2a:db", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1b8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap681b75fc-f4", "ovs_interfaceid": "681b75fc-f419-495c-8cac-c7fccc8e2170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:00:38 np0005466012 nova_compute[192063]: 2025-10-02 12:00:38.124 2 DEBUG oslo_concurrency.lockutils [req-0f08213a-fbab-4a5a-8c35-b43501be4c3e req-e31ab34a-4462-441b-926d-32c5139752a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-04751313-a42e-4119-953b-08c932c35ae6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:00:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:38.470 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3fc9ea5a-07ba-4454-89c4-c485cb2b022d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:38.471 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0e6cbdbf-b1 in ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:00:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:38.473 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0e6cbdbf-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:00:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:38.473 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ab5451e8-5393-4a7a-a657-f724eecf286d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:38.475 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[18601ecc-1cb5-4eed-8be7-fb7beadceb95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:38.504 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[e4048717-c64d-4186-a8a2-40c435c6f98f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:38.530 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dc4341b1-2b82-4aa0-8223-920e8ad9971a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:38.532 103246 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmprjyajhjw/privsep.sock']#033[00m
Oct  2 08:00:39 np0005466012 nova_compute[192063]: 2025-10-02 12:00:39.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:39.199 103246 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  2 08:00:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:39.199 103246 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmprjyajhjw/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  2 08:00:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:39.082 219806 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  2 08:00:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:39.089 219806 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  2 08:00:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:39.093 219806 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  2 08:00:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:39.094 219806 INFO oslo.privsep.daemon [-] privsep daemon running as pid 219806#033[00m
Oct  2 08:00:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:39.202 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[39814979-fcf3-4956-a428-fa1ada8a5f63]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:39.664 219806 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:39.664 219806 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:39.664 219806 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:39 np0005466012 nova_compute[192063]: 2025-10-02 12:00:39.730 2 DEBUG nova.compute.manager [req-c310fefa-f147-496f-a9a6-fe6116e4fa8d req-dc6f900b-f5f4-4c2e-a9f2-4fc437f892a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Received event network-vif-plugged-681b75fc-f419-495c-8cac-c7fccc8e2170 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:00:39 np0005466012 nova_compute[192063]: 2025-10-02 12:00:39.730 2 DEBUG oslo_concurrency.lockutils [req-c310fefa-f147-496f-a9a6-fe6116e4fa8d req-dc6f900b-f5f4-4c2e-a9f2-4fc437f892a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "04751313-a42e-4119-953b-08c932c35ae6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:39 np0005466012 nova_compute[192063]: 2025-10-02 12:00:39.731 2 DEBUG oslo_concurrency.lockutils [req-c310fefa-f147-496f-a9a6-fe6116e4fa8d req-dc6f900b-f5f4-4c2e-a9f2-4fc437f892a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:39 np0005466012 nova_compute[192063]: 2025-10-02 12:00:39.731 2 DEBUG oslo_concurrency.lockutils [req-c310fefa-f147-496f-a9a6-fe6116e4fa8d req-dc6f900b-f5f4-4c2e-a9f2-4fc437f892a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:39 np0005466012 nova_compute[192063]: 2025-10-02 12:00:39.731 2 DEBUG nova.compute.manager [req-c310fefa-f147-496f-a9a6-fe6116e4fa8d req-dc6f900b-f5f4-4c2e-a9f2-4fc437f892a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] No waiting events found dispatching network-vif-plugged-681b75fc-f419-495c-8cac-c7fccc8e2170 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:00:39 np0005466012 nova_compute[192063]: 2025-10-02 12:00:39.731 2 WARNING nova.compute.manager [req-c310fefa-f147-496f-a9a6-fe6116e4fa8d req-dc6f900b-f5f4-4c2e-a9f2-4fc437f892a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Received unexpected event network-vif-plugged-681b75fc-f419-495c-8cac-c7fccc8e2170 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.223 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[6e14a276-3e2e-4290-a7a0-759742f87aa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:40 np0005466012 NetworkManager[51207]: <info>  [1759406440.2306] manager: (tap0e6cbdbf-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.229 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[270db901-b5d3-4f97-9b47-423847c23a1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:40 np0005466012 systemd-udevd[219815]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.277 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[8e2a667c-4ba3-4c23-be3b-c14bf83cf234]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.280 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[87568b73-536a-4951-99f2-c73f36d68663]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:40 np0005466012 NetworkManager[51207]: <info>  [1759406440.2986] device (tap0e6cbdbf-b0): carrier: link connected
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.302 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[93a27fbf-e9dc-4832-b1d5-8f895caa3d51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.319 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[262d6935-e138-41d7-aaa9-e3ed6f2e45d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0e6cbdbf-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:05:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443391, 'reachable_time': 27956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219835, 'error': None, 'target': 'ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.336 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[462bdd6c-08b4-41b4-a905-71a01febf2ed]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe44:520'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443391, 'tstamp': 443391}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219836, 'error': None, 'target': 'ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.349 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c403fe9f-2450-4dc3-9ee6-ac7b26d9fb92]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0e6cbdbf-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:05:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443391, 'reachable_time': 27956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219837, 'error': None, 'target': 'ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.375 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c2d5fdf7-87b4-4644-9c77-46183d722c0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:40 np0005466012 nova_compute[192063]: 2025-10-02 12:00:40.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.424 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ba333405-5139-4c0f-85e0-28cf1b837c9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.425 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e6cbdbf-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.425 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.426 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e6cbdbf-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:40 np0005466012 NetworkManager[51207]: <info>  [1759406440.4281] manager: (tap0e6cbdbf-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Oct  2 08:00:40 np0005466012 nova_compute[192063]: 2025-10-02 12:00:40.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:40 np0005466012 kernel: tap0e6cbdbf-b0: entered promiscuous mode
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.430 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0e6cbdbf-b0, col_values=(('external_ids', {'iface-id': '0e5a8941-b399-4368-aa52-d99cb4bfefe5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:40 np0005466012 ovn_controller[94284]: 2025-10-02T12:00:40Z|00031|binding|INFO|Releasing lport 0e5a8941-b399-4368-aa52-d99cb4bfefe5 from this chassis (sb_readonly=0)
Oct  2 08:00:40 np0005466012 nova_compute[192063]: 2025-10-02 12:00:40.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:40 np0005466012 nova_compute[192063]: 2025-10-02 12:00:40.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.445 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.446 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a94e3922-8d20-461f-b823-a7cf3da88d61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.447 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6.pid.haproxy
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:00:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:40.449 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'env', 'PROCESS_TAG=haproxy-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:00:40 np0005466012 podman[219870]: 2025-10-02 12:00:40.806277124 +0000 UTC m=+0.053884559 container create 807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:00:40 np0005466012 systemd[1]: Started libpod-conmon-807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50.scope.
Oct  2 08:00:40 np0005466012 podman[219870]: 2025-10-02 12:00:40.775837437 +0000 UTC m=+0.023444892 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:00:40 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:00:40 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/951d3fe3f411acfd7c74d5ad3f3db8433552a889820bfc4c00cda5faf6a1d647/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:00:40 np0005466012 podman[219870]: 2025-10-02 12:00:40.911403272 +0000 UTC m=+0.159010727 container init 807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:00:40 np0005466012 podman[219870]: 2025-10-02 12:00:40.917660807 +0000 UTC m=+0.165268252 container start 807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 08:00:40 np0005466012 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[219885]: [NOTICE]   (219889) : New worker (219891) forked
Oct  2 08:00:40 np0005466012 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[219885]: [NOTICE]   (219889) : Loading success.
Oct  2 08:00:44 np0005466012 nova_compute[192063]: 2025-10-02 12:00:44.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:44 np0005466012 nova_compute[192063]: 2025-10-02 12:00:44.488 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:44 np0005466012 nova_compute[192063]: 2025-10-02 12:00:44.488 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:44 np0005466012 nova_compute[192063]: 2025-10-02 12:00:44.510 2 DEBUG nova.compute.manager [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:00:44 np0005466012 nova_compute[192063]: 2025-10-02 12:00:44.716 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:44 np0005466012 nova_compute[192063]: 2025-10-02 12:00:44.717 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:44 np0005466012 nova_compute[192063]: 2025-10-02 12:00:44.724 2 DEBUG nova.virt.hardware [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:00:44 np0005466012 nova_compute[192063]: 2025-10-02 12:00:44.724 2 INFO nova.compute.claims [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.001 2 DEBUG nova.compute.provider_tree [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.084 2 ERROR nova.scheduler.client.report [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [req-ad2fdbc0-228d-4830-8987-17990163bbd6] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID ddb6f967-9a8a-4554-9b44-b99536054f9c.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-ad2fdbc0-228d-4830-8987-17990163bbd6"}]}#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.107 2 DEBUG nova.scheduler.client.report [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.157 2 DEBUG nova.scheduler.client.report [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.158 2 DEBUG nova.compute.provider_tree [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:00:45 np0005466012 podman[219900]: 2025-10-02 12:00:45.168073194 +0000 UTC m=+0.079081568 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.172 2 DEBUG nova.scheduler.client.report [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.208 2 DEBUG nova.scheduler.client.report [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.295 2 DEBUG nova.compute.provider_tree [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.376 2 DEBUG nova.scheduler.client.report [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Updated inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.376 2 DEBUG nova.compute.provider_tree [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Updating resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.376 2 DEBUG nova.compute.provider_tree [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.437 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.720s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.438 2 DEBUG nova.compute.manager [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.631 2 DEBUG nova.compute.manager [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.631 2 DEBUG nova.network.neutron [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.665 2 INFO nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.703 2 DEBUG nova.compute.manager [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.915 2 DEBUG nova.compute.manager [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.916 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.917 2 INFO nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Creating image(s)#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.918 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.918 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.920 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:45 np0005466012 nova_compute[192063]: 2025-10-02 12:00:45.950 2 DEBUG oslo_concurrency.processutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.009 2 DEBUG oslo_concurrency.processutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.010 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.011 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.024 2 DEBUG oslo_concurrency.processutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.040 2 DEBUG nova.network.neutron [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.041 2 DEBUG nova.compute.manager [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.084 2 DEBUG oslo_concurrency.processutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.085 2 DEBUG oslo_concurrency.processutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.120 2 DEBUG oslo_concurrency.processutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.121 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.121 2 DEBUG oslo_concurrency.processutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.176 2 DEBUG oslo_concurrency.processutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.177 2 DEBUG nova.virt.disk.api [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Checking if we can resize image /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.178 2 DEBUG oslo_concurrency.processutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.235 2 DEBUG oslo_concurrency.processutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.237 2 DEBUG nova.virt.disk.api [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Cannot resize image /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.237 2 DEBUG nova.objects.instance [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lazy-loading 'migration_context' on Instance uuid 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.266 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.266 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Ensure instance console log exists: /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.267 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.267 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.267 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.268 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.272 2 WARNING nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.294 2 DEBUG nova.virt.libvirt.host [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.295 2 DEBUG nova.virt.libvirt.host [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.299 2 DEBUG nova.virt.libvirt.host [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.299 2 DEBUG nova.virt.libvirt.host [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.301 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.302 2 DEBUG nova.virt.hardware [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.302 2 DEBUG nova.virt.hardware [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.303 2 DEBUG nova.virt.hardware [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.303 2 DEBUG nova.virt.hardware [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.303 2 DEBUG nova.virt.hardware [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.304 2 DEBUG nova.virt.hardware [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.304 2 DEBUG nova.virt.hardware [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.305 2 DEBUG nova.virt.hardware [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.305 2 DEBUG nova.virt.hardware [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.306 2 DEBUG nova.virt.hardware [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.306 2 DEBUG nova.virt.hardware [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.312 2 DEBUG nova.objects.instance [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.333 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  <uuid>0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7</uuid>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  <name>instance-00000006</name>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <nova:name>tempest-MigrationsAdminTest-server-976277975</nova:name>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:00:46</nova:creationTime>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:00:46 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:        <nova:user uuid="8da35688aa864e189f10b334a21bc6c4">tempest-MigrationsAdminTest-1651504538-project-member</nova:user>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:        <nova:project uuid="4dcc6c51db2640cbb04083b3336de813">tempest-MigrationsAdminTest-1651504538</nova:project>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <nova:ports/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <entry name="serial">0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7</entry>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <entry name="uuid">0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7</entry>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.config"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/console.log" append="off"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:00:46 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:00:46 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:00:46 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:00:46 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.504 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.505 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.505 2 INFO nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Using config drive#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.700 2 INFO nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Creating config drive at /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.config#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.704 2 DEBUG oslo_concurrency.processutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9v5tuwmo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:46 np0005466012 nova_compute[192063]: 2025-10-02 12:00:46.826 2 DEBUG oslo_concurrency.processutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9v5tuwmo" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:46 np0005466012 systemd-machined[152114]: New machine qemu-2-instance-00000006.
Oct  2 08:00:46 np0005466012 systemd[1]: Started Virtual Machine qemu-2-instance-00000006.
Oct  2 08:00:47 np0005466012 podman[219969]: 2025-10-02 12:00:47.589431372 +0000 UTC m=+0.087928471 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:00:47 np0005466012 nova_compute[192063]: 2025-10-02 12:00:47.956 2 DEBUG nova.compute.manager [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:00:47 np0005466012 nova_compute[192063]: 2025-10-02 12:00:47.957 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:00:47 np0005466012 nova_compute[192063]: 2025-10-02 12:00:47.958 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406447.9555666, 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:00:47 np0005466012 nova_compute[192063]: 2025-10-02 12:00:47.958 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:00:47 np0005466012 nova_compute[192063]: 2025-10-02 12:00:47.967 2 INFO nova.virt.libvirt.driver [-] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance spawned successfully.#033[00m
Oct  2 08:00:47 np0005466012 nova_compute[192063]: 2025-10-02 12:00:47.968 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.002 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.013 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.019 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.019 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.020 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.021 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.022 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.023 2 DEBUG nova.virt.libvirt.driver [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.033 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.033 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406447.9558334, 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.034 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] VM Started (Lifecycle Event)#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.067 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.070 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.100 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.132 2 INFO nova.compute.manager [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Took 2.22 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.132 2 DEBUG nova.compute.manager [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.225 2 INFO nova.compute.manager [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Took 3.62 seconds to build instance.#033[00m
Oct  2 08:00:48 np0005466012 nova_compute[192063]: 2025-10-02 12:00:48.255 2 DEBUG oslo_concurrency.lockutils [None req-f39dec09-f44c-42d8-b1e7-159b42b2957b 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.767s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:49 np0005466012 nova_compute[192063]: 2025-10-02 12:00:49.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:49 np0005466012 ovn_controller[94284]: 2025-10-02T12:00:49Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:2a:db 10.1.0.18
Oct  2 08:00:49 np0005466012 ovn_controller[94284]: 2025-10-02T12:00:49Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:2a:db 10.1.0.18
Oct  2 08:00:50 np0005466012 podman[220013]: 2025-10-02 12:00:50.139616107 +0000 UTC m=+0.056136489 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:00:50 np0005466012 nova_compute[192063]: 2025-10-02 12:00:50.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.320 2 DEBUG oslo_concurrency.lockutils [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "04751313-a42e-4119-953b-08c932c35ae6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.321 2 DEBUG oslo_concurrency.lockutils [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.321 2 DEBUG oslo_concurrency.lockutils [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "04751313-a42e-4119-953b-08c932c35ae6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.321 2 DEBUG oslo_concurrency.lockutils [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.321 2 DEBUG oslo_concurrency.lockutils [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.343 2 INFO nova.compute.manager [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Terminating instance#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.362 2 DEBUG nova.compute.manager [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:00:51 np0005466012 kernel: tap681b75fc-f4 (unregistering): left promiscuous mode
Oct  2 08:00:51 np0005466012 NetworkManager[51207]: <info>  [1759406451.3931] device (tap681b75fc-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:00:51Z|00032|binding|INFO|Releasing lport 681b75fc-f419-495c-8cac-c7fccc8e2170 from this chassis (sb_readonly=0)
Oct  2 08:00:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:00:51Z|00033|binding|INFO|Setting lport 681b75fc-f419-495c-8cac-c7fccc8e2170 down in Southbound
Oct  2 08:00:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:00:51Z|00034|binding|INFO|Removing iface tap681b75fc-f4 ovn-installed in OVS
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.447 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:2a:db 10.1.0.18 fdfe:381f:8400::1b8'], port_security=['fa:16:3e:ce:2a:db 10.1.0.18 fdfe:381f:8400::1b8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.18/26 fdfe:381f:8400::1b8/64', 'neutron:device_id': '04751313-a42e-4119-953b-08c932c35ae6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23de7e9a877e477cb52ac4d4c1410e0d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6166ab66-e763-4e6f-ba6d-1725486f45f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cab3463-7636-46ad-b75d-f72d7d1739eb, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=681b75fc-f419-495c-8cac-c7fccc8e2170) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.449 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 681b75fc-f419-495c-8cac-c7fccc8e2170 in datapath 0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 unbound from our chassis#033[00m
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.451 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.451 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d332171b-2fe0-43ff-a8b9-88ac64100235]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.452 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 namespace which is not needed anymore#033[00m
Oct  2 08:00:51 np0005466012 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Oct  2 08:00:51 np0005466012 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 13.018s CPU time.
Oct  2 08:00:51 np0005466012 systemd-machined[152114]: Machine qemu-1-instance-00000002 terminated.
Oct  2 08:00:51 np0005466012 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[219885]: [NOTICE]   (219889) : haproxy version is 2.8.14-c23fe91
Oct  2 08:00:51 np0005466012 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[219885]: [NOTICE]   (219889) : path to executable is /usr/sbin/haproxy
Oct  2 08:00:51 np0005466012 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[219885]: [WARNING]  (219889) : Exiting Master process...
Oct  2 08:00:51 np0005466012 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[219885]: [WARNING]  (219889) : Exiting Master process...
Oct  2 08:00:51 np0005466012 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[219885]: [ALERT]    (219889) : Current worker (219891) exited with code 143 (Terminated)
Oct  2 08:00:51 np0005466012 neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6[219885]: [WARNING]  (219889) : All workers exited. Exiting... (0)
Oct  2 08:00:51 np0005466012 systemd[1]: libpod-807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50.scope: Deactivated successfully.
Oct  2 08:00:51 np0005466012 podman[220054]: 2025-10-02 12:00:51.588580248 +0000 UTC m=+0.052616956 container died 807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.616 2 INFO nova.virt.libvirt.driver [-] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Instance destroyed successfully.#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.617 2 DEBUG nova.objects.instance [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lazy-loading 'resources' on Instance uuid 04751313-a42e-4119-953b-08c932c35ae6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:51 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50-userdata-shm.mount: Deactivated successfully.
Oct  2 08:00:51 np0005466012 systemd[1]: var-lib-containers-storage-overlay-951d3fe3f411acfd7c74d5ad3f3db8433552a889820bfc4c00cda5faf6a1d647-merged.mount: Deactivated successfully.
Oct  2 08:00:51 np0005466012 podman[220054]: 2025-10-02 12:00:51.646515555 +0000 UTC m=+0.110552253 container cleanup 807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.650 2 DEBUG nova.virt.libvirt.vif [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-549213814-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-549213814-1',id=2,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:00:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='23de7e9a877e477cb52ac4d4c1410e0d',ramdisk_id='',reservation_id='r-2zs6ym1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AutoAllocateNetworkTest-1436985778',owner_user_name='tempest-AutoAllocateNetworkTest-1436985778-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:00:37Z,user_data=None,user_id='4e1cdf41d58b4774b94da988b9e8db73',uuid=04751313-a42e-4119-953b-08c932c35ae6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "681b75fc-f419-495c-8cac-c7fccc8e2170", "address": "fa:16:3e:ce:2a:db", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1b8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap681b75fc-f4", "ovs_interfaceid": "681b75fc-f419-495c-8cac-c7fccc8e2170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.650 2 DEBUG nova.network.os_vif_util [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Converting VIF {"id": "681b75fc-f419-495c-8cac-c7fccc8e2170", "address": "fa:16:3e:ce:2a:db", "network": {"id": "0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1b8", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "23de7e9a877e477cb52ac4d4c1410e0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap681b75fc-f4", "ovs_interfaceid": "681b75fc-f419-495c-8cac-c7fccc8e2170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.651 2 DEBUG nova.network.os_vif_util [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:2a:db,bridge_name='br-int',has_traffic_filtering=True,id=681b75fc-f419-495c-8cac-c7fccc8e2170,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap681b75fc-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.652 2 DEBUG os_vif [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:2a:db,bridge_name='br-int',has_traffic_filtering=True,id=681b75fc-f419-495c-8cac-c7fccc8e2170,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap681b75fc-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:00:51 np0005466012 systemd[1]: libpod-conmon-807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50.scope: Deactivated successfully.
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.655 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap681b75fc-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.661 2 INFO os_vif [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:2a:db,bridge_name='br-int',has_traffic_filtering=True,id=681b75fc-f419-495c-8cac-c7fccc8e2170,network=Network(0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap681b75fc-f4')#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.662 2 INFO nova.virt.libvirt.driver [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Deleting instance files /var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6_del#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.662 2 INFO nova.virt.libvirt.driver [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Deletion of /var/lib/nova/instances/04751313-a42e-4119-953b-08c932c35ae6_del complete#033[00m
Oct  2 08:00:51 np0005466012 podman[220102]: 2025-10-02 12:00:51.713021127 +0000 UTC m=+0.045417635 container remove 807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.718 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a8999533-0853-4606-921c-4faf33244d0b]: (4, ('Thu Oct  2 12:00:51 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 (807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50)\n807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50\nThu Oct  2 12:00:51 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 (807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50)\n807f3c611a60fa4a065470de21e28f6ee6dc1564e7150fadac9b4b4b1075da50\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.719 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[59b656b8-cec9-4ca7-bf25-f77993bc7453]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.720 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e6cbdbf-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:51 np0005466012 kernel: tap0e6cbdbf-b0: left promiscuous mode
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.735 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dfdd75c9-4c17-4b18-9f93-8cb741ecb3d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.758 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c82eda-d133-4a5e-8b7a-26b6febcaa87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.759 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c2db68d8-13e9-479e-8b50-93177d9e4e19]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.772 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aab06eaa-1b80-44b6-ae34-605a4f76dcf6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443382, 'reachable_time': 20869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220115, 'error': None, 'target': 'ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:51 np0005466012 systemd[1]: run-netns-ovnmeta\x2d0e6cbdbf\x2db727\x2d48dc\x2d82d1\x2df7af5e6b3fc6.mount: Deactivated successfully.
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.782 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0e6cbdbf-b727-48dc-82d1-f7af5e6b3fc6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:00:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:51.782 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[cda318b4-5579-4027-9310-e33d7e93b8ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.805 2 DEBUG nova.virt.libvirt.host [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.806 2 INFO nova.virt.libvirt.host [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] UEFI support detected#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.807 2 INFO nova.compute.manager [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.808 2 DEBUG oslo.service.loopingcall [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.808 2 DEBUG nova.compute.manager [-] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:00:51 np0005466012 nova_compute[192063]: 2025-10-02 12:00:51.808 2 DEBUG nova.network.neutron [-] [instance: 04751313-a42e-4119-953b-08c932c35ae6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:00:52 np0005466012 nova_compute[192063]: 2025-10-02 12:00:52.322 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Acquiring lock "refresh_cache-0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:00:52 np0005466012 nova_compute[192063]: 2025-10-02 12:00:52.323 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Acquired lock "refresh_cache-0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:00:52 np0005466012 nova_compute[192063]: 2025-10-02 12:00:52.323 2 DEBUG nova.network.neutron [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:00:53 np0005466012 nova_compute[192063]: 2025-10-02 12:00:53.144 2 DEBUG nova.network.neutron [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:54 np0005466012 podman[220117]: 2025-10-02 12:00:54.161526116 +0000 UTC m=+0.080109454 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.368 2 DEBUG nova.network.neutron [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.381 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Releasing lock "refresh_cache-0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:54.412 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:00:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:54.413 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.471 2 DEBUG nova.network.neutron [-] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.502 2 INFO nova.compute.manager [-] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Took 2.69 seconds to deallocate network for instance.#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.577 2 DEBUG nova.compute.manager [req-f3341ae4-3d1e-41f8-9fa6-ce75e1e758ee req-26b8b823-5b8f-441c-ba45-e8c609cda3fb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Received event network-vif-deleted-681b75fc-f419-495c-8cac-c7fccc8e2170 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.589 2 DEBUG nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.590 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Creating file /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/fb5757a14a7448e4b35a11d6424fd4ae.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.590 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/fb5757a14a7448e4b35a11d6424fd4ae.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.608 2 DEBUG nova.compute.manager [req-40f896c9-b89a-40fe-97b7-b237c842a422 req-4b718a0d-531f-4b8a-be28-ecaad658d18b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Received event network-vif-unplugged-681b75fc-f419-495c-8cac-c7fccc8e2170 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.609 2 DEBUG oslo_concurrency.lockutils [req-40f896c9-b89a-40fe-97b7-b237c842a422 req-4b718a0d-531f-4b8a-be28-ecaad658d18b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "04751313-a42e-4119-953b-08c932c35ae6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.609 2 DEBUG oslo_concurrency.lockutils [req-40f896c9-b89a-40fe-97b7-b237c842a422 req-4b718a0d-531f-4b8a-be28-ecaad658d18b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.609 2 DEBUG oslo_concurrency.lockutils [req-40f896c9-b89a-40fe-97b7-b237c842a422 req-4b718a0d-531f-4b8a-be28-ecaad658d18b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.609 2 DEBUG nova.compute.manager [req-40f896c9-b89a-40fe-97b7-b237c842a422 req-4b718a0d-531f-4b8a-be28-ecaad658d18b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] No waiting events found dispatching network-vif-unplugged-681b75fc-f419-495c-8cac-c7fccc8e2170 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.610 2 DEBUG nova.compute.manager [req-40f896c9-b89a-40fe-97b7-b237c842a422 req-4b718a0d-531f-4b8a-be28-ecaad658d18b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Received event network-vif-unplugged-681b75fc-f419-495c-8cac-c7fccc8e2170 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.618 2 DEBUG oslo_concurrency.lockutils [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.618 2 DEBUG oslo_concurrency.lockutils [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.687 2 DEBUG nova.compute.provider_tree [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.737 2 DEBUG nova.scheduler.client.report [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.760 2 DEBUG oslo_concurrency.lockutils [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.785 2 INFO nova.scheduler.client.report [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Deleted allocations for instance 04751313-a42e-4119-953b-08c932c35ae6#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.862 2 DEBUG oslo_concurrency.lockutils [None req-1089f955-3593-4ff4-bc40-30dbaeaac716 4e1cdf41d58b4774b94da988b9e8db73 23de7e9a877e477cb52ac4d4c1410e0d - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.983 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/fb5757a14a7448e4b35a11d6424fd4ae.tmp" returned: 1 in 0.393s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.984 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/fb5757a14a7448e4b35a11d6424fd4ae.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.985 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Creating directory /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  2 08:00:54 np0005466012 nova_compute[192063]: 2025-10-02 12:00:54.985 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:55 np0005466012 nova_compute[192063]: 2025-10-02 12:00:55.184 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7" returned: 0 in 0.199s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:55 np0005466012 nova_compute[192063]: 2025-10-02 12:00:55.189 2 DEBUG nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:00:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:00:55.416 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:00:56 np0005466012 nova_compute[192063]: 2025-10-02 12:00:56.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:56 np0005466012 nova_compute[192063]: 2025-10-02 12:00:56.679 2 DEBUG nova.compute.manager [req-06a5b94e-6492-4c29-9151-8caa7c211af6 req-b37a0976-92b0-46f4-a728-235f67ab4e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Received event network-vif-plugged-681b75fc-f419-495c-8cac-c7fccc8e2170 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:00:56 np0005466012 nova_compute[192063]: 2025-10-02 12:00:56.679 2 DEBUG oslo_concurrency.lockutils [req-06a5b94e-6492-4c29-9151-8caa7c211af6 req-b37a0976-92b0-46f4-a728-235f67ab4e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "04751313-a42e-4119-953b-08c932c35ae6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:56 np0005466012 nova_compute[192063]: 2025-10-02 12:00:56.680 2 DEBUG oslo_concurrency.lockutils [req-06a5b94e-6492-4c29-9151-8caa7c211af6 req-b37a0976-92b0-46f4-a728-235f67ab4e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:56 np0005466012 nova_compute[192063]: 2025-10-02 12:00:56.680 2 DEBUG oslo_concurrency.lockutils [req-06a5b94e-6492-4c29-9151-8caa7c211af6 req-b37a0976-92b0-46f4-a728-235f67ab4e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "04751313-a42e-4119-953b-08c932c35ae6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:56 np0005466012 nova_compute[192063]: 2025-10-02 12:00:56.680 2 DEBUG nova.compute.manager [req-06a5b94e-6492-4c29-9151-8caa7c211af6 req-b37a0976-92b0-46f4-a728-235f67ab4e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] No waiting events found dispatching network-vif-plugged-681b75fc-f419-495c-8cac-c7fccc8e2170 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:00:56 np0005466012 nova_compute[192063]: 2025-10-02 12:00:56.680 2 WARNING nova.compute.manager [req-06a5b94e-6492-4c29-9151-8caa7c211af6 req-b37a0976-92b0-46f4-a728-235f67ab4e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Received unexpected event network-vif-plugged-681b75fc-f419-495c-8cac-c7fccc8e2170 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:00:57 np0005466012 nova_compute[192063]: 2025-10-02 12:00:57.804 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "a20c354d-a1af-4fad-958f-59623ebe4437" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:57 np0005466012 nova_compute[192063]: 2025-10-02 12:00:57.805 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:57 np0005466012 nova_compute[192063]: 2025-10-02 12:00:57.819 2 DEBUG nova.compute.manager [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:00:57 np0005466012 nova_compute[192063]: 2025-10-02 12:00:57.904 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:57 np0005466012 nova_compute[192063]: 2025-10-02 12:00:57.905 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:57 np0005466012 nova_compute[192063]: 2025-10-02 12:00:57.910 2 DEBUG nova.virt.hardware [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:00:57 np0005466012 nova_compute[192063]: 2025-10-02 12:00:57.911 2 INFO nova.compute.claims [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.049 2 DEBUG nova.compute.provider_tree [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.064 2 DEBUG nova.scheduler.client.report [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.083 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.084 2 DEBUG nova.compute.manager [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.139 2 DEBUG nova.compute.manager [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.140 2 DEBUG nova.network.neutron [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.156 2 INFO nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:00:58 np0005466012 podman[220140]: 2025-10-02 12:00:58.161029441 +0000 UTC m=+0.073177071 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.177 2 DEBUG nova.compute.manager [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.289 2 DEBUG nova.compute.manager [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.291 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.292 2 INFO nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Creating image(s)#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.293 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "/var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.293 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "/var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.294 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "/var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.311 2 DEBUG oslo_concurrency.processutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.340 2 DEBUG nova.policy [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.365 2 DEBUG oslo_concurrency.processutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.366 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.367 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.382 2 DEBUG oslo_concurrency.processutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.434 2 DEBUG oslo_concurrency.processutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.435 2 DEBUG oslo_concurrency.processutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.469 2 DEBUG oslo_concurrency.processutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.470 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.470 2 DEBUG oslo_concurrency.processutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.528 2 DEBUG oslo_concurrency.processutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.529 2 DEBUG nova.virt.disk.api [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Checking if we can resize image /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.529 2 DEBUG oslo_concurrency.processutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.588 2 DEBUG oslo_concurrency.processutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.589 2 DEBUG nova.virt.disk.api [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Cannot resize image /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.589 2 DEBUG nova.objects.instance [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lazy-loading 'migration_context' on Instance uuid a20c354d-a1af-4fad-958f-59623ebe4437 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.661 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.661 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Ensure instance console log exists: /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.662 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.662 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:00:58 np0005466012 nova_compute[192063]: 2025-10-02 12:00:58.662 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:00:59 np0005466012 nova_compute[192063]: 2025-10-02 12:00:59.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:00:59 np0005466012 nova_compute[192063]: 2025-10-02 12:00:59.220 2 DEBUG nova.network.neutron [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Successfully created port: 5562a861-2a3e-4411-8aaa-be6dde7a658a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:01:00 np0005466012 nova_compute[192063]: 2025-10-02 12:01:00.571 2 DEBUG nova.network.neutron [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Successfully updated port: 5562a861-2a3e-4411-8aaa-be6dde7a658a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:01:00 np0005466012 nova_compute[192063]: 2025-10-02 12:01:00.601 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "refresh_cache-a20c354d-a1af-4fad-958f-59623ebe4437" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:01:00 np0005466012 nova_compute[192063]: 2025-10-02 12:01:00.601 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquired lock "refresh_cache-a20c354d-a1af-4fad-958f-59623ebe4437" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:01:00 np0005466012 nova_compute[192063]: 2025-10-02 12:01:00.602 2 DEBUG nova.network.neutron [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:01:00 np0005466012 nova_compute[192063]: 2025-10-02 12:01:00.729 2 DEBUG nova.compute.manager [req-3819fec7-94ea-4324-a97b-0653f0fe2337 req-1532b1cd-3c69-4764-879e-f9f39a0fc364 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Received event network-changed-5562a861-2a3e-4411-8aaa-be6dde7a658a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:01:00 np0005466012 nova_compute[192063]: 2025-10-02 12:01:00.730 2 DEBUG nova.compute.manager [req-3819fec7-94ea-4324-a97b-0653f0fe2337 req-1532b1cd-3c69-4764-879e-f9f39a0fc364 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Refreshing instance network info cache due to event network-changed-5562a861-2a3e-4411-8aaa-be6dde7a658a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:01:00 np0005466012 nova_compute[192063]: 2025-10-02 12:01:00.730 2 DEBUG oslo_concurrency.lockutils [req-3819fec7-94ea-4324-a97b-0653f0fe2337 req-1532b1cd-3c69-4764-879e-f9f39a0fc364 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-a20c354d-a1af-4fad-958f-59623ebe4437" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:01:01 np0005466012 nova_compute[192063]: 2025-10-02 12:01:01.130 2 DEBUG nova.network.neutron [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:01:01 np0005466012 podman[220184]: 2025-10-02 12:01:01.143598948 +0000 UTC m=+0.061876351 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Oct  2 08:01:01 np0005466012 nova_compute[192063]: 2025-10-02 12:01:01.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:02.109 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:02.109 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:02.110 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.640 2 DEBUG nova.network.neutron [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Updating instance_info_cache with network_info: [{"id": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "address": "fa:16:3e:09:db:7c", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5562a861-2a", "ovs_interfaceid": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.828 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Releasing lock "refresh_cache-a20c354d-a1af-4fad-958f-59623ebe4437" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.829 2 DEBUG nova.compute.manager [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Instance network_info: |[{"id": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "address": "fa:16:3e:09:db:7c", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5562a861-2a", "ovs_interfaceid": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.830 2 DEBUG oslo_concurrency.lockutils [req-3819fec7-94ea-4324-a97b-0653f0fe2337 req-1532b1cd-3c69-4764-879e-f9f39a0fc364 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-a20c354d-a1af-4fad-958f-59623ebe4437" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.830 2 DEBUG nova.network.neutron [req-3819fec7-94ea-4324-a97b-0653f0fe2337 req-1532b1cd-3c69-4764-879e-f9f39a0fc364 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Refreshing network info cache for port 5562a861-2a3e-4411-8aaa-be6dde7a658a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.833 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Start _get_guest_xml network_info=[{"id": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "address": "fa:16:3e:09:db:7c", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5562a861-2a", "ovs_interfaceid": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.836 2 WARNING nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.840 2 DEBUG nova.virt.libvirt.host [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.840 2 DEBUG nova.virt.libvirt.host [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.846 2 DEBUG nova.virt.libvirt.host [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.847 2 DEBUG nova.virt.libvirt.host [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.848 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.848 2 DEBUG nova.virt.hardware [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.849 2 DEBUG nova.virt.hardware [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.849 2 DEBUG nova.virt.hardware [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.849 2 DEBUG nova.virt.hardware [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.849 2 DEBUG nova.virt.hardware [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.850 2 DEBUG nova.virt.hardware [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.850 2 DEBUG nova.virt.hardware [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.850 2 DEBUG nova.virt.hardware [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.850 2 DEBUG nova.virt.hardware [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.851 2 DEBUG nova.virt.hardware [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.851 2 DEBUG nova.virt.hardware [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.854 2 DEBUG nova.virt.libvirt.vif [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:00:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1982637812',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1982637812',id=7,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5cc73d75e0864e838eefa90cb33b7e01',ramdisk_id='',reservation_id='r-bvhrjcj5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-984573444',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-984573444-pro
ject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:00:58Z,user_data=None,user_id='59e8135d73ee43e088ba5ee7d9bd84b1',uuid=a20c354d-a1af-4fad-958f-59623ebe4437,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "address": "fa:16:3e:09:db:7c", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5562a861-2a", "ovs_interfaceid": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.854 2 DEBUG nova.network.os_vif_util [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Converting VIF {"id": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "address": "fa:16:3e:09:db:7c", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5562a861-2a", "ovs_interfaceid": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.854 2 DEBUG nova.network.os_vif_util [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:db:7c,bridge_name='br-int',has_traffic_filtering=True,id=5562a861-2a3e-4411-8aaa-be6dde7a658a,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5562a861-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.855 2 DEBUG nova.objects.instance [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lazy-loading 'pci_devices' on Instance uuid a20c354d-a1af-4fad-958f-59623ebe4437 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.928 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  <uuid>a20c354d-a1af-4fad-958f-59623ebe4437</uuid>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  <name>instance-00000007</name>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1982637812</nova:name>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:01:02</nova:creationTime>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:01:02 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:        <nova:user uuid="59e8135d73ee43e088ba5ee7d9bd84b1">tempest-LiveAutoBlockMigrationV225Test-984573444-project-member</nova:user>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:        <nova:project uuid="5cc73d75e0864e838eefa90cb33b7e01">tempest-LiveAutoBlockMigrationV225Test-984573444</nova:project>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:        <nova:port uuid="5562a861-2a3e-4411-8aaa-be6dde7a658a">
Oct  2 08:01:02 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <entry name="serial">a20c354d-a1af-4fad-958f-59623ebe4437</entry>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <entry name="uuid">a20c354d-a1af-4fad-958f-59623ebe4437</entry>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk.config"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:09:db:7c"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <target dev="tap5562a861-2a"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/console.log" append="off"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:01:02 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:01:02 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:01:02 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:01:02 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.931 2 DEBUG nova.compute.manager [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Preparing to wait for external event network-vif-plugged-5562a861-2a3e-4411-8aaa-be6dde7a658a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.931 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.932 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.932 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.932 2 DEBUG nova.virt.libvirt.vif [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:00:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1982637812',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1982637812',id=7,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5cc73d75e0864e838eefa90cb33b7e01',ramdisk_id='',reservation_id='r-bvhrjcj5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-984573444',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-984
573444-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:00:58Z,user_data=None,user_id='59e8135d73ee43e088ba5ee7d9bd84b1',uuid=a20c354d-a1af-4fad-958f-59623ebe4437,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "address": "fa:16:3e:09:db:7c", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5562a861-2a", "ovs_interfaceid": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.932 2 DEBUG nova.network.os_vif_util [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Converting VIF {"id": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "address": "fa:16:3e:09:db:7c", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5562a861-2a", "ovs_interfaceid": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.933 2 DEBUG nova.network.os_vif_util [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:db:7c,bridge_name='br-int',has_traffic_filtering=True,id=5562a861-2a3e-4411-8aaa-be6dde7a658a,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5562a861-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.933 2 DEBUG os_vif [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:db:7c,bridge_name='br-int',has_traffic_filtering=True,id=5562a861-2a3e-4411-8aaa-be6dde7a658a,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5562a861-2a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.934 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.934 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.937 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5562a861-2a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.937 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5562a861-2a, col_values=(('external_ids', {'iface-id': '5562a861-2a3e-4411-8aaa-be6dde7a658a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:09:db:7c', 'vm-uuid': 'a20c354d-a1af-4fad-958f-59623ebe4437'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:02 np0005466012 NetworkManager[51207]: <info>  [1759406462.9394] manager: (tap5562a861-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:02 np0005466012 nova_compute[192063]: 2025-10-02 12:01:02.944 2 INFO os_vif [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:db:7c,bridge_name='br-int',has_traffic_filtering=True,id=5562a861-2a3e-4411-8aaa-be6dde7a658a,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5562a861-2a')#033[00m
Oct  2 08:01:03 np0005466012 nova_compute[192063]: 2025-10-02 12:01:03.120 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:01:03 np0005466012 nova_compute[192063]: 2025-10-02 12:01:03.121 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:01:03 np0005466012 nova_compute[192063]: 2025-10-02 12:01:03.121 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] No VIF found with MAC fa:16:3e:09:db:7c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:01:03 np0005466012 nova_compute[192063]: 2025-10-02 12:01:03.122 2 INFO nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Using config drive#033[00m
Oct  2 08:01:03 np0005466012 nova_compute[192063]: 2025-10-02 12:01:03.568 2 INFO nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Creating config drive at /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk.config#033[00m
Oct  2 08:01:03 np0005466012 nova_compute[192063]: 2025-10-02 12:01:03.575 2 DEBUG oslo_concurrency.processutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpk4guzrbw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:03 np0005466012 nova_compute[192063]: 2025-10-02 12:01:03.702 2 DEBUG oslo_concurrency.processutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpk4guzrbw" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:03 np0005466012 kernel: tap5562a861-2a: entered promiscuous mode
Oct  2 08:01:03 np0005466012 NetworkManager[51207]: <info>  [1759406463.7570] manager: (tap5562a861-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Oct  2 08:01:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:01:03Z|00035|binding|INFO|Claiming lport 5562a861-2a3e-4411-8aaa-be6dde7a658a for this chassis.
Oct  2 08:01:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:01:03Z|00036|binding|INFO|5562a861-2a3e-4411-8aaa-be6dde7a658a: Claiming fa:16:3e:09:db:7c 10.100.0.13
Oct  2 08:01:03 np0005466012 nova_compute[192063]: 2025-10-02 12:01:03.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.781 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:db:7c 10.100.0.13'], port_security=['fa:16:3e:09:db:7c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-020b4768-a07a-4769-8636-455566c87083', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f3fadef5-4bfc-406c-93c4-14d4abd0583e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c0be75-bb4b-4e01-8cfa-b9aa4fcaf0e9, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=5562a861-2a3e-4411-8aaa-be6dde7a658a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.783 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 5562a861-2a3e-4411-8aaa-be6dde7a658a in datapath 020b4768-a07a-4769-8636-455566c87083 bound to our chassis#033[00m
Oct  2 08:01:03 np0005466012 systemd-machined[152114]: New machine qemu-3-instance-00000007.
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.784 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 020b4768-a07a-4769-8636-455566c87083#033[00m
Oct  2 08:01:03 np0005466012 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.800 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f1a51f1c-315c-4574-ad01-ffd97df86f9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.800 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap020b4768-a1 in ovnmeta-020b4768-a07a-4769-8636-455566c87083 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.802 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap020b4768-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.802 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[66331ade-a1a9-40cc-9be2-2e2977c871a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.803 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[324c093f-0ca0-48f3-a236-46192de0c4fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:03 np0005466012 systemd-udevd[220237]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.820 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[cf6b015f-c9d7-4b38-8d8d-a39ad73b6cd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:03 np0005466012 NetworkManager[51207]: <info>  [1759406463.8277] device (tap5562a861-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:01:03 np0005466012 NetworkManager[51207]: <info>  [1759406463.8291] device (tap5562a861-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:01:03 np0005466012 nova_compute[192063]: 2025-10-02 12:01:03.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:01:03Z|00037|binding|INFO|Setting lport 5562a861-2a3e-4411-8aaa-be6dde7a658a ovn-installed in OVS
Oct  2 08:01:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:01:03Z|00038|binding|INFO|Setting lport 5562a861-2a3e-4411-8aaa-be6dde7a658a up in Southbound
Oct  2 08:01:03 np0005466012 nova_compute[192063]: 2025-10-02 12:01:03.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.853 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a8f9842e-7524-45d0-a4b3-aa72e91f1c19]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.883 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[9e8817df-a8d1-44d0-83c5-d8ef1a19d5aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.886 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f13067cb-405c-488f-9c5a-0046c8fe341d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:03 np0005466012 NetworkManager[51207]: <info>  [1759406463.8885] manager: (tap020b4768-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.918 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ed75f030-9141-496b-ae0c-521c05fdcc41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.921 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[e4e020d7-841b-4170-baf4-3a687b004134]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:03 np0005466012 NetworkManager[51207]: <info>  [1759406463.9454] device (tap020b4768-a0): carrier: link connected
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.951 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[a33d3649-18bd-40c8-8569-1751fc75806f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.968 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[96a3cdc6-c239-47d7-8bc0-b63f42bd9a59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap020b4768-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:d2:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445755, 'reachable_time': 44024, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220269, 'error': None, 'target': 'ovnmeta-020b4768-a07a-4769-8636-455566c87083', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:03 np0005466012 nova_compute[192063]: 2025-10-02 12:01:03.982 2 DEBUG nova.network.neutron [req-3819fec7-94ea-4324-a97b-0653f0fe2337 req-1532b1cd-3c69-4764-879e-f9f39a0fc364 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Updated VIF entry in instance network info cache for port 5562a861-2a3e-4411-8aaa-be6dde7a658a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:01:03 np0005466012 nova_compute[192063]: 2025-10-02 12:01:03.983 2 DEBUG nova.network.neutron [req-3819fec7-94ea-4324-a97b-0653f0fe2337 req-1532b1cd-3c69-4764-879e-f9f39a0fc364 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Updating instance_info_cache with network_info: [{"id": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "address": "fa:16:3e:09:db:7c", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5562a861-2a", "ovs_interfaceid": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:01:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:03.988 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[712bd166-67bf-48e6-bdc5-1f6a6b4609d5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe62:d2ce'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445755, 'tstamp': 445755}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220270, 'error': None, 'target': 'ovnmeta-020b4768-a07a-4769-8636-455566c87083', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:04.003 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[97d24d96-9f71-41ad-85b6-17fb85c794ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap020b4768-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:d2:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445755, 'reachable_time': 44024, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220271, 'error': None, 'target': 'ovnmeta-020b4768-a07a-4769-8636-455566c87083', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.012 2 DEBUG oslo_concurrency.lockutils [req-3819fec7-94ea-4324-a97b-0653f0fe2337 req-1532b1cd-3c69-4764-879e-f9f39a0fc364 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-a20c354d-a1af-4fad-958f-59623ebe4437" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:04.042 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b849067b-f12b-4ddc-9a3c-c366e06fe96d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:04.116 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[75521542-8b81-4362-aba5-1624ef74b672]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:04.118 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap020b4768-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:04.119 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:04.119 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap020b4768-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:04 np0005466012 kernel: tap020b4768-a0: entered promiscuous mode
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:04 np0005466012 NetworkManager[51207]: <info>  [1759406464.1243] manager: (tap020b4768-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:04.124 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap020b4768-a0, col_values=(('external_ids', {'iface-id': '7ad14bc1-f6e9-4852-aef9-ac72c7291cba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:01:04Z|00039|binding|INFO|Releasing lport 7ad14bc1-f6e9-4852-aef9-ac72c7291cba from this chassis (sb_readonly=0)
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:04.150 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/020b4768-a07a-4769-8636-455566c87083.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/020b4768-a07a-4769-8636-455566c87083.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:04.151 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[465ba107-a7c0-4824-ab34-f5f2df2736c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:04.152 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-020b4768-a07a-4769-8636-455566c87083
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/020b4768-a07a-4769-8636-455566c87083.pid.haproxy
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 020b4768-a07a-4769-8636-455566c87083
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:01:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:01:04.152 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-020b4768-a07a-4769-8636-455566c87083', 'env', 'PROCESS_TAG=haproxy-020b4768-a07a-4769-8636-455566c87083', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/020b4768-a07a-4769-8636-455566c87083.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.269 2 DEBUG nova.compute.manager [req-fcfb68ce-5172-4ac2-8366-8b43bb6d7784 req-6a1c13ae-4513-41b3-891a-31d0137b4332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Received event network-vif-plugged-5562a861-2a3e-4411-8aaa-be6dde7a658a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.270 2 DEBUG oslo_concurrency.lockutils [req-fcfb68ce-5172-4ac2-8366-8b43bb6d7784 req-6a1c13ae-4513-41b3-891a-31d0137b4332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.270 2 DEBUG oslo_concurrency.lockutils [req-fcfb68ce-5172-4ac2-8366-8b43bb6d7784 req-6a1c13ae-4513-41b3-891a-31d0137b4332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.271 2 DEBUG oslo_concurrency.lockutils [req-fcfb68ce-5172-4ac2-8366-8b43bb6d7784 req-6a1c13ae-4513-41b3-891a-31d0137b4332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.271 2 DEBUG nova.compute.manager [req-fcfb68ce-5172-4ac2-8366-8b43bb6d7784 req-6a1c13ae-4513-41b3-891a-31d0137b4332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Processing event network-vif-plugged-5562a861-2a3e-4411-8aaa-be6dde7a658a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:01:04 np0005466012 podman[220310]: 2025-10-02 12:01:04.507633168 +0000 UTC m=+0.023538826 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:01:04 np0005466012 podman[220310]: 2025-10-02 12:01:04.60844661 +0000 UTC m=+0.124352258 container create 73d766f9cbbb7bc09abdbc1a3af1dee4fa3a8a58acc13632f4e90da8036f0811 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:01:04 np0005466012 systemd[1]: Started libpod-conmon-73d766f9cbbb7bc09abdbc1a3af1dee4fa3a8a58acc13632f4e90da8036f0811.scope.
Oct  2 08:01:04 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:01:04 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a3384b28e5ac852322adea306bd0fa841d90d54242cbadcf3e1b6ef02f06f97/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:01:04 np0005466012 podman[220310]: 2025-10-02 12:01:04.694336707 +0000 UTC m=+0.210242375 container init 73d766f9cbbb7bc09abdbc1a3af1dee4fa3a8a58acc13632f4e90da8036f0811 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:01:04 np0005466012 podman[220310]: 2025-10-02 12:01:04.700244163 +0000 UTC m=+0.216149811 container start 73d766f9cbbb7bc09abdbc1a3af1dee4fa3a8a58acc13632f4e90da8036f0811 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:01:04 np0005466012 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[220325]: [NOTICE]   (220344) : New worker (220346) forked
Oct  2 08:01:04 np0005466012 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[220325]: [NOTICE]   (220344) : Loading success.
Oct  2 08:01:04 np0005466012 podman[220324]: 2025-10-02 12:01:04.759811393 +0000 UTC m=+0.107161133 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.762 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406464.7624617, a20c354d-a1af-4fad-958f-59623ebe4437 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.763 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] VM Started (Lifecycle Event)#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.765 2 DEBUG nova.compute.manager [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.768 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.770 2 INFO nova.virt.libvirt.driver [-] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Instance spawned successfully.#033[00m
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.770 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:01:04 np0005466012 podman[220321]: 2025-10-02 12:01:04.795484118 +0000 UTC m=+0.144423689 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:01:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:01:04Z|00040|binding|INFO|Releasing lport 7ad14bc1-f6e9-4852-aef9-ac72c7291cba from this chassis (sb_readonly=0)
Oct  2 08:01:04 np0005466012 nova_compute[192063]: 2025-10-02 12:01:04.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.171 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.176 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.230 2 DEBUG nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.355 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.355 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.356 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.356 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.357 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.357 2 DEBUG nova.virt.libvirt.driver [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.442 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.443 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406464.7625687, a20c354d-a1af-4fad-958f-59623ebe4437 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.443 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.634 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.638 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406464.767, a20c354d-a1af-4fad-958f-59623ebe4437 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.638 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.671 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.675 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.785 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:01:05 np0005466012 nova_compute[192063]: 2025-10-02 12:01:05.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:06 np0005466012 nova_compute[192063]: 2025-10-02 12:01:06.071 2 INFO nova.compute.manager [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Took 7.78 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:01:06 np0005466012 nova_compute[192063]: 2025-10-02 12:01:06.072 2 DEBUG nova.compute.manager [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:06 np0005466012 nova_compute[192063]: 2025-10-02 12:01:06.519 2 DEBUG nova.compute.manager [req-de19729b-59bf-4821-97cb-481713630e61 req-3d40072e-70c6-4ffc-8186-6261240c6c03 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Received event network-vif-plugged-5562a861-2a3e-4411-8aaa-be6dde7a658a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:01:06 np0005466012 nova_compute[192063]: 2025-10-02 12:01:06.520 2 DEBUG oslo_concurrency.lockutils [req-de19729b-59bf-4821-97cb-481713630e61 req-3d40072e-70c6-4ffc-8186-6261240c6c03 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:06 np0005466012 nova_compute[192063]: 2025-10-02 12:01:06.520 2 DEBUG oslo_concurrency.lockutils [req-de19729b-59bf-4821-97cb-481713630e61 req-3d40072e-70c6-4ffc-8186-6261240c6c03 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:06 np0005466012 nova_compute[192063]: 2025-10-02 12:01:06.521 2 DEBUG oslo_concurrency.lockutils [req-de19729b-59bf-4821-97cb-481713630e61 req-3d40072e-70c6-4ffc-8186-6261240c6c03 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:06 np0005466012 nova_compute[192063]: 2025-10-02 12:01:06.521 2 DEBUG nova.compute.manager [req-de19729b-59bf-4821-97cb-481713630e61 req-3d40072e-70c6-4ffc-8186-6261240c6c03 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] No waiting events found dispatching network-vif-plugged-5562a861-2a3e-4411-8aaa-be6dde7a658a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:01:06 np0005466012 nova_compute[192063]: 2025-10-02 12:01:06.522 2 WARNING nova.compute.manager [req-de19729b-59bf-4821-97cb-481713630e61 req-3d40072e-70c6-4ffc-8186-6261240c6c03 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Received unexpected event network-vif-plugged-5562a861-2a3e-4411-8aaa-be6dde7a658a for instance with vm_state active and task_state None.#033[00m
Oct  2 08:01:06 np0005466012 nova_compute[192063]: 2025-10-02 12:01:06.614 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406451.613456, 04751313-a42e-4119-953b-08c932c35ae6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:01:06 np0005466012 nova_compute[192063]: 2025-10-02 12:01:06.614 2 INFO nova.compute.manager [-] [instance: 04751313-a42e-4119-953b-08c932c35ae6] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:01:06 np0005466012 nova_compute[192063]: 2025-10-02 12:01:06.757 2 DEBUG nova.compute.manager [None req-ad373b16-fb73-491f-9c9c-b9d6fd61c308 - - - - - -] [instance: 04751313-a42e-4119-953b-08c932c35ae6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:01:06 np0005466012 nova_compute[192063]: 2025-10-02 12:01:06.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:06 np0005466012 nova_compute[192063]: 2025-10-02 12:01:06.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:01:07 np0005466012 nova_compute[192063]: 2025-10-02 12:01:07.190 2 INFO nova.compute.manager [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Took 9.32 seconds to build instance.#033[00m
Oct  2 08:01:07 np0005466012 nova_compute[192063]: 2025-10-02 12:01:07.452 2 DEBUG oslo_concurrency.lockutils [None req-efc6ecc7-fa46-4323-bfcf-daa21ea52ce6 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:07 np0005466012 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Deactivated successfully.
Oct  2 08:01:07 np0005466012 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Consumed 14.840s CPU time.
Oct  2 08:01:07 np0005466012 systemd-machined[152114]: Machine qemu-2-instance-00000006 terminated.
Oct  2 08:01:07 np0005466012 nova_compute[192063]: 2025-10-02 12:01:07.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:07 np0005466012 nova_compute[192063]: 2025-10-02 12:01:07.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:07 np0005466012 nova_compute[192063]: 2025-10-02 12:01:07.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:07 np0005466012 nova_compute[192063]: 2025-10-02 12:01:07.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:08 np0005466012 nova_compute[192063]: 2025-10-02 12:01:08.245 2 INFO nova.virt.libvirt.driver [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:01:08 np0005466012 nova_compute[192063]: 2025-10-02 12:01:08.254 2 INFO nova.virt.libvirt.driver [-] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Instance destroyed successfully.#033[00m
Oct  2 08:01:08 np0005466012 nova_compute[192063]: 2025-10-02 12:01:08.258 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:08 np0005466012 nova_compute[192063]: 2025-10-02 12:01:08.327 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:08 np0005466012 nova_compute[192063]: 2025-10-02 12:01:08.329 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:08 np0005466012 nova_compute[192063]: 2025-10-02 12:01:08.386 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:08 np0005466012 nova_compute[192063]: 2025-10-02 12:01:08.390 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Copying file /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7_resize/disk to 192.168.122.102:/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:01:08 np0005466012 nova_compute[192063]: 2025-10-02 12:01:08.391 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7_resize/disk 192.168.122.102:/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:08 np0005466012 nova_compute[192063]: 2025-10-02 12:01:08.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.023 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.024 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.025 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.025 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.267 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "scp -r /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7_resize/disk 192.168.122.102:/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk" returned: 0 in 0.876s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.268 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Copying file /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.269 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7_resize/disk.config 192.168.122.102:/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.498 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "scp -C -r /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7_resize/disk.config 192.168.122.102:/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.config" returned: 0 in 0.228s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.499 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Copying file /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.500 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7_resize/disk.info 192.168.122.102:/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.702 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.727 2 DEBUG oslo_concurrency.processutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "scp -C -r /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7_resize/disk.info 192.168.122.102:/var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk.info" returned: 0 in 0.228s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.794 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.795 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.863 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:01:09 np0005466012 nova_compute[192063]: 2025-10-02 12:01:09.870 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000006, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7/disk#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.052 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.054 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5614MB free_disk=73.43896865844727GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.054 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.054 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.504 2 INFO nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7] Updating resource usage from migration 230f0c54-f1fd-4e2c-8967-c050cc98b321#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.531 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Migration 230f0c54-f1fd-4e2c-8967-c050cc98b321 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.532 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance a20c354d-a1af-4fad-958f-59623ebe4437 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.532 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.533 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.603 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.724 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.725 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.756 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.849 2 INFO nova.compute.rpcapi [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.851 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.993 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:01:10 np0005466012 nova_compute[192063]: 2025-10-02 12:01:10.993 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.939s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:11 np0005466012 nova_compute[192063]: 2025-10-02 12:01:11.653 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Acquiring lock "0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:01:11 np0005466012 nova_compute[192063]: 2025-10-02 12:01:11.654 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lock "0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:01:11 np0005466012 nova_compute[192063]: 2025-10-02 12:01:11.655 2 DEBUG oslo_concurrency.lockutils [None req-c05bf7a5-151f-4ed4-8311-1ec80ed8e739 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lock "0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:01:11 np0005466012 nova_compute[192063]: 2025-10-02 12:01:11.995 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:01:11 np0005466012 nova_compute[192063]: 2025-10-02 12:01:11.996 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:01:11 np0005466012 nova_compute[192063]: 2025-10-02 12:01:11.997 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:01:12 np0005466012 nova_compute[192063]: 2025-10-02 12:01:12.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:13 np0005466012 nova_compute[192063]: 2025-10-02 12:01:13.173 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-a20c354d-a1af-4fad-958f-59623ebe4437" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:01:13 np0005466012 nova_compute[192063]: 2025-10-02 12:01:13.174 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-a20c354d-a1af-4fad-958f-59623ebe4437" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:01:13 np0005466012 nova_compute[192063]: 2025-10-02 12:01:13.174 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:01:13 np0005466012 nova_compute[192063]: 2025-10-02 12:01:13.175 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid a20c354d-a1af-4fad-958f-59623ebe4437 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:01:14 np0005466012 nova_compute[192063]: 2025-10-02 12:01:14.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:16 np0005466012 podman[220409]: 2025-10-02 12:01:16.210063615 +0000 UTC m=+0.131592080 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 08:01:17 np0005466012 nova_compute[192063]: 2025-10-02 12:01:17.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:18 np0005466012 podman[220453]: 2025-10-02 12:01:18.186224981 +0000 UTC m=+0.081580165 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:01:18 np0005466012 ovn_controller[94284]: 2025-10-02T12:01:18Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:09:db:7c 10.100.0.13
Oct  2 08:01:18 np0005466012 ovn_controller[94284]: 2025-10-02T12:01:18Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:09:db:7c 10.100.0.13
Oct  2 08:01:19 np0005466012 nova_compute[192063]: 2025-10-02 12:01:19.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.432 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}990e323c24a4c2b6ffa88e6c79986dae5631c4ba5e226d4603b05142d54e6d41" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.761 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1182 Content-Type: application/json Date: Thu, 02 Oct 2025 12:01:19 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-c046ad00-8680-40dc-ae2a-65dcfee3b406 x-openstack-request-id: req-c046ad00-8680-40dc-ae2a-65dcfee3b406 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.761 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1285819904", "name": "tempest-flavor_with_ephemeral_0-768942469", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1285819904"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1285819904"}]}, {"id": "990647346", "name": "tempest-flavor_with_ephemeral_1-66771630", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/990647346"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/990647346"}]}, {"id": "9949d9da-6314-4ede-8797-6f2f0a6a64fc", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/9949d9da-6314-4ede-8797-6f2f0a6a64fc"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9949d9da-6314-4ede-8797-6f2f0a6a64fc"}]}, {"id": "9ac83da7-f31e-4467-8569-d28002f6aeed", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/9ac83da7-f31e-4467-8569-d28002f6aeed"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9ac83da7-f31e-4467-8569-d28002f6aeed"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.762 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-c046ad00-8680-40dc-ae2a-65dcfee3b406 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.764 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/9ac83da7-f31e-4467-8569-d28002f6aeed -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}990e323c24a4c2b6ffa88e6c79986dae5631c4ba5e226d4603b05142d54e6d41" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.831 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Thu, 02 Oct 2025 12:01:19 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-2a690725-bbc0-4923-9dc7-bfdce3409b40 x-openstack-request-id: req-2a690725-bbc0-4923-9dc7-bfdce3409b40 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.831 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "9ac83da7-f31e-4467-8569-d28002f6aeed", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/9ac83da7-f31e-4467-8569-d28002f6aeed"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9ac83da7-f31e-4467-8569-d28002f6aeed"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.831 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/9ac83da7-f31e-4467-8569-d28002f6aeed used request id req-2a690725-bbc0-4923-9dc7-bfdce3409b40 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.832 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000007', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5cc73d75e0864e838eefa90cb33b7e01', 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'hostId': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.834 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7', 'name': 'tempest-MigrationsAdminTest-server-976277975', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '4dcc6c51db2640cbb04083b3336de813', 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'hostId': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.835 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.846 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.847 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.848 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b10b787a-ce71-4a74-9cb8-dd079669b749', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-vda', 'timestamp': '2025-10-02T12:01:19.835139', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '81f91b00-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.511587975, 'message_signature': '5b1dc9160320789e36c9ab67d566837d2c3f7d5e041d6abe06b3be2e8df17267'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 
'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-sda', 'timestamp': '2025-10-02T12:01:19.835139', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '81f92c6c-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.511587975, 'message_signature': '780b90ade270638cb4ecb35f42dc3454bce1510de3adb073b7c53dd5171a6532'}]}, 'timestamp': '2025-10-02 12:01:19.848753', '_unique_id': '52918788e3d44feda42c2de328921716'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.854 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.857 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.875 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.read.requests volume: 1094 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.876 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.876 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '149c4d2c-f762-4cd6-8afe-97e55267d3cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1094, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-vda', 'timestamp': '2025-10-02T12:01:19.857139', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '81fd7b00-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.533642, 'message_signature': 'f0188643ab5e0811db79ab8614db3f28409e30b4faeb8166d1a405c5003b0add'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 
'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-sda', 'timestamp': '2025-10-02T12:01:19.857139', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '81fd8564-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.533642, 'message_signature': 'd47e3c8e8e4a075e341890c89864a985019afb9da7fe6b68e19fe1f3e8e46a95'}]}, 'timestamp': '2025-10-02 12:01:19.876979', '_unique_id': 'ba0f2b4e08a74003a3167ed2ab3a4bd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.877 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.878 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.878 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.878 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.879 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb7decf5-d172-4ad2-8462-a855c5008624', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-vda', 'timestamp': '2025-10-02T12:01:19.878541', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '81fde4dc-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.511587975, 'message_signature': 'bb95d0b45155a00bc0d0c380a39887d9410bb47b43714914e67ef2ef7f61b6c6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 
'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-sda', 'timestamp': '2025-10-02T12:01:19.878541', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '81fdefea-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.511587975, 'message_signature': '3122f38581d7ca3ea8926f1050734318ebc19b8e2f61a963868f8886af15d88b'}]}, 'timestamp': '2025-10-02 12:01:19.879860', '_unique_id': '7afee7a1f2b946d49b94c247c0855d67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.880 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.write.requests volume: 264 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.881 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.881 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57aeb49f-43dd-46a1-91bf-3a3af1da293f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 264, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-vda', 'timestamp': '2025-10-02T12:01:19.880965', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '81fe429c-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.533642, 'message_signature': 'fe494f84506579ad3b1c591a73dfae9956775cbcf4b66a32e58dab37febd1cb6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 
'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-sda', 'timestamp': '2025-10-02T12:01:19.880965', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '81fe4a58-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.533642, 'message_signature': 'd4f580a61ddc6b7c63b290b9c6fea07872c64851a5a08ee096aea12fac4a58b2'}]}, 'timestamp': '2025-10-02 12:01:19.881971', '_unique_id': 'd51f1de9decf4623b05f8c9c40159c0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.882 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.883 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.883 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.write.bytes volume: 72704000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.883 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.883 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71bf50df-2e66-4140-b110-c7bb8e4d6fde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72704000, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-vda', 'timestamp': '2025-10-02T12:01:19.883075', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '81fe94fe-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.533642, 'message_signature': '80d8bc4f5215ea5bf28f71e04e73b607b2eec709450f575e3b111b4bd046399f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 
'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-sda', 'timestamp': '2025-10-02T12:01:19.883075', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '81fe9cba-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.533642, 'message_signature': 'e6c9a1f7280c739a2cff0f96a84f349b2502fc4de9d5fdd194589a1405235def'}]}, 'timestamp': '2025-10-02 12:01:19.884106', '_unique_id': 'ce7d676fb0be4979b93db794648d1c37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.884 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.885 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.887 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a20c354d-a1af-4fad-958f-59623ebe4437 / tap5562a861-2a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.888 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.888 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db5a6b1c-546a-40b4-a676-eba3fb3458df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'instance-00000007-a20c354d-a1af-4fad-958f-59623ebe4437-tap5562a861-2a', 'timestamp': '2025-10-02T12:01:19.885367', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'tap5562a861-2a', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:09:db:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5562a861-2a'}, 'message_id': '81ff59ac-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.561806407, 'message_signature': '6d910f7275384b63e9571a2165f7e29d3f8d2db121b604c17aad11750f095f71'}]}, 'timestamp': '2025-10-02 12:01:19.888924', '_unique_id': 'c657688e1e324602b5f87df6028e951e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.890 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.890 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1982637812>, <NovaLikeServer: tempest-MigrationsAdminTest-server-976277975>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1982637812>, <NovaLikeServer: tempest-MigrationsAdminTest-server-976277975>]
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.890 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.890 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.890 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1982637812>, <NovaLikeServer: tempest-MigrationsAdminTest-server-976277975>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1982637812>, <NovaLikeServer: tempest-MigrationsAdminTest-server-976277975>]
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.890 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.903 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/memory.usage volume: 40.41015625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.904 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1ec4c5c-32c4-4240-b3ee-fe2d1ddcaeeb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.41015625, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'timestamp': '2025-10-02T12:01:19.890834', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '8201be04-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.579789324, 'message_signature': 'f9cc528fa0de884aa750fbbbf739185aafc0845c5dbe07df16b027f43693e9a0'}]}, 'timestamp': '2025-10-02 12:01:19.904704', '_unique_id': 'c34d6688e4b741f7b5974561547365e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.905 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.906 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.906 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.906 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e4476cb-32c5-4317-b5cf-e2df0b9d97a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'instance-00000007-a20c354d-a1af-4fad-958f-59623ebe4437-tap5562a861-2a', 'timestamp': '2025-10-02T12:01:19.906124', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'tap5562a861-2a', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:09:db:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5562a861-2a'}, 'message_id': '82021ae8-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.561806407, 'message_signature': '44cdc948f577ea39b77dacb868b6dc5026bbdf43b38393d08728579318bb60b7'}]}, 'timestamp': '2025-10-02 12:01:19.906978', '_unique_id': 'f56cacfe470945e8804c6bfc7f93e72a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.907 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.908 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.908 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/cpu volume: 11890000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.908 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5015cc7-afee-484e-85ec-9c79c0f43bc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11890000000, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'timestamp': '2025-10-02T12:01:19.908106', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '82026804-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.579789324, 'message_signature': 'a2c316be5b5b16bf4f180a903550975fec812ed4d83656546b1236c02bb1a391'}]}, 'timestamp': '2025-10-02 12:01:19.908937', '_unique_id': 'e410ac5cc4f745c9abaa7dbf0cdceb35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.910 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/network.incoming.bytes volume: 1562 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.910 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ef6a536-4a1c-4fd0-aed6-4e2fd851f1f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1562, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'instance-00000007-a20c354d-a1af-4fad-958f-59623ebe4437-tap5562a861-2a', 'timestamp': '2025-10-02T12:01:19.910039', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'tap5562a861-2a', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:09:db:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5562a861-2a'}, 'message_id': '8202b3a4-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.561806407, 'message_signature': '288a744fd3222c833c371e979f29b78ccfd8030f28f60ccfab99a2920694c835'}]}, 'timestamp': '2025-10-02 12:01:19.910880', '_unique_id': '96079361ab3347ae87308b4db9f592d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.912 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.912 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf24ca46-33c8-4dbc-a08a-1d6dba11e142', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'instance-00000007-a20c354d-a1af-4fad-958f-59623ebe4437-tap5562a861-2a', 'timestamp': '2025-10-02T12:01:19.911977', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'tap5562a861-2a', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:09:db:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5562a861-2a'}, 'message_id': '8202ff44-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.561806407, 'message_signature': '9f39856dce7bfd1a4a4dd92441f129835788f375f5402a37f0d1ab49d77d78d1'}]}, 'timestamp': '2025-10-02 12:01:19.912917', '_unique_id': 'a96d4d1d66eb4f44b79141c599831e40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.913 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.914 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.914 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50a9b986-ffbf-4a14-9226-fb975e02edaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1284, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'instance-00000007-a20c354d-a1af-4fad-958f-59623ebe4437-tap5562a861-2a', 'timestamp': '2025-10-02T12:01:19.914034', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'tap5562a861-2a', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:09:db:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5562a861-2a'}, 'message_id': '82034f94-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.561806407, 'message_signature': '46027bf142edb95d2c7f7e45bd49f00cdcf4782aa92412a91d8ee9d2f0f6390d'}]}, 'timestamp': '2025-10-02 12:01:19.915015', '_unique_id': 'a5158eb8851a4857b7b9fc56b0ae6789'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.915 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.916 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.916 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.916 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.916 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae08ff8e-3af6-42ed-a8e4-9351c87569de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-vda', 'timestamp': '2025-10-02T12:01:19.916077', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '82039e7c-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.511587975, 'message_signature': '5c95773ae23c3808c79e2b82c453da7587682e1c964b6d4f92e4722d2dd49d54'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-sda', 'timestamp': '2025-10-02T12:01:19.916077', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8203a624-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.511587975, 'message_signature': '5c4ebc6fa8a2c49096849374085b170385d7a03b9521b66463d232312a85df77'}]}, 'timestamp': '2025-10-02 12:01:19.916984', '_unique_id': '18b1c78670714c50a1fae37f4096ebca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.918 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.read.latency volume: 1409608100 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.918 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.read.latency volume: 192568437 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6cced8ac-01e2-4b0c-8d68-b6fe0d7c4d6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1409608100, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-vda', 'timestamp': '2025-10-02T12:01:19.918050', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8203eb48-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.533642, 'message_signature': '4c2e25c991db0f42541118a81b97035fb0715f145fc7e0054e8eb120dee0788a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 192568437, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-sda', 'timestamp': '2025-10-02T12:01:19.918050', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8203f426-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.533642, 'message_signature': 'd498d1af750c466fe959aa961748b13d8468e56521237dbd8ab4437fa3eb40a3'}]}, 'timestamp': '2025-10-02 12:01:19.919151', '_unique_id': 'a54b4fad7545481785b52a6fc050e1d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.919 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.920 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.write.latency volume: 11626222686 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.920 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f845d5ba-3adc-481c-9718-aecac43dab9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11626222686, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-vda', 'timestamp': '2025-10-02T12:01:19.920283', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '820442be-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.533642, 'message_signature': '93ce887018e459d1f11a3504573f3517c954b0a949f52df2e3a0d5d058999434'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'a20c354d-a1af-4fad-958f-59623ebe4437-sda', 'timestamp': '2025-10-02T12:01:19.920283', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'instance-00000007', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '82044af2-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.533642, 'message_signature': '5575896cb27ed373d149cada2e5711d4eedc5c4f55efb6b6ee8cacb134de7f68'}]}, 'timestamp': '2025-10-02 12:01:19.921232', '_unique_id': '60853a970af64b488ee69bb9c879d183'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.921 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.922 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b958e5b2-06a5-4acb-8220-c9f24b36d8ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'instance-00000007-a20c354d-a1af-4fad-958f-59623ebe4437-tap5562a861-2a', 'timestamp': '2025-10-02T12:01:19.922314', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'tap5562a861-2a', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:09:db:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5562a861-2a'}, 'message_id': '820491ec-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.561806407, 'message_signature': '9687ce5aeeaef9673ab756f63648d484968a7bbaddeb3b270562262970db0de6'}]}, 'timestamp': '2025-10-02 12:01:19.923107', '_unique_id': 'bc4a36ab8adc4ef0ba1a1eb65a7350bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.923 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.924 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.924 12 DEBUG ceilometer.compute.pollsters [-] a20c354d-a1af-4fad-958f-59623ebe4437/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.925 12 DEBUG ceilometer.compute.pollsters [-] Instance 0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000006, id=0eb8bd18-0bdd-4d4d-b9c5-a0375a6d6fc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f582f525-2e92-4576-ad3f-255e6b24deba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '59e8135d73ee43e088ba5ee7d9bd84b1', 'user_name': None, 'project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'project_name': None, 'resource_id': 'instance-00000007-a20c354d-a1af-4fad-958f-59623ebe4437-tap5562a861-2a', 'timestamp': '2025-10-02T12:01:19.924589', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1982637812', 'name': 'tap5562a861-2a', 'instance_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'instance_type': 'm1.nano', 'host': 'adedd6a44477702316f99b636ea8bec905bcfd19615eaec35e464679', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:09:db:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5562a861-2a'}, 'message_id': '8204f146-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4473.561806407, 'message_signature': '9b539e070b0857d63eda231386760aa1b4074feb5bb351754b3c9327d5f2c0b7'}]}, 'timestamp': '2025-10-02 12:01:19.925653', '_unique_id': '7649d54785934ea6b87e779920959eb2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:01:19 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:01:19.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:02:57 np0005466012 nova_compute[192063]: 2025-10-02 12:02:57.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:57 np0005466012 podman[221553]: 2025-10-02 12:02:57.17139477 +0000 UTC m=+0.075787490 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:02:57 np0005466012 rsyslogd[1011]: imjournal: 1645 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.011 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.012 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.354 2 DEBUG oslo_concurrency.lockutils [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "a20c354d-a1af-4fad-958f-59623ebe4437" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.355 2 DEBUG oslo_concurrency.lockutils [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.355 2 DEBUG oslo_concurrency.lockutils [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.356 2 DEBUG oslo_concurrency.lockutils [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.356 2 DEBUG oslo_concurrency.lockutils [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.367 2 INFO nova.compute.manager [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Terminating instance#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.378 2 DEBUG nova.compute.manager [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:02:59 np0005466012 kernel: tap5562a861-2a (unregistering): left promiscuous mode
Oct  2 08:02:59 np0005466012 NetworkManager[51207]: <info>  [1759406579.4120] device (tap5562a861-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:02:59Z|00052|binding|INFO|Releasing lport 5562a861-2a3e-4411-8aaa-be6dde7a658a from this chassis (sb_readonly=0)
Oct  2 08:02:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:02:59Z|00053|binding|INFO|Setting lport 5562a861-2a3e-4411-8aaa-be6dde7a658a down in Southbound
Oct  2 08:02:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:02:59Z|00054|binding|INFO|Removing iface tap5562a861-2a ovn-installed in OVS
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.434 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:db:7c 10.100.0.13'], port_security=['fa:16:3e:09:db:7c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a20c354d-a1af-4fad-958f-59623ebe4437', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-020b4768-a07a-4769-8636-455566c87083', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5cc73d75e0864e838eefa90cb33b7e01', 'neutron:revision_number': '23', 'neutron:security_group_ids': 'f3fadef5-4bfc-406c-93c4-14d4abd0583e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c0be75-bb4b-4e01-8cfa-b9aa4fcaf0e9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=5562a861-2a3e-4411-8aaa-be6dde7a658a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.437 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 5562a861-2a3e-4411-8aaa-be6dde7a658a in datapath 020b4768-a07a-4769-8636-455566c87083 unbound from our chassis#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.440 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 020b4768-a07a-4769-8636-455566c87083, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.442 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[14082644-24b5-471e-bd15-43b3c489f198]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.443 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-020b4768-a07a-4769-8636-455566c87083 namespace which is not needed anymore#033[00m
Oct  2 08:02:59 np0005466012 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Oct  2 08:02:59 np0005466012 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 5.068s CPU time.
Oct  2 08:02:59 np0005466012 systemd-machined[152114]: Machine qemu-4-instance-00000007 terminated.
Oct  2 08:02:59 np0005466012 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[220954]: [NOTICE]   (220958) : haproxy version is 2.8.14-c23fe91
Oct  2 08:02:59 np0005466012 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[220954]: [NOTICE]   (220958) : path to executable is /usr/sbin/haproxy
Oct  2 08:02:59 np0005466012 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[220954]: [WARNING]  (220958) : Exiting Master process...
Oct  2 08:02:59 np0005466012 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[220954]: [WARNING]  (220958) : Exiting Master process...
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.637 2 INFO nova.virt.libvirt.driver [-] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Instance destroyed successfully.#033[00m
Oct  2 08:02:59 np0005466012 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[220954]: [ALERT]    (220958) : Current worker (220960) exited with code 143 (Terminated)
Oct  2 08:02:59 np0005466012 neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083[220954]: [WARNING]  (220958) : All workers exited. Exiting... (0)
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.638 2 DEBUG nova.objects.instance [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lazy-loading 'resources' on Instance uuid a20c354d-a1af-4fad-958f-59623ebe4437 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:02:59 np0005466012 systemd[1]: libpod-05adcdebc6d6d33c4a72531759f445bfd99e9b49654c07aa8613bca65b7f3a22.scope: Deactivated successfully.
Oct  2 08:02:59 np0005466012 conmon[220954]: conmon 05adcdebc6d6d33c4a72 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-05adcdebc6d6d33c4a72531759f445bfd99e9b49654c07aa8613bca65b7f3a22.scope/container/memory.events
Oct  2 08:02:59 np0005466012 podman[221598]: 2025-10-02 12:02:59.649224116 +0000 UTC m=+0.084555262 container died 05adcdebc6d6d33c4a72531759f445bfd99e9b49654c07aa8613bca65b7f3a22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.654 2 DEBUG nova.virt.libvirt.vif [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:00:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1982637812',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1982637812',id=7,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:01:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5cc73d75e0864e838eefa90cb33b7e01',ramdisk_id='',reservation_id='r-bvhrjcj5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-984573444',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-984573444-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:01:59Z,user_data=None,user_id='59e8135d73ee43e088ba5ee7d9bd84b1',uuid=a20c354d-a1af-4fad-958f-59623ebe4437,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "address": "fa:16:3e:09:db:7c", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5562a861-2a", "ovs_interfaceid": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.654 2 DEBUG nova.network.os_vif_util [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Converting VIF {"id": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "address": "fa:16:3e:09:db:7c", "network": {"id": "020b4768-a07a-4769-8636-455566c87083", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-804372870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cc73d75e0864e838eefa90cb33b7e01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5562a861-2a", "ovs_interfaceid": "5562a861-2a3e-4411-8aaa-be6dde7a658a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.655 2 DEBUG nova.network.os_vif_util [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:09:db:7c,bridge_name='br-int',has_traffic_filtering=True,id=5562a861-2a3e-4411-8aaa-be6dde7a658a,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5562a861-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.656 2 DEBUG os_vif [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:db:7c,bridge_name='br-int',has_traffic_filtering=True,id=5562a861-2a3e-4411-8aaa-be6dde7a658a,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5562a861-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.658 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5562a861-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.664 2 INFO os_vif [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:db:7c,bridge_name='br-int',has_traffic_filtering=True,id=5562a861-2a3e-4411-8aaa-be6dde7a658a,network=Network(020b4768-a07a-4769-8636-455566c87083),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5562a861-2a')#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.665 2 INFO nova.virt.libvirt.driver [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Deleting instance files /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437_del#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.666 2 INFO nova.virt.libvirt.driver [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Deletion of /var/lib/nova/instances/a20c354d-a1af-4fad-958f-59623ebe4437_del complete#033[00m
Oct  2 08:02:59 np0005466012 systemd[1]: var-lib-containers-storage-overlay-708b1af0eee3eec90d1b37d962b8649b877cf640a2cb6b319724c57313c186f0-merged.mount: Deactivated successfully.
Oct  2 08:02:59 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-05adcdebc6d6d33c4a72531759f445bfd99e9b49654c07aa8613bca65b7f3a22-userdata-shm.mount: Deactivated successfully.
Oct  2 08:02:59 np0005466012 podman[221598]: 2025-10-02 12:02:59.727808709 +0000 UTC m=+0.163139845 container cleanup 05adcdebc6d6d33c4a72531759f445bfd99e9b49654c07aa8613bca65b7f3a22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:02:59 np0005466012 systemd[1]: libpod-conmon-05adcdebc6d6d33c4a72531759f445bfd99e9b49654c07aa8613bca65b7f3a22.scope: Deactivated successfully.
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.748 2 INFO nova.compute.manager [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.748 2 DEBUG oslo.service.loopingcall [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.749 2 DEBUG nova.compute.manager [-] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.749 2 DEBUG nova.network.neutron [-] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.817 2 DEBUG nova.compute.manager [req-9e8d3f36-e0ed-46b5-b5f5-12e651c82b37 req-98f22eca-58bd-4c04-a081-128dcdb7ac11 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Received event network-vif-unplugged-5562a861-2a3e-4411-8aaa-be6dde7a658a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.818 2 DEBUG oslo_concurrency.lockutils [req-9e8d3f36-e0ed-46b5-b5f5-12e651c82b37 req-98f22eca-58bd-4c04-a081-128dcdb7ac11 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.818 2 DEBUG oslo_concurrency.lockutils [req-9e8d3f36-e0ed-46b5-b5f5-12e651c82b37 req-98f22eca-58bd-4c04-a081-128dcdb7ac11 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.819 2 DEBUG oslo_concurrency.lockutils [req-9e8d3f36-e0ed-46b5-b5f5-12e651c82b37 req-98f22eca-58bd-4c04-a081-128dcdb7ac11 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.819 2 DEBUG nova.compute.manager [req-9e8d3f36-e0ed-46b5-b5f5-12e651c82b37 req-98f22eca-58bd-4c04-a081-128dcdb7ac11 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] No waiting events found dispatching network-vif-unplugged-5562a861-2a3e-4411-8aaa-be6dde7a658a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.819 2 DEBUG nova.compute.manager [req-9e8d3f36-e0ed-46b5-b5f5-12e651c82b37 req-98f22eca-58bd-4c04-a081-128dcdb7ac11 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Received event network-vif-unplugged-5562a861-2a3e-4411-8aaa-be6dde7a658a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:02:59 np0005466012 podman[221645]: 2025-10-02 12:02:59.820156097 +0000 UTC m=+0.069607475 container remove 05adcdebc6d6d33c4a72531759f445bfd99e9b49654c07aa8613bca65b7f3a22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.827 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[578b3f24-c27d-4c16-bba7-63a3019845c4]: (4, ('Thu Oct  2 12:02:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083 (05adcdebc6d6d33c4a72531759f445bfd99e9b49654c07aa8613bca65b7f3a22)\n05adcdebc6d6d33c4a72531759f445bfd99e9b49654c07aa8613bca65b7f3a22\nThu Oct  2 12:02:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-020b4768-a07a-4769-8636-455566c87083 (05adcdebc6d6d33c4a72531759f445bfd99e9b49654c07aa8613bca65b7f3a22)\n05adcdebc6d6d33c4a72531759f445bfd99e9b49654c07aa8613bca65b7f3a22\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.829 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[daee7826-9d38-49c9-a540-005a3c3445c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.830 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap020b4768-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:02:59 np0005466012 kernel: tap020b4768-a0: left promiscuous mode
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.838 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[900e592d-d468-4fa5-8ee8-fc3a5a912c07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:59 np0005466012 nova_compute[192063]: 2025-10-02 12:02:59.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.871 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c309c24d-b402-4189-b320-0c7c1085a243]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.872 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[11ef4952-09d3-4d04-b116-8ea30f87d755]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.883 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fb0a82cb-6f56-40c1-b531-a34aa10ba51e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450756, 'reachable_time': 15854, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221661, 'error': None, 'target': 'ovnmeta-020b4768-a07a-4769-8636-455566c87083', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.886 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-020b4768-a07a-4769-8636-455566c87083 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:02:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:02:59.887 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[629984ed-695f-4a4d-96b6-946d921be4bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:02:59 np0005466012 systemd[1]: run-netns-ovnmeta\x2d020b4768\x2da07a\x2d4769\x2d8636\x2d455566c87083.mount: Deactivated successfully.
Oct  2 08:03:00 np0005466012 nova_compute[192063]: 2025-10-02 12:03:00.470 2 DEBUG nova.network.neutron [-] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:03:00 np0005466012 nova_compute[192063]: 2025-10-02 12:03:00.519 2 INFO nova.compute.manager [-] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Took 0.77 seconds to deallocate network for instance.#033[00m
Oct  2 08:03:00 np0005466012 nova_compute[192063]: 2025-10-02 12:03:00.565 2 DEBUG nova.compute.manager [req-8338903c-1b9e-4c2a-a951-907c72528208 req-5a4c972d-5428-4119-96d2-c46d9f358968 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Received event network-vif-deleted-5562a861-2a3e-4411-8aaa-be6dde7a658a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:03:00 np0005466012 nova_compute[192063]: 2025-10-02 12:03:00.648 2 DEBUG oslo_concurrency.lockutils [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:00 np0005466012 nova_compute[192063]: 2025-10-02 12:03:00.649 2 DEBUG oslo_concurrency.lockutils [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:00 np0005466012 nova_compute[192063]: 2025-10-02 12:03:00.726 2 DEBUG nova.compute.provider_tree [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:03:00 np0005466012 nova_compute[192063]: 2025-10-02 12:03:00.743 2 DEBUG nova.scheduler.client.report [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:03:00 np0005466012 nova_compute[192063]: 2025-10-02 12:03:00.761 2 DEBUG oslo_concurrency.lockutils [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:00 np0005466012 nova_compute[192063]: 2025-10-02 12:03:00.785 2 INFO nova.scheduler.client.report [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Deleted allocations for instance a20c354d-a1af-4fad-958f-59623ebe4437#033[00m
Oct  2 08:03:00 np0005466012 nova_compute[192063]: 2025-10-02 12:03:00.859 2 DEBUG oslo_concurrency.lockutils [None req-195fd6db-7e5d-4dee-9ef3-7bc06371630b 59e8135d73ee43e088ba5ee7d9bd84b1 5cc73d75e0864e838eefa90cb33b7e01 - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.504s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:01 np0005466012 nova_compute[192063]: 2025-10-02 12:03:01.154 2 DEBUG nova.virt.libvirt.driver [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:03:01 np0005466012 nova_compute[192063]: 2025-10-02 12:03:01.988 2 DEBUG nova.compute.manager [req-85b491df-93dd-4606-889d-b36da1c1d764 req-38ddbf21-71c1-4207-9668-5c14f4400034 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Received event network-vif-plugged-5562a861-2a3e-4411-8aaa-be6dde7a658a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:03:01 np0005466012 nova_compute[192063]: 2025-10-02 12:03:01.989 2 DEBUG oslo_concurrency.lockutils [req-85b491df-93dd-4606-889d-b36da1c1d764 req-38ddbf21-71c1-4207-9668-5c14f4400034 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:01 np0005466012 nova_compute[192063]: 2025-10-02 12:03:01.989 2 DEBUG oslo_concurrency.lockutils [req-85b491df-93dd-4606-889d-b36da1c1d764 req-38ddbf21-71c1-4207-9668-5c14f4400034 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:01 np0005466012 nova_compute[192063]: 2025-10-02 12:03:01.989 2 DEBUG oslo_concurrency.lockutils [req-85b491df-93dd-4606-889d-b36da1c1d764 req-38ddbf21-71c1-4207-9668-5c14f4400034 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a20c354d-a1af-4fad-958f-59623ebe4437-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:01 np0005466012 nova_compute[192063]: 2025-10-02 12:03:01.989 2 DEBUG nova.compute.manager [req-85b491df-93dd-4606-889d-b36da1c1d764 req-38ddbf21-71c1-4207-9668-5c14f4400034 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] No waiting events found dispatching network-vif-plugged-5562a861-2a3e-4411-8aaa-be6dde7a658a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:03:01 np0005466012 nova_compute[192063]: 2025-10-02 12:03:01.989 2 WARNING nova.compute.manager [req-85b491df-93dd-4606-889d-b36da1c1d764 req-38ddbf21-71c1-4207-9668-5c14f4400034 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Received unexpected event network-vif-plugged-5562a861-2a3e-4411-8aaa-be6dde7a658a for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:03:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:03:02.110 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:03:02.111 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:03:02.111 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:02 np0005466012 podman[221662]: 2025-10-02 12:03:02.17108828 +0000 UTC m=+0.085086297 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:03:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:03:03.014 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:03:03 np0005466012 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Oct  2 08:03:03 np0005466012 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000d.scope: Consumed 13.039s CPU time.
Oct  2 08:03:03 np0005466012 systemd-machined[152114]: Machine qemu-8-instance-0000000d terminated.
Oct  2 08:03:03 np0005466012 nova_compute[192063]: 2025-10-02 12:03:03.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:04 np0005466012 nova_compute[192063]: 2025-10-02 12:03:04.169 2 INFO nova.virt.libvirt.driver [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:03:04 np0005466012 nova_compute[192063]: 2025-10-02 12:03:04.176 2 INFO nova.virt.libvirt.driver [-] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Instance destroyed successfully.#033[00m
Oct  2 08:03:04 np0005466012 nova_compute[192063]: 2025-10-02 12:03:04.179 2 DEBUG oslo_concurrency.processutils [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:04 np0005466012 nova_compute[192063]: 2025-10-02 12:03:04.270 2 DEBUG oslo_concurrency.processutils [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:04 np0005466012 nova_compute[192063]: 2025-10-02 12:03:04.272 2 DEBUG oslo_concurrency.processutils [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:04 np0005466012 nova_compute[192063]: 2025-10-02 12:03:04.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:04 np0005466012 nova_compute[192063]: 2025-10-02 12:03:04.334 2 DEBUG oslo_concurrency.processutils [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:04 np0005466012 nova_compute[192063]: 2025-10-02 12:03:04.336 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Copying file /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9_resize/disk to 192.168.122.100:/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:03:04 np0005466012 nova_compute[192063]: 2025-10-02 12:03:04.337 2 DEBUG oslo_concurrency.processutils [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9_resize/disk 192.168.122.100:/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:04 np0005466012 nova_compute[192063]: 2025-10-02 12:03:04.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:05 np0005466012 podman[221699]: 2025-10-02 12:03:05.162926562 +0000 UTC m=+0.068880527 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=edpm, version=9.6, maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  2 08:03:05 np0005466012 nova_compute[192063]: 2025-10-02 12:03:05.175 2 DEBUG oslo_concurrency.processutils [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "scp -r /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9_resize/disk 192.168.122.100:/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk" returned: 0 in 0.839s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:05 np0005466012 nova_compute[192063]: 2025-10-02 12:03:05.176 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Copying file /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:03:05 np0005466012 nova_compute[192063]: 2025-10-02 12:03:05.177 2 DEBUG oslo_concurrency.processutils [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9_resize/disk.config 192.168.122.100:/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:05 np0005466012 nova_compute[192063]: 2025-10-02 12:03:05.455 2 DEBUG oslo_concurrency.processutils [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "scp -C -r /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9_resize/disk.config 192.168.122.100:/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk.config" returned: 0 in 0.278s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:05 np0005466012 nova_compute[192063]: 2025-10-02 12:03:05.456 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Copying file /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:03:05 np0005466012 nova_compute[192063]: 2025-10-02 12:03:05.457 2 DEBUG oslo_concurrency.processutils [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9_resize/disk.info 192.168.122.100:/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:05 np0005466012 nova_compute[192063]: 2025-10-02 12:03:05.705 2 DEBUG oslo_concurrency.processutils [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] CMD "scp -C -r /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9_resize/disk.info 192.168.122.100:/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk.info" returned: 0 in 0.248s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:05 np0005466012 nova_compute[192063]: 2025-10-02 12:03:05.855 2 DEBUG oslo_concurrency.lockutils [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Acquiring lock "73cd9aef-a159-4d0e-9fc4-435f191db0b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:05 np0005466012 nova_compute[192063]: 2025-10-02 12:03:05.857 2 DEBUG oslo_concurrency.lockutils [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lock "73cd9aef-a159-4d0e-9fc4-435f191db0b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:05 np0005466012 nova_compute[192063]: 2025-10-02 12:03:05.857 2 DEBUG oslo_concurrency.lockutils [None req-5199b18d-0a27-4885-9422-69d2bb74ff61 a9d1adde5fcb4d3bab833619b44f7a7c 71d9b13feff24ebd81a067d702973a51 - - default default] Lock "73cd9aef-a159-4d0e-9fc4-435f191db0b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:07 np0005466012 podman[221724]: 2025-10-02 12:03:07.159449029 +0000 UTC m=+0.065043895 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:03:07 np0005466012 podman[221723]: 2025-10-02 12:03:07.162720135 +0000 UTC m=+0.079840867 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:03:07 np0005466012 nova_compute[192063]: 2025-10-02 12:03:07.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:07 np0005466012 nova_compute[192063]: 2025-10-02 12:03:07.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:07 np0005466012 nova_compute[192063]: 2025-10-02 12:03:07.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:09 np0005466012 nova_compute[192063]: 2025-10-02 12:03:09.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:09 np0005466012 nova_compute[192063]: 2025-10-02 12:03:09.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:09 np0005466012 nova_compute[192063]: 2025-10-02 12:03:09.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:09 np0005466012 nova_compute[192063]: 2025-10-02 12:03:09.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:09 np0005466012 nova_compute[192063]: 2025-10-02 12:03:09.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:09 np0005466012 nova_compute[192063]: 2025-10-02 12:03:09.848 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:09 np0005466012 nova_compute[192063]: 2025-10-02 12:03:09.849 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:09 np0005466012 nova_compute[192063]: 2025-10-02 12:03:09.849 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:09 np0005466012 nova_compute[192063]: 2025-10-02 12:03:09.850 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:03:09 np0005466012 nova_compute[192063]: 2025-10-02 12:03:09.939 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-0000000d, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk#033[00m
Oct  2 08:03:09 np0005466012 nova_compute[192063]: 2025-10-02 12:03:09.943 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/51680613-d2fe-44ac-8f89-432ec74a9541/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.018 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/51680613-d2fe-44ac-8f89-432ec74a9541/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.019 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/51680613-d2fe-44ac-8f89-432ec74a9541/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.081 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/51680613-d2fe-44ac-8f89-432ec74a9541/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.224 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.226 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5572MB free_disk=73.41067123413086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.226 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.226 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.310 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 51680613-d2fe-44ac-8f89-432ec74a9541 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.339 2 INFO nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance f6996b59-9ec1-4f50-847f-7511c618a4bb has allocations against this compute host but is not found in the database.#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.340 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.340 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.420 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.435 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.461 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.462 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.495 2 INFO nova.compute.manager [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Swapping old allocation on dict_keys(['ddb6f967-9a8a-4554-9b44-b99536054f9c']) held by migration f6996b59-9ec1-4f50-847f-7511c618a4bb for instance#033[00m
Oct  2 08:03:10 np0005466012 nova_compute[192063]: 2025-10-02 12:03:10.529 2 DEBUG nova.scheduler.client.report [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Overwriting current allocation {'allocations': {'55f2ae21-42ea-47d7-8c73-c3134981d708': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, 'generation': 19}}, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'consumer_generation': 1} on consumer 73cd9aef-a159-4d0e-9fc4-435f191db0b9 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.099 2 DEBUG oslo_concurrency.lockutils [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "refresh_cache-73cd9aef-a159-4d0e-9fc4-435f191db0b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.099 2 DEBUG oslo_concurrency.lockutils [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquired lock "refresh_cache-73cd9aef-a159-4d0e-9fc4-435f191db0b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.100 2 DEBUG nova.network.neutron [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.373 2 DEBUG nova.network.neutron [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.461 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.461 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.462 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.708 2 DEBUG nova.network.neutron [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.728 2 DEBUG oslo_concurrency.lockutils [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Releasing lock "refresh_cache-73cd9aef-a159-4d0e-9fc4-435f191db0b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.729 2 DEBUG nova.virt.libvirt.driver [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.734 2 DEBUG nova.virt.libvirt.driver [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.737 2 WARNING nova.virt.libvirt.driver [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.740 2 DEBUG nova.virt.libvirt.host [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.741 2 DEBUG nova.virt.libvirt.host [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.743 2 DEBUG nova.virt.libvirt.host [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.744 2 DEBUG nova.virt.libvirt.host [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.744 2 DEBUG nova.virt.libvirt.driver [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.745 2 DEBUG nova.virt.hardware [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.745 2 DEBUG nova.virt.hardware [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.745 2 DEBUG nova.virt.hardware [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.745 2 DEBUG nova.virt.hardware [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.746 2 DEBUG nova.virt.hardware [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.746 2 DEBUG nova.virt.hardware [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.747 2 DEBUG nova.virt.hardware [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.748 2 DEBUG nova.virt.hardware [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.748 2 DEBUG nova.virt.hardware [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.748 2 DEBUG nova.virt.hardware [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.748 2 DEBUG nova.virt.hardware [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.749 2 DEBUG nova.objects.instance [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 73cd9aef-a159-4d0e-9fc4-435f191db0b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.820 2 DEBUG oslo_concurrency.processutils [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.874 2 DEBUG oslo_concurrency.processutils [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk.config --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.875 2 DEBUG oslo_concurrency.lockutils [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.875 2 DEBUG oslo_concurrency.lockutils [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.876 2 DEBUG oslo_concurrency.lockutils [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:03:11 np0005466012 nova_compute[192063]: 2025-10-02 12:03:11.879 2 DEBUG nova.virt.libvirt.driver [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  <uuid>73cd9aef-a159-4d0e-9fc4-435f191db0b9</uuid>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  <name>instance-0000000d</name>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <nova:name>tempest-MigrationsAdminTest-server-1487341678</nova:name>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:03:11</nova:creationTime>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:03:11 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:        <nova:user uuid="8da35688aa864e189f10b334a21bc6c4">tempest-MigrationsAdminTest-1651504538-project-member</nova:user>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:        <nova:project uuid="4dcc6c51db2640cbb04083b3336de813">tempest-MigrationsAdminTest-1651504538</nova:project>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <nova:ports/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <entry name="serial">73cd9aef-a159-4d0e-9fc4-435f191db0b9</entry>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <entry name="uuid">73cd9aef-a159-4d0e-9fc4-435f191db0b9</entry>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/disk.config"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9/console.log" append="off"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <input type="keyboard" bus="usb"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:03:11 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:03:11 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:03:11 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:03:11 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  2 08:03:11 np0005466012 systemd-machined[152114]: New machine qemu-9-instance-0000000d.
Oct  2 08:03:11 np0005466012 systemd[1]: Started Virtual Machine qemu-9-instance-0000000d.
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.733 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for 73cd9aef-a159-4d0e-9fc4-435f191db0b9 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.734 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406592.7329934, 73cd9aef-a159-4d0e-9fc4-435f191db0b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.734 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] VM Resumed (Lifecycle Event)
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.736 2 DEBUG nova.compute.manager [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.740 2 INFO nova.virt.libvirt.driver [-] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Instance running successfully.
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.742 2 DEBUG nova.virt.libvirt.driver [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.754 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.758 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.791 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.791 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406592.7345, 73cd9aef-a159-4d0e-9fc4-435f191db0b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.792 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] VM Started (Lifecycle Event)
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.813 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.818 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.885 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.891 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Oct  2 08:03:12 np0005466012 nova_compute[192063]: 2025-10-02 12:03:12.894 2 INFO nova.compute.manager [None req-1bec2035-43e5-47a2-8ad9-4d711e23d226 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Updating instance to original state: 'active'
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.439 2 DEBUG oslo_concurrency.lockutils [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "73cd9aef-a159-4d0e-9fc4-435f191db0b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.439 2 DEBUG oslo_concurrency.lockutils [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "73cd9aef-a159-4d0e-9fc4-435f191db0b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.440 2 DEBUG oslo_concurrency.lockutils [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "73cd9aef-a159-4d0e-9fc4-435f191db0b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.440 2 DEBUG oslo_concurrency.lockutils [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "73cd9aef-a159-4d0e-9fc4-435f191db0b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.440 2 DEBUG oslo_concurrency.lockutils [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "73cd9aef-a159-4d0e-9fc4-435f191db0b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.451 2 INFO nova.compute.manager [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Terminating instance
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.464 2 DEBUG oslo_concurrency.lockutils [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "refresh_cache-73cd9aef-a159-4d0e-9fc4-435f191db0b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.465 2 DEBUG oslo_concurrency.lockutils [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquired lock "refresh_cache-73cd9aef-a159-4d0e-9fc4-435f191db0b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.465 2 DEBUG nova.network.neutron [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.635 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406579.6338568, a20c354d-a1af-4fad-958f-59623ebe4437 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.635 2 INFO nova.compute.manager [-] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] VM Stopped (Lifecycle Event)
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.652 2 DEBUG nova.compute.manager [None req-b91ea1d2-fe1e-4f17-8ef9-a1e9ff958ef5 - - - - - -] [instance: a20c354d-a1af-4fad-958f-59623ebe4437] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:03:14 np0005466012 nova_compute[192063]: 2025-10-02 12:03:14.932 2 DEBUG nova.network.neutron [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.339 2 DEBUG nova.network.neutron [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.362 2 DEBUG oslo_concurrency.lockutils [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Releasing lock "refresh_cache-73cd9aef-a159-4d0e-9fc4-435f191db0b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.364 2 DEBUG nova.compute.manager [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:03:15 np0005466012 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Oct  2 08:03:15 np0005466012 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Consumed 3.436s CPU time.
Oct  2 08:03:15 np0005466012 systemd-machined[152114]: Machine qemu-9-instance-0000000d terminated.
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.615 2 INFO nova.virt.libvirt.driver [-] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Instance destroyed successfully.
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.616 2 DEBUG nova.objects.instance [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lazy-loading 'resources' on Instance uuid 73cd9aef-a159-4d0e-9fc4-435f191db0b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.640 2 INFO nova.virt.libvirt.driver [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Deleting instance files /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9_del
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.647 2 INFO nova.virt.libvirt.driver [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Deletion of /var/lib/nova/instances/73cd9aef-a159-4d0e-9fc4-435f191db0b9_del complete
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.728 2 INFO nova.compute.manager [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Took 0.36 seconds to destroy the instance on the hypervisor.
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.729 2 DEBUG oslo.service.loopingcall [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.729 2 DEBUG nova.compute.manager [-] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.730 2 DEBUG nova.network.neutron [-] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.944 2 DEBUG nova.network.neutron [-] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.956 2 DEBUG nova.network.neutron [-] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:03:15 np0005466012 nova_compute[192063]: 2025-10-02 12:03:15.969 2 INFO nova.compute.manager [-] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Took 0.24 seconds to deallocate network for instance.
Oct  2 08:03:16 np0005466012 nova_compute[192063]: 2025-10-02 12:03:16.052 2 DEBUG oslo_concurrency.lockutils [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:16 np0005466012 nova_compute[192063]: 2025-10-02 12:03:16.053 2 DEBUG oslo_concurrency.lockutils [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:16 np0005466012 nova_compute[192063]: 2025-10-02 12:03:16.059 2 DEBUG oslo_concurrency.lockutils [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:03:16 np0005466012 nova_compute[192063]: 2025-10-02 12:03:16.097 2 INFO nova.scheduler.client.report [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Deleted allocations for instance 73cd9aef-a159-4d0e-9fc4-435f191db0b9
Oct  2 08:03:16 np0005466012 nova_compute[192063]: 2025-10-02 12:03:16.182 2 DEBUG oslo_concurrency.lockutils [None req-2ffbfffa-b873-43bc-9841-66a6a57d3b16 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "73cd9aef-a159-4d0e-9fc4-435f191db0b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.742s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:16.918 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}990e323c24a4c2b6ffa88e6c79986dae5631c4ba5e226d4603b05142d54e6d41" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.070 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Thu, 02 Oct 2025 12:03:16 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-1ad37216-2ad2-404c-827e-4115b1d347fa x-openstack-request-id: req-1ad37216-2ad2-404c-827e-4115b1d347fa _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.070 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "9949d9da-6314-4ede-8797-6f2f0a6a64fc", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/9949d9da-6314-4ede-8797-6f2f0a6a64fc"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9949d9da-6314-4ede-8797-6f2f0a6a64fc"}]}, {"id": "9ac83da7-f31e-4467-8569-d28002f6aeed", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/9ac83da7-f31e-4467-8569-d28002f6aeed"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9ac83da7-f31e-4467-8569-d28002f6aeed"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.070 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-1ad37216-2ad2-404c-827e-4115b1d347fa request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.072 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'name': 'tempest-MigrationsAdminTest-server-1929616673', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000a', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '4dcc6c51db2640cbb04083b3336de813', 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'hostId': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.072 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.075 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.098 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.write.requests volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.099 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05f591d7-548c-4239-85ad-00b08ee886a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 28, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-vda', 'timestamp': '2025-10-02T12:03:17.076284', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c7dc5f42-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.752687603, 'message_signature': '1bdc1eaf286293282386aa988687e163aa5320a3d2dfeaa08c878cf83f7a24ea'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 
'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-sda', 'timestamp': '2025-10-02T12:03:17.076284', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c7dc711c-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.752687603, 'message_signature': '4c38f88cd4fb8b41c09d3e3a0caebc2152d00ec86eb7ab0e37628524a4d545fc'}]}, 'timestamp': '2025-10-02 12:03:17.100121', '_unique_id': '467f673660ce4f3d9cd1f261b33b8bc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.102 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.104 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.read.requests volume: 1217 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.104 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4049f27-0064-4b73-8e9a-77e51fb60649', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1217, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-vda', 'timestamp': '2025-10-02T12:03:17.104174', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c7dd1c48-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.752687603, 'message_signature': 'f3b347346e8a320ae2661f46042a5cda6dd8647cc3f3f0cffb46eda74b69204e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 
'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-sda', 'timestamp': '2025-10-02T12:03:17.104174', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c7dd27e2-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.752687603, 'message_signature': 'c90ba10f77a60a48687e068cbbc8a233f71c9e7ad505f203e5916b3a3f5619a1'}]}, 'timestamp': '2025-10-02 12:03:17.104772', '_unique_id': '03d1a349f1804521a9ec63fd36c38be4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.106 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.106 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1929616673>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1929616673>]
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.106 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.read.bytes volume: 32065536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.106 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40b5f273-f4a1-483d-926d-955caddac4b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32065536, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-vda', 'timestamp': '2025-10-02T12:03:17.106437', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c7dd7288-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.752687603, 'message_signature': 'da1c2dc48fd03c839e2499b17753e7690b600c874483ee3acbea1a6cd14ad10e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-sda', 'timestamp': '2025-10-02T12:03:17.106437', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c7dd7abc-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.752687603, 'message_signature': '1849444bdcaa0d49d9cb5bcd69e5dfa06a38db6afa24024b4364ad018614288c'}]}, 'timestamp': '2025-10-02 12:03:17.106850', '_unique_id': '313a830cda264c30a8ecc76eb2f95c7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.107 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.125 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/cpu volume: 11570000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4fc3a67-1ebb-4ec0-95eb-fe9f58671508', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11570000000, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'timestamp': '2025-10-02T12:03:17.108004', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'c7e06132-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.801614211, 'message_signature': '17ebde40d9cfe9d0cebe7a4e97d6cd71749e819ee01d860b4c512327651debdb'}]}, 'timestamp': '2025-10-02 12:03:17.125955', '_unique_id': '521b2b4bbfb24b8c81e7bfa7f439ccda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.126 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.128 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.128 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/memory.usage volume: 40.76953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2337b729-3207-4a02-8376-329842ae1645', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.76953125, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'timestamp': '2025-10-02T12:03:17.128309', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'c7e0ca8c-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.801614211, 'message_signature': 'cb0eee2df20952fe2fd9fe86cd9aa8193aa5b6d9190ffbc316969fd3dadcfda7'}]}, 'timestamp': '2025-10-02 12:03:17.128605', '_unique_id': 'f41f892ccb4946938746de8fd385280b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.129 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.130 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.read.latency volume: 834088656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.130 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.read.latency volume: 43942128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6aac6610-8b85-484a-9e0e-bca799229f5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 834088656, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-vda', 'timestamp': '2025-10-02T12:03:17.130048', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c7e10ee8-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.752687603, 'message_signature': 'd868cc2b8dfa23084efd92a31f7e8242954b6c915ce00fb11d06bb9bc70d9bf5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43942128, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-sda', 'timestamp': '2025-10-02T12:03:17.130048', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c7e117f8-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.752687603, 'message_signature': 'f8c299f76884c42582c4c1c66ef232a1e25cf49313c98efd90b23d17cb148f17'}]}, 'timestamp': '2025-10-02 12:03:17.130538', '_unique_id': '6603f23891a94af99605e7eda0bdeb05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.140 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.141 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8450603-fee6-4b84-975e-d88051410d0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-vda', 'timestamp': '2025-10-02T12:03:17.131985', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c7e2c292-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.808359919, 'message_signature': '5c5a43b936ff445407bdf0f903b5f7c4f0073d60bdb8ee188510a33e2f6a8fae'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-sda', 'timestamp': '2025-10-02T12:03:17.131985', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c7e2d106-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.808359919, 'message_signature': '987d616096f6383057957c53a3f0e2ac07c6f85a4a8322c6b6edfd4e8ca4c7bc'}]}, 'timestamp': '2025-10-02 12:03:17.141844', '_unique_id': '4f887016f1fd48f99255107aa09356b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.142 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.144 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.144 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1929616673>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1929616673>]
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.144 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.144 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1929616673>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1929616673>]
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.144 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.145 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.145 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1929616673>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1929616673>]
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.145 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.145 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.145 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98ae59c9-408f-42b4-8493-6403df9bb696', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-vda', 'timestamp': '2025-10-02T12:03:17.145342', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c7e36404-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.808359919, 'message_signature': '77afe5efefc7fe0a559a9dd7b06fdeb7873d2b56481862fed34d39e8fc312a69'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': 
'4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-sda', 'timestamp': '2025-10-02T12:03:17.145342', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c7e36f44-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.808359919, 'message_signature': 'fc959d810020b267772d31da70049c0ab474c491302aa016dbcc466e8c816603'}]}, 'timestamp': '2025-10-02 12:03:17.145886', '_unique_id': '1e9dcba6cc7440e8857ad5de7195a5df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.146 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.147 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.write.latency volume: 36933024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.147 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec1c356f-c729-41c2-b3c3-6b1f9b2cb976', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 36933024, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-vda', 'timestamp': '2025-10-02T12:03:17.147556', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c7e3bf26-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.752687603, 'message_signature': 'fe4d2dfeb48e1b929d6ab29606e1049d97d86a9cce590783b8671d36d12bd533'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': 
'4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-sda', 'timestamp': '2025-10-02T12:03:17.147556', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c7e3c944-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.752687603, 'message_signature': 'c33c0fc4bfb13434fd749ed49a6be121efaea3c7f7f6e2cdeff37e1e9d639ff0'}]}, 'timestamp': '2025-10-02 12:03:17.148189', '_unique_id': '4726c4bccf0a4ec19f801023efe62930'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.148 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.149 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.149 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fd9536b-37b1-4fdf-9794-94adf23929c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-vda', 'timestamp': '2025-10-02T12:03:17.149839', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c7e413c2-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.808359919, 'message_signature': '5dcee8cabaf37f55ced2d0b7127a40bcbfb12ac4e10b590c11e7d3ea1205ea32'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': 
'4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-sda', 'timestamp': '2025-10-02T12:03:17.149839', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c7e41ec6-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.808359919, 'message_signature': '7d5aecca1149631dcc8e3946cb8655190ec964e96374d335a08470348fad7254'}]}, 'timestamp': '2025-10-02 12:03:17.150411', '_unique_id': '41c8aea069464037b64b83c05f14f028'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.150 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.151 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.152 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.write.bytes volume: 258048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.152 12 DEBUG ceilometer.compute.pollsters [-] 51680613-d2fe-44ac-8f89-432ec74a9541/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'adb33b74-f3ab-4a4d-aea2-009b570ad69e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 258048, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': '4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-vda', 'timestamp': '2025-10-02T12:03:17.152071', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c7e46cfa-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.752687603, 'message_signature': 'cd6239f4f82f3d15aed4bd2f3f302e2bd4215db15408023dbdc70f97845630d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8da35688aa864e189f10b334a21bc6c4', 'user_name': None, 'project_id': 
'4dcc6c51db2640cbb04083b3336de813', 'project_name': None, 'resource_id': '51680613-d2fe-44ac-8f89-432ec74a9541-sda', 'timestamp': '2025-10-02T12:03:17.152071', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1929616673', 'name': 'instance-0000000a', 'instance_id': '51680613-d2fe-44ac-8f89-432ec74a9541', 'instance_type': 'tempest-test_resize_flavor_-653413138', 'host': '8d4f059ea5030ee2aaea140b7e158feb6b6bbb1b85321e03f7f79bb0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-653413138', 'name': 'tempest-test_resize_flavor_-653413138', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c7e47704-9f87-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4590.752687603, 'message_signature': 'eb9b321bd1bb55b58a2deead8a079093523ae0609f21367e092fac6e260832ef'}]}, 'timestamp': '2025-10-02 12:03:17.152633', '_unique_id': '03786e001ff046f59d14bf5cdfd2b2af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:03:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:03:17.153 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:03:17 np0005466012 nova_compute[192063]: 2025-10-02 12:03:17.851 2 DEBUG oslo_concurrency.lockutils [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "51680613-d2fe-44ac-8f89-432ec74a9541" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:17 np0005466012 nova_compute[192063]: 2025-10-02 12:03:17.851 2 DEBUG oslo_concurrency.lockutils [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "51680613-d2fe-44ac-8f89-432ec74a9541" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:17 np0005466012 nova_compute[192063]: 2025-10-02 12:03:17.851 2 DEBUG oslo_concurrency.lockutils [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "51680613-d2fe-44ac-8f89-432ec74a9541-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:17 np0005466012 nova_compute[192063]: 2025-10-02 12:03:17.852 2 DEBUG oslo_concurrency.lockutils [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "51680613-d2fe-44ac-8f89-432ec74a9541-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:17 np0005466012 nova_compute[192063]: 2025-10-02 12:03:17.852 2 DEBUG oslo_concurrency.lockutils [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "51680613-d2fe-44ac-8f89-432ec74a9541-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:17 np0005466012 nova_compute[192063]: 2025-10-02 12:03:17.867 2 INFO nova.compute.manager [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Terminating instance#033[00m
Oct  2 08:03:17 np0005466012 nova_compute[192063]: 2025-10-02 12:03:17.877 2 DEBUG oslo_concurrency.lockutils [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "refresh_cache-51680613-d2fe-44ac-8f89-432ec74a9541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:03:17 np0005466012 nova_compute[192063]: 2025-10-02 12:03:17.878 2 DEBUG oslo_concurrency.lockutils [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquired lock "refresh_cache-51680613-d2fe-44ac-8f89-432ec74a9541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:03:17 np0005466012 nova_compute[192063]: 2025-10-02 12:03:17.878 2 DEBUG nova.network.neutron [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.044 2 DEBUG nova.network.neutron [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.355 2 DEBUG nova.network.neutron [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.369 2 DEBUG oslo_concurrency.lockutils [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Releasing lock "refresh_cache-51680613-d2fe-44ac-8f89-432ec74a9541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.370 2 DEBUG nova.compute.manager [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:03:18 np0005466012 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Oct  2 08:03:18 np0005466012 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Consumed 14.680s CPU time.
Oct  2 08:03:18 np0005466012 systemd-machined[152114]: Machine qemu-7-instance-0000000a terminated.
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.612 2 INFO nova.virt.libvirt.driver [-] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Instance destroyed successfully.#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.613 2 DEBUG nova.objects.instance [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lazy-loading 'resources' on Instance uuid 51680613-d2fe-44ac-8f89-432ec74a9541 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.630 2 INFO nova.virt.libvirt.driver [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Deleting instance files /var/lib/nova/instances/51680613-d2fe-44ac-8f89-432ec74a9541_del#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.630 2 INFO nova.virt.libvirt.driver [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Deletion of /var/lib/nova/instances/51680613-d2fe-44ac-8f89-432ec74a9541_del complete#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.701 2 INFO nova.compute.manager [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.701 2 DEBUG oslo.service.loopingcall [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.701 2 DEBUG nova.compute.manager [-] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.701 2 DEBUG nova.network.neutron [-] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.938 2 DEBUG nova.network.neutron [-] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.952 2 DEBUG nova.network.neutron [-] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:03:18 np0005466012 nova_compute[192063]: 2025-10-02 12:03:18.967 2 INFO nova.compute.manager [-] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Took 0.27 seconds to deallocate network for instance.#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.046 2 DEBUG oslo_concurrency.lockutils [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.046 2 DEBUG oslo_concurrency.lockutils [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.088 2 DEBUG nova.compute.provider_tree [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.105 2 DEBUG nova.scheduler.client.report [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.124 2 DEBUG oslo_concurrency.lockutils [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.159 2 INFO nova.scheduler.client.report [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Deleted allocations for instance 51680613-d2fe-44ac-8f89-432ec74a9541#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.225 2 DEBUG oslo_concurrency.lockutils [None req-82c623f9-5904-4dca-b369-f925db912aee 8da35688aa864e189f10b334a21bc6c4 4dcc6c51db2640cbb04083b3336de813 - - default default] Lock "51680613-d2fe-44ac-8f89-432ec74a9541" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.851 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "99a100e5-2387-482b-91a2-55833b357c26" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.852 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "99a100e5-2387-482b-91a2-55833b357c26" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.884 2 DEBUG nova.compute.manager [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.980 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.981 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.989 2 DEBUG nova.virt.hardware [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:03:19 np0005466012 nova_compute[192063]: 2025-10-02 12:03:19.990 2 INFO nova.compute.claims [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.149 2 DEBUG nova.compute.provider_tree [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.171 2 DEBUG nova.scheduler.client.report [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:03:20 np0005466012 podman[221820]: 2025-10-02 12:03:20.19174991 +0000 UTC m=+0.106471803 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.195 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.196 2 DEBUG nova.compute.manager [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.243 2 DEBUG nova.compute.manager [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.243 2 DEBUG nova.network.neutron [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.276 2 INFO nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.309 2 DEBUG nova.compute.manager [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.466 2 DEBUG nova.compute.manager [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.467 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.468 2 INFO nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Creating image(s)#033[00m
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.468 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "/var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.468 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "/var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.469 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "/var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.483 2 DEBUG oslo_concurrency.processutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.560 2 DEBUG oslo_concurrency.processutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.561 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.562 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.573 2 DEBUG oslo_concurrency.processutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.627 2 DEBUG oslo_concurrency.processutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.628 2 DEBUG oslo_concurrency.processutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.746 2 DEBUG oslo_concurrency.processutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk 1073741824" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.747 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.747 2 DEBUG oslo_concurrency.processutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.786 2 DEBUG nova.network.neutron [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.786 2 DEBUG nova.compute.manager [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.801 2 DEBUG oslo_concurrency.processutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.802 2 DEBUG nova.virt.disk.api [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Checking if we can resize image /var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.802 2 DEBUG oslo_concurrency.processutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.857 2 DEBUG oslo_concurrency.processutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.858 2 DEBUG nova.virt.disk.api [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Cannot resize image /var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.859 2 DEBUG nova.objects.instance [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lazy-loading 'migration_context' on Instance uuid 99a100e5-2387-482b-91a2-55833b357c26 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.882 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.882 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Ensure instance console log exists: /var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.883 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.883 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.883 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.885 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.889 2 WARNING nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.895 2 DEBUG nova.virt.libvirt.host [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.896 2 DEBUG nova.virt.libvirt.host [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.903 2 DEBUG nova.virt.libvirt.host [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.903 2 DEBUG nova.virt.libvirt.host [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.906 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.906 2 DEBUG nova.virt.hardware [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.907 2 DEBUG nova.virt.hardware [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.907 2 DEBUG nova.virt.hardware [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.908 2 DEBUG nova.virt.hardware [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.909 2 DEBUG nova.virt.hardware [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.909 2 DEBUG nova.virt.hardware [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.910 2 DEBUG nova.virt.hardware [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.910 2 DEBUG nova.virt.hardware [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.911 2 DEBUG nova.virt.hardware [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.911 2 DEBUG nova.virt.hardware [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.912 2 DEBUG nova.virt.hardware [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.920 2 DEBUG nova.objects.instance [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 99a100e5-2387-482b-91a2-55833b357c26 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:03:20 np0005466012 nova_compute[192063]: 2025-10-02 12:03:20.939 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  <uuid>99a100e5-2387-482b-91a2-55833b357c26</uuid>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  <name>instance-00000011</name>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServersOnMultiNodesTest-server-1431646342</nova:name>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:03:20</nova:creationTime>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:03:20 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:        <nova:user uuid="d27eb44762f548fc96a3f2edcdb5537c">tempest-ServersOnMultiNodesTest-1227449327-project-member</nova:user>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:        <nova:project uuid="df2cf2fcc379455c90e6044b60e603c0">tempest-ServersOnMultiNodesTest-1227449327</nova:project>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <nova:ports/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <entry name="serial">99a100e5-2387-482b-91a2-55833b357c26</entry>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <entry name="uuid">99a100e5-2387-482b-91a2-55833b357c26</entry>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk.config"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/console.log" append="off"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:03:20 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:03:20 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:03:20 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:03:20 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  2 08:03:21 np0005466012 nova_compute[192063]: 2025-10-02 12:03:21.002 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:03:21 np0005466012 nova_compute[192063]: 2025-10-02 12:03:21.003 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:03:21 np0005466012 nova_compute[192063]: 2025-10-02 12:03:21.003 2 INFO nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Using config drive
Oct  2 08:03:21 np0005466012 nova_compute[192063]: 2025-10-02 12:03:21.205 2 INFO nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Creating config drive at /var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk.config
Oct  2 08:03:21 np0005466012 nova_compute[192063]: 2025-10-02 12:03:21.210 2 DEBUG oslo_concurrency.processutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnm9wlw79 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:03:21 np0005466012 nova_compute[192063]: 2025-10-02 12:03:21.353 2 DEBUG oslo_concurrency.processutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnm9wlw79" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:03:21 np0005466012 systemd-machined[152114]: New machine qemu-10-instance-00000011.
Oct  2 08:03:21 np0005466012 systemd[1]: Started Virtual Machine qemu-10-instance-00000011.
Oct  2 08:03:21 np0005466012 podman[221872]: 2025-10-02 12:03:21.495613836 +0000 UTC m=+0.073218962 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.143 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406602.1430717, 99a100e5-2387-482b-91a2-55833b357c26 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.143 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 99a100e5-2387-482b-91a2-55833b357c26] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.146 2 DEBUG nova.compute.manager [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.146 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.160 2 INFO nova.virt.libvirt.driver [-] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Instance spawned successfully.#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.161 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.190 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.193 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.207 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.208 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.208 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.209 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.209 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.210 2 DEBUG nova.virt.libvirt.driver [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.218 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 99a100e5-2387-482b-91a2-55833b357c26] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.218 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406602.1458185, 99a100e5-2387-482b-91a2-55833b357c26 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.219 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 99a100e5-2387-482b-91a2-55833b357c26] VM Started (Lifecycle Event)#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.249 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.252 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.274 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 99a100e5-2387-482b-91a2-55833b357c26] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.296 2 INFO nova.compute.manager [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Took 1.83 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.297 2 DEBUG nova.compute.manager [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.388 2 INFO nova.compute.manager [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Took 2.44 seconds to build instance.#033[00m
Oct  2 08:03:22 np0005466012 nova_compute[192063]: 2025-10-02 12:03:22.417 2 DEBUG oslo_concurrency.lockutils [None req-99f98da4-7d15-48f0-a443-fe5d7f23691a d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "99a100e5-2387-482b-91a2-55833b357c26" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:24 np0005466012 nova_compute[192063]: 2025-10-02 12:03:24.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:24 np0005466012 podman[221912]: 2025-10-02 12:03:24.405477336 +0000 UTC m=+0.064363428 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:03:24 np0005466012 nova_compute[192063]: 2025-10-02 12:03:24.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:28 np0005466012 podman[221933]: 2025-10-02 12:03:28.175515877 +0000 UTC m=+0.083017301 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm)
Oct  2 08:03:29 np0005466012 nova_compute[192063]: 2025-10-02 12:03:29.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:29 np0005466012 nova_compute[192063]: 2025-10-02 12:03:29.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:30 np0005466012 nova_compute[192063]: 2025-10-02 12:03:30.614 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406595.6127343, 73cd9aef-a159-4d0e-9fc4-435f191db0b9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:03:30 np0005466012 nova_compute[192063]: 2025-10-02 12:03:30.615 2 INFO nova.compute.manager [-] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:03:30 np0005466012 nova_compute[192063]: 2025-10-02 12:03:30.659 2 DEBUG nova.compute.manager [None req-834e4b11-ba7d-4d65-8e03-4093f0700c20 - - - - - -] [instance: 73cd9aef-a159-4d0e-9fc4-435f191db0b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:03:33 np0005466012 podman[221956]: 2025-10-02 12:03:33.182411658 +0000 UTC m=+0.093096550 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Oct  2 08:03:33 np0005466012 nova_compute[192063]: 2025-10-02 12:03:33.611 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406598.6108584, 51680613-d2fe-44ac-8f89-432ec74a9541 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:03:33 np0005466012 nova_compute[192063]: 2025-10-02 12:03:33.612 2 INFO nova.compute.manager [-] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:03:33 np0005466012 nova_compute[192063]: 2025-10-02 12:03:33.645 2 DEBUG nova.compute.manager [None req-24b701ae-b6be-445b-9d53-0f8e89d87c19 - - - - - -] [instance: 51680613-d2fe-44ac-8f89-432ec74a9541] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:03:34 np0005466012 nova_compute[192063]: 2025-10-02 12:03:34.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:34 np0005466012 nova_compute[192063]: 2025-10-02 12:03:34.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:36 np0005466012 podman[221981]: 2025-10-02 12:03:36.162351674 +0000 UTC m=+0.073739385 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64)
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.277 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "17ad910d-1c3a-497f-bad5-326b4bcb0a23" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.278 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "17ad910d-1c3a-497f-bad5-326b4bcb0a23" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.297 2 DEBUG nova.compute.manager [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.414 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.415 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.423 2 DEBUG nova.virt.hardware [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.424 2 INFO nova.compute.claims [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.573 2 DEBUG nova.compute.provider_tree [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.602 2 DEBUG nova.scheduler.client.report [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.678 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.713 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "13668e2e-4ffc-4ddf-8b82-18278dcdd10d" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.713 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "13668e2e-4ffc-4ddf-8b82-18278dcdd10d" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.731 2 DEBUG nova.compute.manager [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] No node specified, defaulting to compute-1.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.797 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "13668e2e-4ffc-4ddf-8b82-18278dcdd10d" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.798 2 DEBUG nova.compute.manager [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.902 2 DEBUG nova.compute.manager [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.902 2 DEBUG nova.network.neutron [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.942 2 INFO nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:03:36 np0005466012 nova_compute[192063]: 2025-10-02 12:03:36.970 2 DEBUG nova.compute.manager [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.087 2 DEBUG nova.compute.manager [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.089 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.089 2 INFO nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Creating image(s)#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.090 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "/var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.091 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "/var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.092 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "/var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.114 2 DEBUG oslo_concurrency.processutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.190 2 DEBUG oslo_concurrency.processutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.191 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.192 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.202 2 DEBUG oslo_concurrency.processutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.264 2 DEBUG oslo_concurrency.processutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.265 2 DEBUG oslo_concurrency.processutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.299 2 DEBUG oslo_concurrency.processutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.301 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.302 2 DEBUG oslo_concurrency.processutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.360 2 DEBUG oslo_concurrency.processutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.362 2 DEBUG nova.virt.disk.api [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Checking if we can resize image /var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.362 2 DEBUG oslo_concurrency.processutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.419 2 DEBUG oslo_concurrency.processutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.420 2 DEBUG nova.virt.disk.api [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Cannot resize image /var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.420 2 DEBUG nova.objects.instance [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lazy-loading 'migration_context' on Instance uuid 17ad910d-1c3a-497f-bad5-326b4bcb0a23 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.433 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.433 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Ensure instance console log exists: /var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.434 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.434 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.434 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:37 np0005466012 ovn_controller[94284]: 2025-10-02T12:03:37Z|00055|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.776 2 DEBUG nova.network.neutron [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.777 2 DEBUG nova.compute.manager [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.778 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.781 2 WARNING nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.786 2 DEBUG nova.virt.libvirt.host [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.786 2 DEBUG nova.virt.libvirt.host [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.789 2 DEBUG nova.virt.libvirt.host [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.790 2 DEBUG nova.virt.libvirt.host [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.791 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.791 2 DEBUG nova.virt.hardware [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.792 2 DEBUG nova.virt.hardware [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.792 2 DEBUG nova.virt.hardware [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.792 2 DEBUG nova.virt.hardware [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.792 2 DEBUG nova.virt.hardware [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.793 2 DEBUG nova.virt.hardware [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.793 2 DEBUG nova.virt.hardware [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.793 2 DEBUG nova.virt.hardware [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.793 2 DEBUG nova.virt.hardware [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.794 2 DEBUG nova.virt.hardware [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.794 2 DEBUG nova.virt.hardware [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.798 2 DEBUG nova.objects.instance [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 17ad910d-1c3a-497f-bad5-326b4bcb0a23 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.813 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  <uuid>17ad910d-1c3a-497f-bad5-326b4bcb0a23</uuid>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  <name>instance-00000016</name>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServersOnMultiNodesTest-server-2114181845-2</nova:name>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:03:37</nova:creationTime>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:03:37 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:        <nova:user uuid="d27eb44762f548fc96a3f2edcdb5537c">tempest-ServersOnMultiNodesTest-1227449327-project-member</nova:user>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:        <nova:project uuid="df2cf2fcc379455c90e6044b60e603c0">tempest-ServersOnMultiNodesTest-1227449327</nova:project>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <nova:ports/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <entry name="serial">17ad910d-1c3a-497f-bad5-326b4bcb0a23</entry>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <entry name="uuid">17ad910d-1c3a-497f-bad5-326b4bcb0a23</entry>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk.config"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/console.log" append="off"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:03:37 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:03:37 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:03:37 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:03:37 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.881 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.881 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:03:37 np0005466012 nova_compute[192063]: 2025-10-02 12:03:37.882 2 INFO nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Using config drive#033[00m
Oct  2 08:03:37 np0005466012 podman[222020]: 2025-10-02 12:03:37.919437415 +0000 UTC m=+0.060288680 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:03:37 np0005466012 podman[222019]: 2025-10-02 12:03:37.951289839 +0000 UTC m=+0.095446512 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3)
Oct  2 08:03:38 np0005466012 nova_compute[192063]: 2025-10-02 12:03:38.101 2 INFO nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Creating config drive at /var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk.config#033[00m
Oct  2 08:03:38 np0005466012 nova_compute[192063]: 2025-10-02 12:03:38.105 2 DEBUG oslo_concurrency.processutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv6w4ru5_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:03:38 np0005466012 nova_compute[192063]: 2025-10-02 12:03:38.235 2 DEBUG oslo_concurrency.processutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv6w4ru5_" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:03:38 np0005466012 systemd-machined[152114]: New machine qemu-11-instance-00000016.
Oct  2 08:03:38 np0005466012 systemd[1]: Started Virtual Machine qemu-11-instance-00000016.
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.017 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406619.0164895, 17ad910d-1c3a-497f-bad5-326b4bcb0a23 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.018 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.023 2 DEBUG nova.compute.manager [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.024 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.030 2 INFO nova.virt.libvirt.driver [-] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Instance spawned successfully.#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.030 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.046 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.054 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.059 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.060 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.061 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.061 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.062 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.062 2 DEBUG nova.virt.libvirt.driver [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.083 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.083 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406619.0170898, 17ad910d-1c3a-497f-bad5-326b4bcb0a23 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.083 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] VM Started (Lifecycle Event)#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.107 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.111 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.134 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.141 2 INFO nova.compute.manager [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Took 2.05 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.142 2 DEBUG nova.compute.manager [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.213 2 INFO nova.compute.manager [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Took 2.85 seconds to build instance.#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.229 2 DEBUG oslo_concurrency.lockutils [None req-01d1f9a5-2508-497a-aeba-5bb845fc3a18 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "17ad910d-1c3a-497f-bad5-326b4bcb0a23" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.951s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:39 np0005466012 nova_compute[192063]: 2025-10-02 12:03:39.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:43 np0005466012 nova_compute[192063]: 2025-10-02 12:03:43.036 2 DEBUG oslo_concurrency.lockutils [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "17ad910d-1c3a-497f-bad5-326b4bcb0a23" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:43 np0005466012 nova_compute[192063]: 2025-10-02 12:03:43.037 2 DEBUG oslo_concurrency.lockutils [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "17ad910d-1c3a-497f-bad5-326b4bcb0a23" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:43 np0005466012 nova_compute[192063]: 2025-10-02 12:03:43.038 2 DEBUG oslo_concurrency.lockutils [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "17ad910d-1c3a-497f-bad5-326b4bcb0a23-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:43 np0005466012 nova_compute[192063]: 2025-10-02 12:03:43.038 2 DEBUG oslo_concurrency.lockutils [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "17ad910d-1c3a-497f-bad5-326b4bcb0a23-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:43 np0005466012 nova_compute[192063]: 2025-10-02 12:03:43.039 2 DEBUG oslo_concurrency.lockutils [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "17ad910d-1c3a-497f-bad5-326b4bcb0a23-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:43 np0005466012 nova_compute[192063]: 2025-10-02 12:03:43.051 2 INFO nova.compute.manager [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Terminating instance#033[00m
Oct  2 08:03:43 np0005466012 nova_compute[192063]: 2025-10-02 12:03:43.067 2 DEBUG oslo_concurrency.lockutils [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "refresh_cache-17ad910d-1c3a-497f-bad5-326b4bcb0a23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:03:43 np0005466012 nova_compute[192063]: 2025-10-02 12:03:43.067 2 DEBUG oslo_concurrency.lockutils [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquired lock "refresh_cache-17ad910d-1c3a-497f-bad5-326b4bcb0a23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:03:43 np0005466012 nova_compute[192063]: 2025-10-02 12:03:43.067 2 DEBUG nova.network.neutron [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:03:43 np0005466012 nova_compute[192063]: 2025-10-02 12:03:43.277 2 DEBUG nova.network.neutron [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.022 2 DEBUG nova.network.neutron [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.041 2 DEBUG oslo_concurrency.lockutils [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Releasing lock "refresh_cache-17ad910d-1c3a-497f-bad5-326b4bcb0a23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.042 2 DEBUG nova.compute.manager [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:03:44 np0005466012 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000016.scope: Deactivated successfully.
Oct  2 08:03:44 np0005466012 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000016.scope: Consumed 5.787s CPU time.
Oct  2 08:03:44 np0005466012 systemd-machined[152114]: Machine qemu-11-instance-00000016 terminated.
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.293 2 INFO nova.virt.libvirt.driver [-] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Instance destroyed successfully.#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.295 2 DEBUG nova.objects.instance [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lazy-loading 'resources' on Instance uuid 17ad910d-1c3a-497f-bad5-326b4bcb0a23 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.332 2 INFO nova.virt.libvirt.driver [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Deleting instance files /var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23_del#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.333 2 INFO nova.virt.libvirt.driver [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Deletion of /var/lib/nova/instances/17ad910d-1c3a-497f-bad5-326b4bcb0a23_del complete#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.453 2 INFO nova.compute.manager [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.453 2 DEBUG oslo.service.loopingcall [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.454 2 DEBUG nova.compute.manager [-] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.454 2 DEBUG nova.network.neutron [-] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.603 2 DEBUG nova.network.neutron [-] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.616 2 DEBUG nova.network.neutron [-] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.632 2 INFO nova.compute.manager [-] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Took 0.18 seconds to deallocate network for instance.#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.751 2 DEBUG oslo_concurrency.lockutils [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.751 2 DEBUG oslo_concurrency.lockutils [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.815 2 DEBUG nova.compute.provider_tree [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.836 2 DEBUG nova.scheduler.client.report [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:03:44 np0005466012 nova_compute[192063]: 2025-10-02 12:03:44.859 2 DEBUG oslo_concurrency.lockutils [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:45 np0005466012 nova_compute[192063]: 2025-10-02 12:03:45.053 2 INFO nova.scheduler.client.report [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Deleted allocations for instance 17ad910d-1c3a-497f-bad5-326b4bcb0a23#033[00m
Oct  2 08:03:45 np0005466012 nova_compute[192063]: 2025-10-02 12:03:45.225 2 DEBUG oslo_concurrency.lockutils [None req-ecd490c1-3d43-4822-b19e-488f38f3e3a8 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "17ad910d-1c3a-497f-bad5-326b4bcb0a23" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:49 np0005466012 nova_compute[192063]: 2025-10-02 12:03:49.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:49 np0005466012 nova_compute[192063]: 2025-10-02 12:03:49.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:50 np0005466012 nova_compute[192063]: 2025-10-02 12:03:50.250 2 DEBUG oslo_concurrency.lockutils [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "99a100e5-2387-482b-91a2-55833b357c26" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:50 np0005466012 nova_compute[192063]: 2025-10-02 12:03:50.250 2 DEBUG oslo_concurrency.lockutils [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "99a100e5-2387-482b-91a2-55833b357c26" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:50 np0005466012 nova_compute[192063]: 2025-10-02 12:03:50.250 2 DEBUG oslo_concurrency.lockutils [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "99a100e5-2387-482b-91a2-55833b357c26-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:50 np0005466012 nova_compute[192063]: 2025-10-02 12:03:50.251 2 DEBUG oslo_concurrency.lockutils [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "99a100e5-2387-482b-91a2-55833b357c26-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:50 np0005466012 nova_compute[192063]: 2025-10-02 12:03:50.251 2 DEBUG oslo_concurrency.lockutils [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "99a100e5-2387-482b-91a2-55833b357c26-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:50 np0005466012 nova_compute[192063]: 2025-10-02 12:03:50.262 2 INFO nova.compute.manager [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Terminating instance#033[00m
Oct  2 08:03:50 np0005466012 nova_compute[192063]: 2025-10-02 12:03:50.273 2 DEBUG oslo_concurrency.lockutils [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "refresh_cache-99a100e5-2387-482b-91a2-55833b357c26" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:03:50 np0005466012 nova_compute[192063]: 2025-10-02 12:03:50.274 2 DEBUG oslo_concurrency.lockutils [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquired lock "refresh_cache-99a100e5-2387-482b-91a2-55833b357c26" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:03:50 np0005466012 nova_compute[192063]: 2025-10-02 12:03:50.274 2 DEBUG nova.network.neutron [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:03:50 np0005466012 nova_compute[192063]: 2025-10-02 12:03:50.530 2 DEBUG nova.network.neutron [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:03:50 np0005466012 nova_compute[192063]: 2025-10-02 12:03:50.925 2 DEBUG nova.network.neutron [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:03:50 np0005466012 nova_compute[192063]: 2025-10-02 12:03:50.941 2 DEBUG oslo_concurrency.lockutils [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Releasing lock "refresh_cache-99a100e5-2387-482b-91a2-55833b357c26" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:03:50 np0005466012 nova_compute[192063]: 2025-10-02 12:03:50.942 2 DEBUG nova.compute.manager [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:03:50 np0005466012 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000011.scope: Deactivated successfully.
Oct  2 08:03:50 np0005466012 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000011.scope: Consumed 13.382s CPU time.
Oct  2 08:03:51 np0005466012 systemd-machined[152114]: Machine qemu-10-instance-00000011 terminated.
Oct  2 08:03:51 np0005466012 podman[222098]: 2025-10-02 12:03:51.105260634 +0000 UTC m=+0.118068811 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.192 2 INFO nova.virt.libvirt.driver [-] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Instance destroyed successfully.#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.192 2 DEBUG nova.objects.instance [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lazy-loading 'resources' on Instance uuid 99a100e5-2387-482b-91a2-55833b357c26 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.205 2 INFO nova.virt.libvirt.driver [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Deleting instance files /var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26_del#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.207 2 INFO nova.virt.libvirt.driver [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Deletion of /var/lib/nova/instances/99a100e5-2387-482b-91a2-55833b357c26_del complete#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.267 2 INFO nova.compute.manager [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Took 0.32 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.267 2 DEBUG oslo.service.loopingcall [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.268 2 DEBUG nova.compute.manager [-] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.268 2 DEBUG nova.network.neutron [-] [instance: 99a100e5-2387-482b-91a2-55833b357c26] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.499 2 DEBUG nova.network.neutron [-] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.515 2 DEBUG nova.network.neutron [-] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.535 2 INFO nova.compute.manager [-] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Took 0.27 seconds to deallocate network for instance.#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.612 2 DEBUG oslo_concurrency.lockutils [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.613 2 DEBUG oslo_concurrency.lockutils [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.678 2 DEBUG nova.compute.provider_tree [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.694 2 DEBUG nova.scheduler.client.report [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.718 2 DEBUG oslo_concurrency.lockutils [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.743 2 INFO nova.scheduler.client.report [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Deleted allocations for instance 99a100e5-2387-482b-91a2-55833b357c26#033[00m
Oct  2 08:03:51 np0005466012 nova_compute[192063]: 2025-10-02 12:03:51.823 2 DEBUG oslo_concurrency.lockutils [None req-4a4da9a6-e2c7-41e0-8b61-56afdaf1fe03 d27eb44762f548fc96a3f2edcdb5537c df2cf2fcc379455c90e6044b60e603c0 - - default default] Lock "99a100e5-2387-482b-91a2-55833b357c26" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:03:52 np0005466012 podman[222133]: 2025-10-02 12:03:52.168633514 +0000 UTC m=+0.075707578 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:03:53 np0005466012 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  2 08:03:54 np0005466012 nova_compute[192063]: 2025-10-02 12:03:54.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:54 np0005466012 nova_compute[192063]: 2025-10-02 12:03:54.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:55 np0005466012 podman[222158]: 2025-10-02 12:03:55.144380079 +0000 UTC m=+0.061814510 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:03:59 np0005466012 podman[222178]: 2025-10-02 12:03:59.191978889 +0000 UTC m=+0.092568975 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:03:59 np0005466012 nova_compute[192063]: 2025-10-02 12:03:59.292 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406624.2914, 17ad910d-1c3a-497f-bad5-326b4bcb0a23 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:03:59 np0005466012 nova_compute[192063]: 2025-10-02 12:03:59.293 2 INFO nova.compute.manager [-] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:03:59 np0005466012 nova_compute[192063]: 2025-10-02 12:03:59.313 2 DEBUG nova.compute.manager [None req-60abd4b1-a4ee-4cf1-820c-2ed0ee07495f - - - - - -] [instance: 17ad910d-1c3a-497f-bad5-326b4bcb0a23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:03:59 np0005466012 nova_compute[192063]: 2025-10-02 12:03:59.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:03:59 np0005466012 nova_compute[192063]: 2025-10-02 12:03:59.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:02.111 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:02.111 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:02.112 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:03.317 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:04:03 np0005466012 nova_compute[192063]: 2025-10-02 12:04:03.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:03.318 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:04:04 np0005466012 podman[222198]: 2025-10-02 12:04:04.15063662 +0000 UTC m=+0.064840800 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Oct  2 08:04:04 np0005466012 nova_compute[192063]: 2025-10-02 12:04:04.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:04 np0005466012 nova_compute[192063]: 2025-10-02 12:04:04.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:05.319 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:05 np0005466012 nova_compute[192063]: 2025-10-02 12:04:05.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:05 np0005466012 nova_compute[192063]: 2025-10-02 12:04:05.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:04:06 np0005466012 nova_compute[192063]: 2025-10-02 12:04:06.190 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406631.1892993, 99a100e5-2387-482b-91a2-55833b357c26 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:06 np0005466012 nova_compute[192063]: 2025-10-02 12:04:06.191 2 INFO nova.compute.manager [-] [instance: 99a100e5-2387-482b-91a2-55833b357c26] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:04:06 np0005466012 nova_compute[192063]: 2025-10-02 12:04:06.210 2 DEBUG nova.compute.manager [None req-aca1d1a2-08db-43be-9a74-93487c8d7cbd - - - - - -] [instance: 99a100e5-2387-482b-91a2-55833b357c26] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:07 np0005466012 podman[222219]: 2025-10-02 12:04:07.123244933 +0000 UTC m=+0.045349673 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc.)
Oct  2 08:04:07 np0005466012 nova_compute[192063]: 2025-10-02 12:04:07.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:07 np0005466012 nova_compute[192063]: 2025-10-02 12:04:07.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:08 np0005466012 podman[222240]: 2025-10-02 12:04:08.151857682 +0000 UTC m=+0.070292656 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_managed=true)
Oct  2 08:04:08 np0005466012 podman[222241]: 2025-10-02 12:04:08.162622016 +0000 UTC m=+0.081494261 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:04:08 np0005466012 nova_compute[192063]: 2025-10-02 12:04:08.851 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:09 np0005466012 nova_compute[192063]: 2025-10-02 12:04:09.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:09 np0005466012 nova_compute[192063]: 2025-10-02 12:04:09.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:09 np0005466012 nova_compute[192063]: 2025-10-02 12:04:09.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:09 np0005466012 nova_compute[192063]: 2025-10-02 12:04:09.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:09 np0005466012 nova_compute[192063]: 2025-10-02 12:04:09.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:09 np0005466012 nova_compute[192063]: 2025-10-02 12:04:09.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:04:09 np0005466012 nova_compute[192063]: 2025-10-02 12:04:09.845 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:04:10 np0005466012 nova_compute[192063]: 2025-10-02 12:04:10.844 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:10 np0005466012 nova_compute[192063]: 2025-10-02 12:04:10.870 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:10 np0005466012 nova_compute[192063]: 2025-10-02 12:04:10.871 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:10 np0005466012 nova_compute[192063]: 2025-10-02 12:04:10.871 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:04:11 np0005466012 nova_compute[192063]: 2025-10-02 12:04:11.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:11 np0005466012 nova_compute[192063]: 2025-10-02 12:04:11.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:11 np0005466012 nova_compute[192063]: 2025-10-02 12:04:11.867 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:11 np0005466012 nova_compute[192063]: 2025-10-02 12:04:11.868 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:11 np0005466012 nova_compute[192063]: 2025-10-02 12:04:11.868 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:11 np0005466012 nova_compute[192063]: 2025-10-02 12:04:11.868 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:04:12 np0005466012 nova_compute[192063]: 2025-10-02 12:04:12.017 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:04:12 np0005466012 nova_compute[192063]: 2025-10-02 12:04:12.018 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5773MB free_disk=73.4662971496582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:04:12 np0005466012 nova_compute[192063]: 2025-10-02 12:04:12.018 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:12 np0005466012 nova_compute[192063]: 2025-10-02 12:04:12.019 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:12 np0005466012 nova_compute[192063]: 2025-10-02 12:04:12.075 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:04:12 np0005466012 nova_compute[192063]: 2025-10-02 12:04:12.075 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:04:12 np0005466012 nova_compute[192063]: 2025-10-02 12:04:12.094 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:04:12 np0005466012 nova_compute[192063]: 2025-10-02 12:04:12.193 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:04:12 np0005466012 nova_compute[192063]: 2025-10-02 12:04:12.249 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:04:12 np0005466012 nova_compute[192063]: 2025-10-02 12:04:12.250 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:14 np0005466012 nova_compute[192063]: 2025-10-02 12:04:14.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:14 np0005466012 nova_compute[192063]: 2025-10-02 12:04:14.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:15 np0005466012 nova_compute[192063]: 2025-10-02 12:04:15.250 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:15 np0005466012 nova_compute[192063]: 2025-10-02 12:04:15.251 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:04:15 np0005466012 nova_compute[192063]: 2025-10-02 12:04:15.251 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:04:15 np0005466012 nova_compute[192063]: 2025-10-02 12:04:15.272 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:04:19 np0005466012 nova_compute[192063]: 2025-10-02 12:04:19.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:19 np0005466012 nova_compute[192063]: 2025-10-02 12:04:19.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:22 np0005466012 podman[222285]: 2025-10-02 12:04:22.152314196 +0000 UTC m=+0.075913583 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.255 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Acquiring lock "2f45c912-e983-4648-aad2-9053167e0891" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.256 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "2f45c912-e983-4648-aad2-9053167e0891" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.318 2 DEBUG nova.compute.manager [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.470 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.470 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.476 2 DEBUG nova.virt.hardware [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.476 2 INFO nova.compute.claims [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.505 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Acquiring lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.505 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.535 2 DEBUG nova.compute.manager [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.687 2 DEBUG nova.compute.provider_tree [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.695 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.710 2 DEBUG nova.scheduler.client.report [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.735 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.735 2 DEBUG nova.compute.manager [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.737 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.742 2 DEBUG nova.virt.hardware [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.743 2 INFO nova.compute.claims [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.876 2 DEBUG nova.compute.manager [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.876 2 DEBUG nova.network.neutron [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.947 2 INFO nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:04:22 np0005466012 nova_compute[192063]: 2025-10-02 12:04:22.983 2 DEBUG nova.compute.manager [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.008 2 DEBUG nova.compute.provider_tree [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.030 2 DEBUG nova.scheduler.client.report [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.089 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.090 2 DEBUG nova.compute.manager [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:04:23 np0005466012 podman[222312]: 2025-10-02 12:04:23.125037443 +0000 UTC m=+0.042645902 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.220 2 DEBUG nova.network.neutron [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.221 2 DEBUG nova.compute.manager [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.416 2 DEBUG nova.compute.manager [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.417 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.418 2 INFO nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Creating image(s)#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.418 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Acquiring lock "/var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.418 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "/var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.419 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "/var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.430 2 DEBUG nova.compute.manager [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.431 2 DEBUG nova.network.neutron [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.433 2 DEBUG oslo_concurrency.processutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.459 2 INFO nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.478 2 DEBUG nova.compute.manager [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.486 2 DEBUG oslo_concurrency.processutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.487 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.488 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.500 2 DEBUG oslo_concurrency.processutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.555 2 DEBUG oslo_concurrency.processutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.556 2 DEBUG oslo_concurrency.processutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.598 2 DEBUG oslo_concurrency.processutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.599 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.600 2 DEBUG oslo_concurrency.processutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.653 2 DEBUG oslo_concurrency.processutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.654 2 DEBUG nova.virt.disk.api [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Checking if we can resize image /var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.654 2 DEBUG oslo_concurrency.processutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.710 2 DEBUG oslo_concurrency.processutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.711 2 DEBUG nova.virt.disk.api [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Cannot resize image /var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.711 2 DEBUG nova.objects.instance [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lazy-loading 'migration_context' on Instance uuid 2f45c912-e983-4648-aad2-9053167e0891 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.734 2 DEBUG nova.compute.manager [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.735 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.735 2 INFO nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Creating image(s)#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.736 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Acquiring lock "/var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.736 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "/var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.737 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "/var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.748 2 DEBUG oslo_concurrency.processutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.764 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.765 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Ensure instance console log exists: /var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.765 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.766 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.766 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.767 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.771 2 WARNING nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.776 2 DEBUG nova.virt.libvirt.host [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.777 2 DEBUG nova.virt.libvirt.host [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.783 2 DEBUG nova.virt.libvirt.host [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.784 2 DEBUG nova.virt.libvirt.host [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.786 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.786 2 DEBUG nova.virt.hardware [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.787 2 DEBUG nova.virt.hardware [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.787 2 DEBUG nova.virt.hardware [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.788 2 DEBUG nova.virt.hardware [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.788 2 DEBUG nova.virt.hardware [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.789 2 DEBUG nova.virt.hardware [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.789 2 DEBUG nova.virt.hardware [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.789 2 DEBUG nova.virt.hardware [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.790 2 DEBUG nova.virt.hardware [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.790 2 DEBUG nova.virt.hardware [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.790 2 DEBUG nova.virt.hardware [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.799 2 DEBUG nova.objects.instance [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2f45c912-e983-4648-aad2-9053167e0891 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.804 2 DEBUG oslo_concurrency.processutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.805 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.805 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.817 2 DEBUG oslo_concurrency.processutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.875 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  <uuid>2f45c912-e983-4648-aad2-9053167e0891</uuid>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  <name>instance-00000019</name>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerExternalEventsTest-server-1350258572</nova:name>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:04:23</nova:creationTime>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:04:23 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:        <nova:user uuid="e2746697dd354be38ceccd1da320cc7a">tempest-ServerExternalEventsTest-1303720244-project-member</nova:user>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:        <nova:project uuid="81bd4f1fd64148c18d994cc6e37f5377">tempest-ServerExternalEventsTest-1303720244</nova:project>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <nova:ports/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <entry name="serial">2f45c912-e983-4648-aad2-9053167e0891</entry>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <entry name="uuid">2f45c912-e983-4648-aad2-9053167e0891</entry>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk.config"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/console.log" append="off"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:04:23 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:04:23 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:04:23 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:04:23 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.877 2 DEBUG oslo_concurrency.processutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.878 2 DEBUG oslo_concurrency.processutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.914 2 DEBUG oslo_concurrency.processutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.914 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.915 2 DEBUG oslo_concurrency.processutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.960 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.960 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.961 2 INFO nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Using config drive#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.969 2 DEBUG oslo_concurrency.processutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.970 2 DEBUG nova.virt.disk.api [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Checking if we can resize image /var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:04:23 np0005466012 nova_compute[192063]: 2025-10-02 12:04:23.970 2 DEBUG oslo_concurrency.processutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.025 2 DEBUG oslo_concurrency.processutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.026 2 DEBUG nova.virt.disk.api [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Cannot resize image /var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.026 2 DEBUG nova.objects.instance [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lazy-loading 'migration_context' on Instance uuid 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.067 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.067 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Ensure instance console log exists: /var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.067 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.068 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.068 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.146 2 DEBUG nova.policy [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '73748ced15f1405a948a15e278017df0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd2012bd8680647e7aadc4522d79320e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.154 2 INFO nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Creating config drive at /var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk.config#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.159 2 DEBUG oslo_concurrency.processutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphivr5zq_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.281 2 DEBUG oslo_concurrency.processutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphivr5zq_" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:24 np0005466012 systemd-machined[152114]: New machine qemu-12-instance-00000019.
Oct  2 08:04:24 np0005466012 systemd[1]: Started Virtual Machine qemu-12-instance-00000019.
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:24 np0005466012 nova_compute[192063]: 2025-10-02 12:04:24.731 2 DEBUG nova.network.neutron [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Successfully created port: 4b0d9367-8a78-4efb-8d87-7e7e422482a1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.465 2 DEBUG nova.network.neutron [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Successfully updated port: 4b0d9367-8a78-4efb-8d87-7e7e422482a1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.484 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Acquiring lock "refresh_cache-6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.484 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Acquired lock "refresh_cache-6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.484 2 DEBUG nova.network.neutron [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.525 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406665.5249326, 2f45c912-e983-4648-aad2-9053167e0891 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.525 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2f45c912-e983-4648-aad2-9053167e0891] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.528 2 DEBUG nova.compute.manager [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.528 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.531 2 INFO nova.virt.libvirt.driver [-] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Instance spawned successfully.#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.531 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.573 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.579 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.580 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.580 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.581 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.581 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.581 2 DEBUG nova.virt.libvirt.driver [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.586 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.632 2 DEBUG nova.compute.manager [req-c069dfb6-9158-486d-9fd7-4e5fec486c52 req-93e014d0-d981-4bbf-bfe6-85b6a5070726 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Received event network-changed-4b0d9367-8a78-4efb-8d87-7e7e422482a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.633 2 DEBUG nova.compute.manager [req-c069dfb6-9158-486d-9fd7-4e5fec486c52 req-93e014d0-d981-4bbf-bfe6-85b6a5070726 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Refreshing instance network info cache due to event network-changed-4b0d9367-8a78-4efb-8d87-7e7e422482a1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.633 2 DEBUG oslo_concurrency.lockutils [req-c069dfb6-9158-486d-9fd7-4e5fec486c52 req-93e014d0-d981-4bbf-bfe6-85b6a5070726 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.646 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2f45c912-e983-4648-aad2-9053167e0891] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.646 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406665.5279744, 2f45c912-e983-4648-aad2-9053167e0891 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.647 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2f45c912-e983-4648-aad2-9053167e0891] VM Started (Lifecycle Event)#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.680 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.683 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.706 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2f45c912-e983-4648-aad2-9053167e0891] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.728 2 INFO nova.compute.manager [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Took 2.31 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.729 2 DEBUG nova.compute.manager [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.795 2 INFO nova.compute.manager [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Took 3.39 seconds to build instance.#033[00m
Oct  2 08:04:25 np0005466012 nova_compute[192063]: 2025-10-02 12:04:25.812 2 DEBUG oslo_concurrency.lockutils [None req-0d3c365c-86a9-4d07-801c-f1c12f91140d e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "2f45c912-e983-4648-aad2-9053167e0891" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.556s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:26 np0005466012 podman[222394]: 2025-10-02 12:04:26.137935579 +0000 UTC m=+0.052710581 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:04:26 np0005466012 nova_compute[192063]: 2025-10-02 12:04:26.280 2 DEBUG nova.network.neutron [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.055 2 DEBUG nova.network.neutron [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Updating instance_info_cache with network_info: [{"id": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "address": "fa:16:3e:f2:c2:db", "network": {"id": "05984f69-a29b-4ae1-86b5-956a2285c76f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1023280971-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2012bd8680647e7aadc4522d79320e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0d9367-8a", "ovs_interfaceid": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.079 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Releasing lock "refresh_cache-6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.080 2 DEBUG nova.compute.manager [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Instance network_info: |[{"id": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "address": "fa:16:3e:f2:c2:db", "network": {"id": "05984f69-a29b-4ae1-86b5-956a2285c76f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1023280971-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2012bd8680647e7aadc4522d79320e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0d9367-8a", "ovs_interfaceid": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.080 2 DEBUG oslo_concurrency.lockutils [req-c069dfb6-9158-486d-9fd7-4e5fec486c52 req-93e014d0-d981-4bbf-bfe6-85b6a5070726 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.080 2 DEBUG nova.network.neutron [req-c069dfb6-9158-486d-9fd7-4e5fec486c52 req-93e014d0-d981-4bbf-bfe6-85b6a5070726 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Refreshing network info cache for port 4b0d9367-8a78-4efb-8d87-7e7e422482a1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.083 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Start _get_guest_xml network_info=[{"id": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "address": "fa:16:3e:f2:c2:db", "network": {"id": "05984f69-a29b-4ae1-86b5-956a2285c76f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1023280971-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2012bd8680647e7aadc4522d79320e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0d9367-8a", "ovs_interfaceid": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.086 2 WARNING nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.091 2 DEBUG nova.virt.libvirt.host [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.092 2 DEBUG nova.virt.libvirt.host [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.095 2 DEBUG nova.virt.libvirt.host [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.095 2 DEBUG nova.virt.libvirt.host [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.096 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.097 2 DEBUG nova.virt.hardware [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.097 2 DEBUG nova.virt.hardware [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.097 2 DEBUG nova.virt.hardware [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.097 2 DEBUG nova.virt.hardware [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.098 2 DEBUG nova.virt.hardware [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.098 2 DEBUG nova.virt.hardware [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.098 2 DEBUG nova.virt.hardware [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.098 2 DEBUG nova.virt.hardware [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.098 2 DEBUG nova.virt.hardware [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.099 2 DEBUG nova.virt.hardware [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.099 2 DEBUG nova.virt.hardware [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.102 2 DEBUG nova.virt.libvirt.vif [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-628395693',display_name='tempest-ImagesNegativeTestJSON-server-628395693',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-628395693',id=26,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d2012bd8680647e7aadc4522d79320e8',ramdisk_id='',reservation_id='r-azkqqxle',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-2051634032',owner_user_name='tempest-ImagesNegativeTestJSON-
2051634032-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:04:23Z,user_data=None,user_id='73748ced15f1405a948a15e278017df0',uuid=6e6bfda6-0d7c-4526-a660-1cd0f7360e4d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "address": "fa:16:3e:f2:c2:db", "network": {"id": "05984f69-a29b-4ae1-86b5-956a2285c76f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1023280971-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2012bd8680647e7aadc4522d79320e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0d9367-8a", "ovs_interfaceid": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.102 2 DEBUG nova.network.os_vif_util [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Converting VIF {"id": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "address": "fa:16:3e:f2:c2:db", "network": {"id": "05984f69-a29b-4ae1-86b5-956a2285c76f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1023280971-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2012bd8680647e7aadc4522d79320e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0d9367-8a", "ovs_interfaceid": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.103 2 DEBUG nova.network.os_vif_util [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:c2:db,bridge_name='br-int',has_traffic_filtering=True,id=4b0d9367-8a78-4efb-8d87-7e7e422482a1,network=Network(05984f69-a29b-4ae1-86b5-956a2285c76f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0d9367-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.104 2 DEBUG nova.objects.instance [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.121 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  <uuid>6e6bfda6-0d7c-4526-a660-1cd0f7360e4d</uuid>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  <name>instance-0000001a</name>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <nova:name>tempest-ImagesNegativeTestJSON-server-628395693</nova:name>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:04:27</nova:creationTime>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:04:27 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:        <nova:user uuid="73748ced15f1405a948a15e278017df0">tempest-ImagesNegativeTestJSON-2051634032-project-member</nova:user>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:        <nova:project uuid="d2012bd8680647e7aadc4522d79320e8">tempest-ImagesNegativeTestJSON-2051634032</nova:project>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:        <nova:port uuid="4b0d9367-8a78-4efb-8d87-7e7e422482a1">
Oct  2 08:04:27 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <entry name="serial">6e6bfda6-0d7c-4526-a660-1cd0f7360e4d</entry>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <entry name="uuid">6e6bfda6-0d7c-4526-a660-1cd0f7360e4d</entry>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk.config"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:f2:c2:db"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <target dev="tap4b0d9367-8a"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/console.log" append="off"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:04:27 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:04:27 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:04:27 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:04:27 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.122 2 DEBUG nova.compute.manager [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Preparing to wait for external event network-vif-plugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.123 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Acquiring lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.123 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.123 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.124 2 DEBUG nova.virt.libvirt.vif [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-628395693',display_name='tempest-ImagesNegativeTestJSON-server-628395693',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-628395693',id=26,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d2012bd8680647e7aadc4522d79320e8',ramdisk_id='',reservation_id='r-azkqqxle',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-2051634032',owner_user_name='tempest-ImagesNegativ
eTestJSON-2051634032-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:04:23Z,user_data=None,user_id='73748ced15f1405a948a15e278017df0',uuid=6e6bfda6-0d7c-4526-a660-1cd0f7360e4d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "address": "fa:16:3e:f2:c2:db", "network": {"id": "05984f69-a29b-4ae1-86b5-956a2285c76f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1023280971-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2012bd8680647e7aadc4522d79320e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0d9367-8a", "ovs_interfaceid": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.124 2 DEBUG nova.network.os_vif_util [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Converting VIF {"id": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "address": "fa:16:3e:f2:c2:db", "network": {"id": "05984f69-a29b-4ae1-86b5-956a2285c76f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1023280971-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2012bd8680647e7aadc4522d79320e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0d9367-8a", "ovs_interfaceid": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.124 2 DEBUG nova.network.os_vif_util [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:c2:db,bridge_name='br-int',has_traffic_filtering=True,id=4b0d9367-8a78-4efb-8d87-7e7e422482a1,network=Network(05984f69-a29b-4ae1-86b5-956a2285c76f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0d9367-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.125 2 DEBUG os_vif [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:c2:db,bridge_name='br-int',has_traffic_filtering=True,id=4b0d9367-8a78-4efb-8d87-7e7e422482a1,network=Network(05984f69-a29b-4ae1-86b5-956a2285c76f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0d9367-8a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.126 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.126 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.128 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b0d9367-8a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.129 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4b0d9367-8a, col_values=(('external_ids', {'iface-id': '4b0d9367-8a78-4efb-8d87-7e7e422482a1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:c2:db', 'vm-uuid': '6e6bfda6-0d7c-4526-a660-1cd0f7360e4d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:27 np0005466012 NetworkManager[51207]: <info>  [1759406667.1312] manager: (tap4b0d9367-8a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.133 2 DEBUG nova.compute.manager [None req-f87ca83d-1434-4049-adfa-f33e5d563c3c 0dcd7cbb94b24a3db735224ac386fe9f cb6cbb8ebfd24293a479e872f08695ec - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Received event network-changed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.134 2 DEBUG nova.compute.manager [None req-f87ca83d-1434-4049-adfa-f33e5d563c3c 0dcd7cbb94b24a3db735224ac386fe9f cb6cbb8ebfd24293a479e872f08695ec - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Refreshing instance network info cache due to event network-changed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.134 2 DEBUG oslo_concurrency.lockutils [None req-f87ca83d-1434-4049-adfa-f33e5d563c3c 0dcd7cbb94b24a3db735224ac386fe9f cb6cbb8ebfd24293a479e872f08695ec - - default default] Acquiring lock "refresh_cache-2f45c912-e983-4648-aad2-9053167e0891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.134 2 DEBUG oslo_concurrency.lockutils [None req-f87ca83d-1434-4049-adfa-f33e5d563c3c 0dcd7cbb94b24a3db735224ac386fe9f cb6cbb8ebfd24293a479e872f08695ec - - default default] Acquired lock "refresh_cache-2f45c912-e983-4648-aad2-9053167e0891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.134 2 DEBUG nova.network.neutron [None req-f87ca83d-1434-4049-adfa-f33e5d563c3c 0dcd7cbb94b24a3db735224ac386fe9f cb6cbb8ebfd24293a479e872f08695ec - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.138 2 INFO os_vif [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:c2:db,bridge_name='br-int',has_traffic_filtering=True,id=4b0d9367-8a78-4efb-8d87-7e7e422482a1,network=Network(05984f69-a29b-4ae1-86b5-956a2285c76f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0d9367-8a')#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.215 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.216 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.216 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] No VIF found with MAC fa:16:3e:f2:c2:db, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.217 2 INFO nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Using config drive#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.439 2 DEBUG oslo_concurrency.lockutils [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Acquiring lock "2f45c912-e983-4648-aad2-9053167e0891" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.440 2 DEBUG oslo_concurrency.lockutils [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "2f45c912-e983-4648-aad2-9053167e0891" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.440 2 DEBUG oslo_concurrency.lockutils [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Acquiring lock "2f45c912-e983-4648-aad2-9053167e0891-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.441 2 DEBUG oslo_concurrency.lockutils [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "2f45c912-e983-4648-aad2-9053167e0891-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.441 2 DEBUG oslo_concurrency.lockutils [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "2f45c912-e983-4648-aad2-9053167e0891-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.454 2 INFO nova.compute.manager [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Terminating instance#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.466 2 DEBUG nova.network.neutron [None req-f87ca83d-1434-4049-adfa-f33e5d563c3c 0dcd7cbb94b24a3db735224ac386fe9f cb6cbb8ebfd24293a479e872f08695ec - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.468 2 DEBUG oslo_concurrency.lockutils [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Acquiring lock "refresh_cache-2f45c912-e983-4648-aad2-9053167e0891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.660 2 INFO nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Creating config drive at /var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk.config#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.665 2 DEBUG oslo_concurrency.processutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprwr8qzii execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.801 2 DEBUG oslo_concurrency.processutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprwr8qzii" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:27 np0005466012 kernel: tap4b0d9367-8a: entered promiscuous mode
Oct  2 08:04:27 np0005466012 systemd-udevd[222392]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:04:27 np0005466012 NetworkManager[51207]: <info>  [1759406667.8511] manager: (tap4b0d9367-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:27Z|00056|binding|INFO|Claiming lport 4b0d9367-8a78-4efb-8d87-7e7e422482a1 for this chassis.
Oct  2 08:04:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:27Z|00057|binding|INFO|4b0d9367-8a78-4efb-8d87-7e7e422482a1: Claiming fa:16:3e:f2:c2:db 10.100.0.6
Oct  2 08:04:27 np0005466012 NetworkManager[51207]: <info>  [1759406667.8636] device (tap4b0d9367-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:04:27 np0005466012 NetworkManager[51207]: <info>  [1759406667.8647] device (tap4b0d9367-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.872 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:c2:db 10.100.0.6'], port_security=['fa:16:3e:f2:c2:db 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6e6bfda6-0d7c-4526-a660-1cd0f7360e4d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05984f69-a29b-4ae1-86b5-956a2285c76f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2012bd8680647e7aadc4522d79320e8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '248510db-59cd-4f03-a11e-7eb7d6ba0631', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=35ca25f9-4413-43e3-8adf-6ad1addcc5e4, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4b0d9367-8a78-4efb-8d87-7e7e422482a1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.873 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4b0d9367-8a78-4efb-8d87-7e7e422482a1 in datapath 05984f69-a29b-4ae1-86b5-956a2285c76f bound to our chassis#033[00m
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.874 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 05984f69-a29b-4ae1-86b5-956a2285c76f#033[00m
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.886 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bd498e35-b38c-44f1-9a66-21f2edf3ae30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.887 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap05984f69-a1 in ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.888 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap05984f69-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.888 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c71ac3b1-a7d5-44eb-8745-5cb792d97e2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.889 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[679c27e2-ff1b-44a0-b696-f9e077650e39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:27 np0005466012 systemd-machined[152114]: New machine qemu-13-instance-0000001a.
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.900 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[83722e37-e035-4c86-b5a0-1bbb79c72d23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.901 2 DEBUG nova.network.neutron [None req-f87ca83d-1434-4049-adfa-f33e5d563c3c 0dcd7cbb94b24a3db735224ac386fe9f cb6cbb8ebfd24293a479e872f08695ec - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:04:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:27Z|00058|binding|INFO|Setting lport 4b0d9367-8a78-4efb-8d87-7e7e422482a1 ovn-installed in OVS
Oct  2 08:04:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:27Z|00059|binding|INFO|Setting lport 4b0d9367-8a78-4efb-8d87-7e7e422482a1 up in Southbound
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.917 2 DEBUG oslo_concurrency.lockutils [None req-f87ca83d-1434-4049-adfa-f33e5d563c3c 0dcd7cbb94b24a3db735224ac386fe9f cb6cbb8ebfd24293a479e872f08695ec - - default default] Releasing lock "refresh_cache-2f45c912-e983-4648-aad2-9053167e0891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.920 2 DEBUG oslo_concurrency.lockutils [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Acquired lock "refresh_cache-2f45c912-e983-4648-aad2-9053167e0891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:04:27 np0005466012 nova_compute[192063]: 2025-10-02 12:04:27.921 2 DEBUG nova.network.neutron [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:04:27 np0005466012 systemd[1]: Started Virtual Machine qemu-13-instance-0000001a.
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.925 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[55f40933-7404-4f48-86d0-2a74a783577b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.952 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[570d8a98-8eab-4126-8399-6bbadc3ba724]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.958 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9b099a00-8252-4441-b135-8c328da31588]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:27 np0005466012 NetworkManager[51207]: <info>  [1759406667.9588] manager: (tap05984f69-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.992 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[e3ab1d30-8c96-4815-87e2-81914cac0506]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:27.994 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[9955f17f-8871-4a1f-90c1-a90dc11629a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:28 np0005466012 NetworkManager[51207]: <info>  [1759406668.0167] device (tap05984f69-a0): carrier: link connected
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.023 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[4decf470-8415-47ad-a83a-f0c8f2ba3f88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.048 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[111f97ba-7679-4f15-a6f6-0c02c5e394b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap05984f69-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:cf:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466162, 'reachable_time': 35425, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222465, 'error': None, 'target': 'ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.077 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4be70a5a-8cc9-4162-8724-38c3ec6d9816]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3e:cfaf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466162, 'tstamp': 466162}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222466, 'error': None, 'target': 'ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.110 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f07bdd6a-5619-4688-a4e7-bc5a2d90286f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap05984f69-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:cf:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466162, 'reachable_time': 35425, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222467, 'error': None, 'target': 'ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.155 2 DEBUG nova.network.neutron [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.157 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[03a30186-b393-4b43-aa1b-90064bf2a9a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.194 2 DEBUG nova.compute.manager [req-0a3734ef-eb99-4ba5-a0ea-5b92ad763254 req-4216e4a1-e721-4f86-ad29-ce220723c117 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Received event network-vif-plugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.204 2 DEBUG oslo_concurrency.lockutils [req-0a3734ef-eb99-4ba5-a0ea-5b92ad763254 req-4216e4a1-e721-4f86-ad29-ce220723c117 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.205 2 DEBUG oslo_concurrency.lockutils [req-0a3734ef-eb99-4ba5-a0ea-5b92ad763254 req-4216e4a1-e721-4f86-ad29-ce220723c117 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.205 2 DEBUG oslo_concurrency.lockutils [req-0a3734ef-eb99-4ba5-a0ea-5b92ad763254 req-4216e4a1-e721-4f86-ad29-ce220723c117 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.205 2 DEBUG nova.compute.manager [req-0a3734ef-eb99-4ba5-a0ea-5b92ad763254 req-4216e4a1-e721-4f86-ad29-ce220723c117 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Processing event network-vif-plugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.243 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[eb4d0ac6-a8aa-430f-a314-d9138e36c29a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.245 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05984f69-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.245 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.245 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap05984f69-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:28 np0005466012 NetworkManager[51207]: <info>  [1759406668.2480] manager: (tap05984f69-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Oct  2 08:04:28 np0005466012 kernel: tap05984f69-a0: entered promiscuous mode
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.250 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap05984f69-a0, col_values=(('external_ids', {'iface-id': '17cf4ced-dd06-4cd3-a025-79fddaf8b996'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:28Z|00060|binding|INFO|Releasing lport 17cf4ced-dd06-4cd3-a025-79fddaf8b996 from this chassis (sb_readonly=0)
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.271 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/05984f69-a29b-4ae1-86b5-956a2285c76f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/05984f69-a29b-4ae1-86b5-956a2285c76f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.272 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c391dc24-7a43-4128-bbf1-ae3b2b902674]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.273 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-05984f69-a29b-4ae1-86b5-956a2285c76f
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/05984f69-a29b-4ae1-86b5-956a2285c76f.pid.haproxy
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 05984f69-a29b-4ae1-86b5-956a2285c76f
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:04:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:28.275 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f', 'env', 'PROCESS_TAG=haproxy-05984f69-a29b-4ae1-86b5-956a2285c76f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/05984f69-a29b-4ae1-86b5-956a2285c76f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.397 2 DEBUG nova.network.neutron [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.415 2 DEBUG oslo_concurrency.lockutils [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Releasing lock "refresh_cache-2f45c912-e983-4648-aad2-9053167e0891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.415 2 DEBUG nova.compute.manager [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:04:28 np0005466012 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000019.scope: Deactivated successfully.
Oct  2 08:04:28 np0005466012 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000019.scope: Consumed 4.053s CPU time.
Oct  2 08:04:28 np0005466012 systemd-machined[152114]: Machine qemu-12-instance-00000019 terminated.
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.655 2 DEBUG nova.network.neutron [req-c069dfb6-9158-486d-9fd7-4e5fec486c52 req-93e014d0-d981-4bbf-bfe6-85b6a5070726 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Updated VIF entry in instance network info cache for port 4b0d9367-8a78-4efb-8d87-7e7e422482a1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.655 2 DEBUG nova.network.neutron [req-c069dfb6-9158-486d-9fd7-4e5fec486c52 req-93e014d0-d981-4bbf-bfe6-85b6a5070726 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Updating instance_info_cache with network_info: [{"id": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "address": "fa:16:3e:f2:c2:db", "network": {"id": "05984f69-a29b-4ae1-86b5-956a2285c76f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1023280971-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2012bd8680647e7aadc4522d79320e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0d9367-8a", "ovs_interfaceid": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.659 2 INFO nova.virt.libvirt.driver [-] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Instance destroyed successfully.#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.659 2 DEBUG nova.objects.instance [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lazy-loading 'resources' on Instance uuid 2f45c912-e983-4648-aad2-9053167e0891 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.671 2 DEBUG oslo_concurrency.lockutils [req-c069dfb6-9158-486d-9fd7-4e5fec486c52 req-93e014d0-d981-4bbf-bfe6-85b6a5070726 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.676 2 INFO nova.virt.libvirt.driver [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Deleting instance files /var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891_del#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.678 2 INFO nova.virt.libvirt.driver [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Deletion of /var/lib/nova/instances/2f45c912-e983-4648-aad2-9053167e0891_del complete#033[00m
Oct  2 08:04:28 np0005466012 podman[222507]: 2025-10-02 12:04:28.594832738 +0000 UTC m=+0.019548604 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.714 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406668.7144914, 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.715 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] VM Started (Lifecycle Event)#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.717 2 DEBUG nova.compute.manager [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.722 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.726 2 INFO nova.virt.libvirt.driver [-] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Instance spawned successfully.#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.726 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:04:28 np0005466012 podman[222507]: 2025-10-02 12:04:28.730616389 +0000 UTC m=+0.155332225 container create 2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.745 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.750 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.755 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.755 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.756 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.756 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.757 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.757 2 DEBUG nova.virt.libvirt.driver [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.761 2 INFO nova.compute.manager [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.761 2 DEBUG oslo.service.loopingcall [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.761 2 DEBUG nova.compute.manager [-] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.762 2 DEBUG nova.network.neutron [-] [instance: 2f45c912-e983-4648-aad2-9053167e0891] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.768 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.768 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406668.7145717, 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.768 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:04:28 np0005466012 systemd[1]: Started libpod-conmon-2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8.scope.
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.789 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.792 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406668.7190766, 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.792 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:04:28 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:04:28 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6b8d57b24a952dea4ecdaaca24dfe471646759edd44d8790f25c388fd6d5112/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.813 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.816 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.837 2 INFO nova.compute.manager [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Took 5.10 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.837 2 DEBUG nova.compute.manager [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.845 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:04:28 np0005466012 podman[222507]: 2025-10-02 12:04:28.84808581 +0000 UTC m=+0.272801656 container init 2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:04:28 np0005466012 podman[222507]: 2025-10-02 12:04:28.853800693 +0000 UTC m=+0.278516519 container start 2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:04:28 np0005466012 neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f[222530]: [NOTICE]   (222534) : New worker (222536) forked
Oct  2 08:04:28 np0005466012 neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f[222530]: [NOTICE]   (222534) : Loading success.
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.949 2 INFO nova.compute.manager [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Took 6.31 seconds to build instance.#033[00m
Oct  2 08:04:28 np0005466012 nova_compute[192063]: 2025-10-02 12:04:28.983 2 DEBUG oslo_concurrency.lockutils [None req-542955cd-41c1-4999-a8be-7babe76dc64a 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.478s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.257 2 DEBUG nova.network.neutron [-] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.286 2 DEBUG nova.network.neutron [-] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.306 2 INFO nova.compute.manager [-] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Took 0.54 seconds to deallocate network for instance.#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.410 2 DEBUG oslo_concurrency.lockutils [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.411 2 DEBUG oslo_concurrency.lockutils [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.471 2 DEBUG nova.compute.provider_tree [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.486 2 DEBUG nova.scheduler.client.report [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.522 2 DEBUG oslo_concurrency.lockutils [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.559 2 INFO nova.scheduler.client.report [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Deleted allocations for instance 2f45c912-e983-4648-aad2-9053167e0891#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.629 2 DEBUG oslo_concurrency.lockutils [None req-8190661a-062a-44a2-b355-d233ceaae1a0 e2746697dd354be38ceccd1da320cc7a 81bd4f1fd64148c18d994cc6e37f5377 - - default default] Lock "2f45c912-e983-4648-aad2-9053167e0891" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.883 2 DEBUG oslo_concurrency.lockutils [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Acquiring lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.884 2 DEBUG oslo_concurrency.lockutils [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.884 2 DEBUG oslo_concurrency.lockutils [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Acquiring lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.884 2 DEBUG oslo_concurrency.lockutils [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.884 2 DEBUG oslo_concurrency.lockutils [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.895 2 INFO nova.compute.manager [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Terminating instance#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.909 2 DEBUG nova.compute.manager [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:04:29 np0005466012 kernel: tap4b0d9367-8a (unregistering): left promiscuous mode
Oct  2 08:04:29 np0005466012 NetworkManager[51207]: <info>  [1759406669.9238] device (tap4b0d9367-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:29Z|00061|binding|INFO|Releasing lport 4b0d9367-8a78-4efb-8d87-7e7e422482a1 from this chassis (sb_readonly=0)
Oct  2 08:04:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:29Z|00062|binding|INFO|Setting lport 4b0d9367-8a78-4efb-8d87-7e7e422482a1 down in Southbound
Oct  2 08:04:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:29Z|00063|binding|INFO|Removing iface tap4b0d9367-8a ovn-installed in OVS
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:29.946 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:c2:db 10.100.0.6'], port_security=['fa:16:3e:f2:c2:db 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6e6bfda6-0d7c-4526-a660-1cd0f7360e4d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05984f69-a29b-4ae1-86b5-956a2285c76f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2012bd8680647e7aadc4522d79320e8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '248510db-59cd-4f03-a11e-7eb7d6ba0631', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=35ca25f9-4413-43e3-8adf-6ad1addcc5e4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4b0d9367-8a78-4efb-8d87-7e7e422482a1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:04:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:29.947 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4b0d9367-8a78-4efb-8d87-7e7e422482a1 in datapath 05984f69-a29b-4ae1-86b5-956a2285c76f unbound from our chassis#033[00m
Oct  2 08:04:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:29.949 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 05984f69-a29b-4ae1-86b5-956a2285c76f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:04:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:29.950 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f4279407-8f4a-49c8-9f9d-9e08e3f0e8c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:29.951 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f namespace which is not needed anymore#033[00m
Oct  2 08:04:29 np0005466012 nova_compute[192063]: 2025-10-02 12:04:29.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:29 np0005466012 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Oct  2 08:04:29 np0005466012 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001a.scope: Consumed 1.875s CPU time.
Oct  2 08:04:29 np0005466012 systemd-machined[152114]: Machine qemu-13-instance-0000001a terminated.
Oct  2 08:04:30 np0005466012 podman[222546]: 2025-10-02 12:04:30.015486472 +0000 UTC m=+0.056694057 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:04:30 np0005466012 neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f[222530]: [NOTICE]   (222534) : haproxy version is 2.8.14-c23fe91
Oct  2 08:04:30 np0005466012 neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f[222530]: [NOTICE]   (222534) : path to executable is /usr/sbin/haproxy
Oct  2 08:04:30 np0005466012 neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f[222530]: [WARNING]  (222534) : Exiting Master process...
Oct  2 08:04:30 np0005466012 neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f[222530]: [WARNING]  (222534) : Exiting Master process...
Oct  2 08:04:30 np0005466012 neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f[222530]: [ALERT]    (222534) : Current worker (222536) exited with code 143 (Terminated)
Oct  2 08:04:30 np0005466012 neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f[222530]: [WARNING]  (222534) : All workers exited. Exiting... (0)
Oct  2 08:04:30 np0005466012 systemd[1]: libpod-2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8.scope: Deactivated successfully.
Oct  2 08:04:30 np0005466012 podman[222586]: 2025-10-02 12:04:30.075254741 +0000 UTC m=+0.044831230 container died 2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:04:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8-userdata-shm.mount: Deactivated successfully.
Oct  2 08:04:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay-c6b8d57b24a952dea4ecdaaca24dfe471646759edd44d8790f25c388fd6d5112-merged.mount: Deactivated successfully.
Oct  2 08:04:30 np0005466012 podman[222586]: 2025-10-02 12:04:30.113991907 +0000 UTC m=+0.083568396 container cleanup 2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:04:30 np0005466012 systemd[1]: libpod-conmon-2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8.scope: Deactivated successfully.
Oct  2 08:04:30 np0005466012 kernel: tap4b0d9367-8a: entered promiscuous mode
Oct  2 08:04:30 np0005466012 NetworkManager[51207]: <info>  [1759406670.1255] manager: (tap4b0d9367-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Oct  2 08:04:30 np0005466012 systemd-udevd[222545]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:30Z|00064|binding|INFO|Claiming lport 4b0d9367-8a78-4efb-8d87-7e7e422482a1 for this chassis.
Oct  2 08:04:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:30Z|00065|binding|INFO|4b0d9367-8a78-4efb-8d87-7e7e422482a1: Claiming fa:16:3e:f2:c2:db 10.100.0.6
Oct  2 08:04:30 np0005466012 kernel: tap4b0d9367-8a (unregistering): left promiscuous mode
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.135 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:c2:db 10.100.0.6'], port_security=['fa:16:3e:f2:c2:db 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6e6bfda6-0d7c-4526-a660-1cd0f7360e4d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05984f69-a29b-4ae1-86b5-956a2285c76f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2012bd8680647e7aadc4522d79320e8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '248510db-59cd-4f03-a11e-7eb7d6ba0631', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=35ca25f9-4413-43e3-8adf-6ad1addcc5e4, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4b0d9367-8a78-4efb-8d87-7e7e422482a1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:04:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:30Z|00066|binding|INFO|Setting lport 4b0d9367-8a78-4efb-8d87-7e7e422482a1 ovn-installed in OVS
Oct  2 08:04:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:30Z|00067|binding|INFO|Setting lport 4b0d9367-8a78-4efb-8d87-7e7e422482a1 up in Southbound
Oct  2 08:04:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:30Z|00068|binding|INFO|Releasing lport 4b0d9367-8a78-4efb-8d87-7e7e422482a1 from this chassis (sb_readonly=1)
Oct  2 08:04:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:30Z|00069|if_status|INFO|Not setting lport 4b0d9367-8a78-4efb-8d87-7e7e422482a1 down as sb is readonly
Oct  2 08:04:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:30Z|00070|binding|INFO|Removing iface tap4b0d9367-8a ovn-installed in OVS
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:30Z|00071|binding|INFO|Releasing lport 4b0d9367-8a78-4efb-8d87-7e7e422482a1 from this chassis (sb_readonly=0)
Oct  2 08:04:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:30Z|00072|binding|INFO|Setting lport 4b0d9367-8a78-4efb-8d87-7e7e422482a1 down in Southbound
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.165 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:c2:db 10.100.0.6'], port_security=['fa:16:3e:f2:c2:db 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6e6bfda6-0d7c-4526-a660-1cd0f7360e4d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05984f69-a29b-4ae1-86b5-956a2285c76f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2012bd8680647e7aadc4522d79320e8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '248510db-59cd-4f03-a11e-7eb7d6ba0631', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=35ca25f9-4413-43e3-8adf-6ad1addcc5e4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4b0d9367-8a78-4efb-8d87-7e7e422482a1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.177 2 INFO nova.virt.libvirt.driver [-] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Instance destroyed successfully.#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.178 2 DEBUG nova.objects.instance [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lazy-loading 'resources' on Instance uuid 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:04:30 np0005466012 podman[222618]: 2025-10-02 12:04:30.181734908 +0000 UTC m=+0.042530808 container remove 2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.186 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[60d6f111-5353-40c2-85ae-2bdfffdadb23]: (4, ('Thu Oct  2 12:04:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f (2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8)\n2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8\nThu Oct  2 12:04:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f (2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8)\n2a16f9ac605c67c81c9e6df3e5c8309cc237dbcde5620c49776842aa981d2ec8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.187 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7dd9554f-4534-483e-9ba0-6accc52545c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.188 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05984f69-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:30 np0005466012 kernel: tap05984f69-a0: left promiscuous mode
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.192 2 DEBUG nova.virt.libvirt.vif [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-628395693',display_name='tempest-ImagesNegativeTestJSON-server-628395693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-628395693',id=26,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:04:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d2012bd8680647e7aadc4522d79320e8',ramdisk_id='',reservation_id='r-azkqqxle',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesNegativeTestJSON-2051634032',owner_user_name='tempest-ImagesNegativeTestJSON-2051634032-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:04:28Z,user_data=None,user_id='73748ced15f1405a948a15e278017df0',uuid=6e6bfda6-0d7c-4526-a660-1cd0f7360e4d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "address": "fa:16:3e:f2:c2:db", "network": {"id": "05984f69-a29b-4ae1-86b5-956a2285c76f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1023280971-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2012bd8680647e7aadc4522d79320e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0d9367-8a", "ovs_interfaceid": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.192 2 DEBUG nova.network.os_vif_util [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Converting VIF {"id": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "address": "fa:16:3e:f2:c2:db", "network": {"id": "05984f69-a29b-4ae1-86b5-956a2285c76f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1023280971-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2012bd8680647e7aadc4522d79320e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0d9367-8a", "ovs_interfaceid": "4b0d9367-8a78-4efb-8d87-7e7e422482a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.193 2 DEBUG nova.network.os_vif_util [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:c2:db,bridge_name='br-int',has_traffic_filtering=True,id=4b0d9367-8a78-4efb-8d87-7e7e422482a1,network=Network(05984f69-a29b-4ae1-86b5-956a2285c76f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0d9367-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.198 2 DEBUG os_vif [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:c2:db,bridge_name='br-int',has_traffic_filtering=True,id=4b0d9367-8a78-4efb-8d87-7e7e422482a1,network=Network(05984f69-a29b-4ae1-86b5-956a2285c76f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0d9367-8a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.200 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b0d9367-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.205 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4e41a65e-dc5b-4dbb-b92a-4b1ef0f68a56]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.205 2 INFO os_vif [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:c2:db,bridge_name='br-int',has_traffic_filtering=True,id=4b0d9367-8a78-4efb-8d87-7e7e422482a1,network=Network(05984f69-a29b-4ae1-86b5-956a2285c76f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0d9367-8a')#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.206 2 INFO nova.virt.libvirt.driver [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Deleting instance files /var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d_del#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.206 2 INFO nova.virt.libvirt.driver [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Deletion of /var/lib/nova/instances/6e6bfda6-0d7c-4526-a660-1cd0f7360e4d_del complete#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.236 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[202e14b5-357e-427b-96f0-6ec1982e817d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.238 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2546d403-4983-40c8-a1c3-f2ca72aed891]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.250 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[48e0f547-d1cf-4e0c-991e-128afbf33a07]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466155, 'reachable_time': 31843, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222643, 'error': None, 'target': 'ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:30 np0005466012 systemd[1]: run-netns-ovnmeta\x2d05984f69\x2da29b\x2d4ae1\x2d86b5\x2d956a2285c76f.mount: Deactivated successfully.
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.253 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-05984f69-a29b-4ae1-86b5-956a2285c76f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.253 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[65dc90b9-e883-4d54-9898-e7ce45849163]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.255 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4b0d9367-8a78-4efb-8d87-7e7e422482a1 in datapath 05984f69-a29b-4ae1-86b5-956a2285c76f unbound from our chassis#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.256 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 05984f69-a29b-4ae1-86b5-956a2285c76f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.257 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a7831587-8b1f-43f4-afbd-b6d213fa5b9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.257 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4b0d9367-8a78-4efb-8d87-7e7e422482a1 in datapath 05984f69-a29b-4ae1-86b5-956a2285c76f unbound from our chassis#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.259 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 05984f69-a29b-4ae1-86b5-956a2285c76f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:04:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:30.259 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[42aad312-cae4-4cb1-889a-e7e8a087519e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.294 2 INFO nova.compute.manager [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.295 2 DEBUG oslo.service.loopingcall [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.295 2 DEBUG nova.compute.manager [-] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.295 2 DEBUG nova.network.neutron [-] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.370 2 DEBUG nova.compute.manager [req-eb9f4112-b18e-4748-a7a0-7b6c9610229e req-1ecd0ac4-2849-4fe4-97a3-e308838791de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Received event network-vif-plugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.371 2 DEBUG oslo_concurrency.lockutils [req-eb9f4112-b18e-4748-a7a0-7b6c9610229e req-1ecd0ac4-2849-4fe4-97a3-e308838791de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.371 2 DEBUG oslo_concurrency.lockutils [req-eb9f4112-b18e-4748-a7a0-7b6c9610229e req-1ecd0ac4-2849-4fe4-97a3-e308838791de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.372 2 DEBUG oslo_concurrency.lockutils [req-eb9f4112-b18e-4748-a7a0-7b6c9610229e req-1ecd0ac4-2849-4fe4-97a3-e308838791de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.372 2 DEBUG nova.compute.manager [req-eb9f4112-b18e-4748-a7a0-7b6c9610229e req-1ecd0ac4-2849-4fe4-97a3-e308838791de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] No waiting events found dispatching network-vif-plugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:04:30 np0005466012 nova_compute[192063]: 2025-10-02 12:04:30.373 2 WARNING nova.compute.manager [req-eb9f4112-b18e-4748-a7a0-7b6c9610229e req-1ecd0ac4-2849-4fe4-97a3-e308838791de 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Received unexpected event network-vif-plugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:04:31 np0005466012 nova_compute[192063]: 2025-10-02 12:04:31.368 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:04:31 np0005466012 nova_compute[192063]: 2025-10-02 12:04:31.392 2 WARNING nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.#033[00m
Oct  2 08:04:31 np0005466012 nova_compute[192063]: 2025-10-02 12:04:31.393 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Triggering sync for uuid 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 08:04:31 np0005466012 nova_compute[192063]: 2025-10-02 12:04:31.393 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:31 np0005466012 nova_compute[192063]: 2025-10-02 12:04:31.695 2 DEBUG nova.network.neutron [-] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:04:31 np0005466012 nova_compute[192063]: 2025-10-02 12:04:31.716 2 INFO nova.compute.manager [-] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Took 1.42 seconds to deallocate network for instance.#033[00m
Oct  2 08:04:31 np0005466012 nova_compute[192063]: 2025-10-02 12:04:31.811 2 DEBUG oslo_concurrency.lockutils [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:31 np0005466012 nova_compute[192063]: 2025-10-02 12:04:31.811 2 DEBUG oslo_concurrency.lockutils [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:31 np0005466012 nova_compute[192063]: 2025-10-02 12:04:31.882 2 DEBUG nova.compute.provider_tree [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:04:31 np0005466012 nova_compute[192063]: 2025-10-02 12:04:31.897 2 DEBUG nova.scheduler.client.report [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:04:31 np0005466012 nova_compute[192063]: 2025-10-02 12:04:31.921 2 DEBUG oslo_concurrency.lockutils [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:31 np0005466012 nova_compute[192063]: 2025-10-02 12:04:31.944 2 INFO nova.scheduler.client.report [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Deleted allocations for instance 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.014 2 DEBUG oslo_concurrency.lockutils [None req-e3b57f31-2925-4b3e-8483-476c33e2d35d 73748ced15f1405a948a15e278017df0 d2012bd8680647e7aadc4522d79320e8 - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.015 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.015 2 INFO nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.015 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.501 2 DEBUG nova.compute.manager [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Received event network-vif-unplugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.501 2 DEBUG oslo_concurrency.lockutils [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.501 2 DEBUG oslo_concurrency.lockutils [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.502 2 DEBUG oslo_concurrency.lockutils [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.502 2 DEBUG nova.compute.manager [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] No waiting events found dispatching network-vif-unplugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.502 2 WARNING nova.compute.manager [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Received unexpected event network-vif-unplugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.502 2 DEBUG nova.compute.manager [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Received event network-vif-plugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.502 2 DEBUG oslo_concurrency.lockutils [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.503 2 DEBUG oslo_concurrency.lockutils [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.503 2 DEBUG oslo_concurrency.lockutils [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.503 2 DEBUG nova.compute.manager [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] No waiting events found dispatching network-vif-plugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.503 2 WARNING nova.compute.manager [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Received unexpected event network-vif-plugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 for instance with vm_state deleted and task_state None.
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.503 2 DEBUG nova.compute.manager [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Received event network-vif-plugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.504 2 DEBUG oslo_concurrency.lockutils [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.504 2 DEBUG oslo_concurrency.lockutils [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.504 2 DEBUG oslo_concurrency.lockutils [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e6bfda6-0d7c-4526-a660-1cd0f7360e4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.504 2 DEBUG nova.compute.manager [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] No waiting events found dispatching network-vif-plugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.504 2 WARNING nova.compute.manager [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Received unexpected event network-vif-plugged-4b0d9367-8a78-4efb-8d87-7e7e422482a1 for instance with vm_state deleted and task_state None.
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.504 2 DEBUG nova.compute.manager [req-fd0d9180-adfe-4bc6-b9d5-fe260da8c6d2 req-7f0d25aa-bb01-4bf8-b7b3-2ebca50d1e7d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Received event network-vif-deleted-4b0d9367-8a78-4efb-8d87-7e7e422482a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.550 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Acquiring lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.550 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.567 2 DEBUG nova.compute.manager [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.731 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.731 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.739 2 DEBUG nova.virt.hardware [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.739 2 INFO nova.compute.claims [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Claim successful on node compute-1.ctlplane.example.com
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.880 2 DEBUG nova.compute.provider_tree [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.891 2 DEBUG nova.scheduler.client.report [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.911 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.912 2 DEBUG nova.compute.manager [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.976 2 DEBUG nova.compute.manager [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:04:32 np0005466012 nova_compute[192063]: 2025-10-02 12:04:32.976 2 DEBUG nova.network.neutron [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.000 2 INFO nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.017 2 DEBUG nova.compute.manager [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.132 2 DEBUG nova.compute.manager [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.133 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.134 2 INFO nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Creating image(s)
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.134 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Acquiring lock "/var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.134 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "/var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.135 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "/var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.146 2 DEBUG oslo_concurrency.processutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.217 2 DEBUG oslo_concurrency.processutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.219 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.219 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.233 2 DEBUG oslo_concurrency.processutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.289 2 DEBUG oslo_concurrency.processutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.290 2 DEBUG oslo_concurrency.processutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.321 2 DEBUG oslo_concurrency.processutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.322 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.322 2 DEBUG oslo_concurrency.processutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.375 2 DEBUG oslo_concurrency.processutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.376 2 DEBUG nova.virt.disk.api [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Checking if we can resize image /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.376 2 DEBUG oslo_concurrency.processutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.429 2 DEBUG oslo_concurrency.processutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.430 2 DEBUG nova.virt.disk.api [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Cannot resize image /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.430 2 DEBUG nova.objects.instance [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lazy-loading 'migration_context' on Instance uuid a6bb5263-b0c7-4282-8e02-3503fd778e6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.448 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.448 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Ensure instance console log exists: /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.448 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.449 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.449 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:04:33 np0005466012 nova_compute[192063]: 2025-10-02 12:04:33.508 2 DEBUG nova.policy [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5f75195e56504673bd403ce69cbc28ca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:04:34 np0005466012 nova_compute[192063]: 2025-10-02 12:04:34.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:04:34 np0005466012 nova_compute[192063]: 2025-10-02 12:04:34.712 2 DEBUG nova.network.neutron [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Successfully updated port: 5e772b33-6577-4ba1-b187-e4779ef49ed6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:04:34 np0005466012 nova_compute[192063]: 2025-10-02 12:04:34.737 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Acquiring lock "refresh_cache-a6bb5263-b0c7-4282-8e02-3503fd778e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:04:34 np0005466012 nova_compute[192063]: 2025-10-02 12:04:34.737 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Acquired lock "refresh_cache-a6bb5263-b0c7-4282-8e02-3503fd778e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:04:34 np0005466012 nova_compute[192063]: 2025-10-02 12:04:34.738 2 DEBUG nova.network.neutron [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:04:34 np0005466012 nova_compute[192063]: 2025-10-02 12:04:34.792 2 DEBUG nova.compute.manager [req-346518d7-d8d7-45b0-9e87-4b58a39f7e35 req-c7c87b7b-3b0c-42b4-8800-7b17e37548ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-changed-5e772b33-6577-4ba1-b187-e4779ef49ed6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:04:34 np0005466012 nova_compute[192063]: 2025-10-02 12:04:34.793 2 DEBUG nova.compute.manager [req-346518d7-d8d7-45b0-9e87-4b58a39f7e35 req-c7c87b7b-3b0c-42b4-8800-7b17e37548ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Refreshing instance network info cache due to event network-changed-5e772b33-6577-4ba1-b187-e4779ef49ed6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:04:34 np0005466012 nova_compute[192063]: 2025-10-02 12:04:34.793 2 DEBUG oslo_concurrency.lockutils [req-346518d7-d8d7-45b0-9e87-4b58a39f7e35 req-c7c87b7b-3b0c-42b4-8800-7b17e37548ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-a6bb5263-b0c7-4282-8e02-3503fd778e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:04:34 np0005466012 nova_compute[192063]: 2025-10-02 12:04:34.897 2 DEBUG nova.network.neutron [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:04:35 np0005466012 podman[222659]: 2025-10-02 12:04:35.129415973 +0000 UTC m=+0.051480909 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.776 2 DEBUG nova.network.neutron [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Updating instance_info_cache with network_info: [{"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.803 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Releasing lock "refresh_cache-a6bb5263-b0c7-4282-8e02-3503fd778e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.804 2 DEBUG nova.compute.manager [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Instance network_info: |[{"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.804 2 DEBUG oslo_concurrency.lockutils [req-346518d7-d8d7-45b0-9e87-4b58a39f7e35 req-c7c87b7b-3b0c-42b4-8800-7b17e37548ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-a6bb5263-b0c7-4282-8e02-3503fd778e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.805 2 DEBUG nova.network.neutron [req-346518d7-d8d7-45b0-9e87-4b58a39f7e35 req-c7c87b7b-3b0c-42b4-8800-7b17e37548ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Refreshing network info cache for port 5e772b33-6577-4ba1-b187-e4779ef49ed6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.808 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Start _get_guest_xml network_info=[{"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.813 2 WARNING nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.817 2 DEBUG nova.virt.libvirt.host [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.818 2 DEBUG nova.virt.libvirt.host [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.820 2 DEBUG nova.virt.libvirt.host [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.821 2 DEBUG nova.virt.libvirt.host [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.822 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.822 2 DEBUG nova.virt.hardware [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.823 2 DEBUG nova.virt.hardware [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.823 2 DEBUG nova.virt.hardware [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.823 2 DEBUG nova.virt.hardware [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.823 2 DEBUG nova.virt.hardware [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.824 2 DEBUG nova.virt.hardware [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.824 2 DEBUG nova.virt.hardware [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.824 2 DEBUG nova.virt.hardware [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.824 2 DEBUG nova.virt.hardware [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.825 2 DEBUG nova.virt.hardware [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.825 2 DEBUG nova.virt.hardware [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.828 2 DEBUG nova.virt.libvirt.vif [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:04:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1169459074',display_name='tempest-LiveMigrationTest-server-1169459074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1169459074',id=28,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7cb78d24d1a4511a59ced45ccc4a1c7',ramdisk_id='',reservation_id='r-b554y970',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-1666170212',owner_user_name='tempest-LiveMigrationTest-1666170212-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:04:33Z,user_data=None,user_id='5f75195e56504673bd403ce69cbc28ca',uuid=a6bb5263-b0c7-4282-8e02-3503fd778e6f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.828 2 DEBUG nova.network.os_vif_util [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Converting VIF {"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.829 2 DEBUG nova.network.os_vif_util [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:b4:ad,bridge_name='br-int',has_traffic_filtering=True,id=5e772b33-6577-4ba1-b187-e4779ef49ed6,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5e772b33-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.830 2 DEBUG nova.objects.instance [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lazy-loading 'pci_devices' on Instance uuid a6bb5263-b0c7-4282-8e02-3503fd778e6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.849 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  <uuid>a6bb5263-b0c7-4282-8e02-3503fd778e6f</uuid>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  <name>instance-0000001c</name>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <nova:name>tempest-LiveMigrationTest-server-1169459074</nova:name>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:04:35</nova:creationTime>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:04:35 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:        <nova:user uuid="5f75195e56504673bd403ce69cbc28ca">tempest-LiveMigrationTest-1666170212-project-member</nova:user>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:        <nova:project uuid="f7cb78d24d1a4511a59ced45ccc4a1c7">tempest-LiveMigrationTest-1666170212</nova:project>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:        <nova:port uuid="5e772b33-6577-4ba1-b187-e4779ef49ed6">
Oct  2 08:04:35 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <entry name="serial">a6bb5263-b0c7-4282-8e02-3503fd778e6f</entry>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <entry name="uuid">a6bb5263-b0c7-4282-8e02-3503fd778e6f</entry>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk.config"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:09:b4:ad"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <target dev="tap5e772b33-65"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/console.log" append="off"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:04:35 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:04:35 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:04:35 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:04:35 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.850 2 DEBUG nova.compute.manager [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Preparing to wait for external event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.850 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Acquiring lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.850 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.851 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.852 2 DEBUG nova.virt.libvirt.vif [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:04:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1169459074',display_name='tempest-LiveMigrationTest-server-1169459074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1169459074',id=28,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7cb78d24d1a4511a59ced45ccc4a1c7',ramdisk_id='',reservation_id='r-b554y970',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-1666170212',owner_user_name='tempest-LiveMigrationTest-1666170212-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:04:33Z,user_data=None,user_id='5f75195e56504673bd403ce69cbc28ca',uuid=a6bb5263-b0c7-4282-8e02-3503fd778e6f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.852 2 DEBUG nova.network.os_vif_util [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Converting VIF {"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.853 2 DEBUG nova.network.os_vif_util [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:b4:ad,bridge_name='br-int',has_traffic_filtering=True,id=5e772b33-6577-4ba1-b187-e4779ef49ed6,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5e772b33-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.853 2 DEBUG os_vif [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:b4:ad,bridge_name='br-int',has_traffic_filtering=True,id=5e772b33-6577-4ba1-b187-e4779ef49ed6,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5e772b33-65') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.854 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.854 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.857 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e772b33-65, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.857 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5e772b33-65, col_values=(('external_ids', {'iface-id': '5e772b33-6577-4ba1-b187-e4779ef49ed6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:09:b4:ad', 'vm-uuid': 'a6bb5263-b0c7-4282-8e02-3503fd778e6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:35 np0005466012 NetworkManager[51207]: <info>  [1759406675.8607] manager: (tap5e772b33-65): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.865 2 INFO os_vif [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:b4:ad,bridge_name='br-int',has_traffic_filtering=True,id=5e772b33-6577-4ba1-b187-e4779ef49ed6,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5e772b33-65')#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.944 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.944 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.945 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] No VIF found with MAC fa:16:3e:09:b4:ad, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:04:35 np0005466012 nova_compute[192063]: 2025-10-02 12:04:35.945 2 INFO nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Using config drive#033[00m
Oct  2 08:04:36 np0005466012 nova_compute[192063]: 2025-10-02 12:04:36.410 2 INFO nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Creating config drive at /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk.config#033[00m
Oct  2 08:04:36 np0005466012 nova_compute[192063]: 2025-10-02 12:04:36.415 2 DEBUG oslo_concurrency.processutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptdj2wb0n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:36 np0005466012 nova_compute[192063]: 2025-10-02 12:04:36.540 2 DEBUG oslo_concurrency.processutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptdj2wb0n" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:36 np0005466012 kernel: tap5e772b33-65: entered promiscuous mode
Oct  2 08:04:36 np0005466012 nova_compute[192063]: 2025-10-02 12:04:36.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:36 np0005466012 NetworkManager[51207]: <info>  [1759406676.6057] manager: (tap5e772b33-65): new Tun device (/org/freedesktop/NetworkManager/Devices/41)
Oct  2 08:04:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:36Z|00073|binding|INFO|Claiming lport 5e772b33-6577-4ba1-b187-e4779ef49ed6 for this chassis.
Oct  2 08:04:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:36Z|00074|binding|INFO|5e772b33-6577-4ba1-b187-e4779ef49ed6: Claiming fa:16:3e:09:b4:ad 10.100.0.5
Oct  2 08:04:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:36Z|00075|binding|INFO|Claiming lport 6ca21a4d-cad8-4eff-bb5a-78e0705eaf1f for this chassis.
Oct  2 08:04:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:36Z|00076|binding|INFO|6ca21a4d-cad8-4eff-bb5a-78e0705eaf1f: Claiming fa:16:3e:90:47:36 19.80.0.218
Oct  2 08:04:36 np0005466012 nova_compute[192063]: 2025-10-02 12:04:36.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:36 np0005466012 systemd-udevd[222698]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:04:36 np0005466012 NetworkManager[51207]: <info>  [1759406676.6489] device (tap5e772b33-65): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:04:36 np0005466012 NetworkManager[51207]: <info>  [1759406676.6502] device (tap5e772b33-65): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:04:36 np0005466012 systemd-machined[152114]: New machine qemu-14-instance-0000001c.
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.651 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:47:36 19.80.0.218'], port_security=['fa:16:3e:90:47:36 19.80.0.218'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['5e772b33-6577-4ba1-b187-e4779ef49ed6'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-890758388', 'neutron:cidrs': '19.80.0.218/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c91b95f8-b43d-450e-bf75-7418a7f0c3c0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-890758388', 'neutron:project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a459d514-aab4-4030-9850-e066abdeaccc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=a6f25993-8956-421b-9333-413f987f6201, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6ca21a4d-cad8-4eff-bb5a-78e0705eaf1f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.652 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:b4:ad 10.100.0.5'], port_security=['fa:16:3e:09:b4:ad 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-983948384', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a6bb5263-b0c7-4282-8e02-3503fd778e6f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-664b6526-6df1-4024-9bab-37218e6c18bd', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-983948384', 'neutron:project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a459d514-aab4-4030-9850-e066abdeaccc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eddfb51e-1095-4b3d-a2dc-f2557cf13b11, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=5e772b33-6577-4ba1-b187-e4779ef49ed6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.653 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 6ca21a4d-cad8-4eff-bb5a-78e0705eaf1f in datapath c91b95f8-b43d-450e-bf75-7418a7f0c3c0 bound to our chassis#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.654 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c91b95f8-b43d-450e-bf75-7418a7f0c3c0#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.665 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6c4865a2-a1e4-42dd-87d4-ca1cfdf6f70a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.666 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc91b95f8-b1 in ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.667 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc91b95f8-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.667 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[65fcd62f-7db8-417a-befc-4cda532936b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.668 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a1171c65-5de2-4628-9746-0c4b3ffc2121]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.680 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[d994f3a1-d41a-47f0-ab01-d27e6c92a961]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 nova_compute[192063]: 2025-10-02 12:04:36.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:36 np0005466012 systemd[1]: Started Virtual Machine qemu-14-instance-0000001c.
Oct  2 08:04:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:36Z|00077|binding|INFO|Setting lport 5e772b33-6577-4ba1-b187-e4779ef49ed6 ovn-installed in OVS
Oct  2 08:04:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:36Z|00078|binding|INFO|Setting lport 5e772b33-6577-4ba1-b187-e4779ef49ed6 up in Southbound
Oct  2 08:04:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:36Z|00079|binding|INFO|Setting lport 6ca21a4d-cad8-4eff-bb5a-78e0705eaf1f up in Southbound
Oct  2 08:04:36 np0005466012 nova_compute[192063]: 2025-10-02 12:04:36.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.701 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[63c6a926-8758-4f39-a2be-0c9b7337bc02]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.737 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[32de4e84-721f-4fb5-8ceb-a45de9cd752b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 NetworkManager[51207]: <info>  [1759406676.7427] manager: (tapc91b95f8-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/42)
Oct  2 08:04:36 np0005466012 systemd-udevd[222703]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.743 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b4a23bfc-b57b-471f-ba46-d1afa3a099c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.778 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[93b555a4-8687-4c5a-816e-326038c6a0c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.780 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[96b9548a-08ee-4d76-92b7-52c1dd0b60a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 NetworkManager[51207]: <info>  [1759406676.7989] device (tapc91b95f8-b0): carrier: link connected
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.804 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[99976f98-3ade-4dfe-99b1-a5e385fa069b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.819 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[328b7972-a4fb-47b6-8527-84353c77addd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc91b95f8-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:23:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467041, 'reachable_time': 29516, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222734, 'error': None, 'target': 'ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.837 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e0f56996-bf95-42c2-9729-9b62855d7e16]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe02:2393'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467041, 'tstamp': 467041}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222735, 'error': None, 'target': 'ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.858 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ebf021-3d64-43bf-950e-df18275c1d5e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc91b95f8-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:23:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467041, 'reachable_time': 29516, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222737, 'error': None, 'target': 'ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.893 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bf6e7aea-8352-407e-9ff6-351285d337c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.946 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c43886d4-1324-43d8-bf90-f83ce746ef61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.948 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc91b95f8-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.948 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.949 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc91b95f8-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:36 np0005466012 NetworkManager[51207]: <info>  [1759406676.9511] manager: (tapc91b95f8-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Oct  2 08:04:36 np0005466012 kernel: tapc91b95f8-b0: entered promiscuous mode
Oct  2 08:04:36 np0005466012 nova_compute[192063]: 2025-10-02 12:04:36.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.955 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc91b95f8-b0, col_values=(('external_ids', {'iface-id': 'efb84ecd-545f-42a1-ad69-585d2998efac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:36Z|00080|binding|INFO|Releasing lport efb84ecd-545f-42a1-ad69-585d2998efac from this chassis (sb_readonly=0)
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.958 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c91b95f8-b43d-450e-bf75-7418a7f0c3c0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c91b95f8-b43d-450e-bf75-7418a7f0c3c0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.959 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b5fe5e6b-2d82-4f86-a25e-80d2e6cf0d6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.960 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-c91b95f8-b43d-450e-bf75-7418a7f0c3c0
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/c91b95f8-b43d-450e-bf75-7418a7f0c3c0.pid.haproxy
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID c91b95f8-b43d-450e-bf75-7418a7f0c3c0
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:04:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:36.962 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0', 'env', 'PROCESS_TAG=haproxy-c91b95f8-b43d-450e-bf75-7418a7f0c3c0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c91b95f8-b43d-450e-bf75-7418a7f0c3c0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:04:36 np0005466012 nova_compute[192063]: 2025-10-02 12:04:36.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.011 2 DEBUG nova.compute.manager [req-f2d3a569-291c-48a3-af8c-c998ecf6628c req-5a12f3ac-c18d-4da0-ad0b-c653b1e7842b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.012 2 DEBUG oslo_concurrency.lockutils [req-f2d3a569-291c-48a3-af8c-c998ecf6628c req-5a12f3ac-c18d-4da0-ad0b-c653b1e7842b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.012 2 DEBUG oslo_concurrency.lockutils [req-f2d3a569-291c-48a3-af8c-c998ecf6628c req-5a12f3ac-c18d-4da0-ad0b-c653b1e7842b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.013 2 DEBUG oslo_concurrency.lockutils [req-f2d3a569-291c-48a3-af8c-c998ecf6628c req-5a12f3ac-c18d-4da0-ad0b-c653b1e7842b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.014 2 DEBUG nova.compute.manager [req-f2d3a569-291c-48a3-af8c-c998ecf6628c req-5a12f3ac-c18d-4da0-ad0b-c653b1e7842b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Processing event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:04:37 np0005466012 podman[222775]: 2025-10-02 12:04:37.306020415 +0000 UTC m=+0.044479741 container create 6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:04:37 np0005466012 systemd[1]: Started libpod-conmon-6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4.scope.
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.339 2 DEBUG nova.compute.manager [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.341 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406677.3391275, a6bb5263-b0c7-4282-8e02-3503fd778e6f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.341 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] VM Started (Lifecycle Event)#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.344 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.348 2 INFO nova.virt.libvirt.driver [-] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Instance spawned successfully.#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.349 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:04:37 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:04:37 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b282380c1f7168be294bb4afeb2317d64ac58d8f5013f2ffa5b50ef3177f4f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.363 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.367 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.372 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.373 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:37 np0005466012 podman[222775]: 2025-10-02 12:04:37.373198141 +0000 UTC m=+0.111657497 container init 6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.373 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.374 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.374 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.374 2 DEBUG nova.virt.libvirt.driver [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:04:37 np0005466012 podman[222775]: 2025-10-02 12:04:37.281422517 +0000 UTC m=+0.019881863 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:04:37 np0005466012 podman[222775]: 2025-10-02 12:04:37.380328942 +0000 UTC m=+0.118788268 container start 6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:04:37 np0005466012 podman[222788]: 2025-10-02 12:04:37.389518227 +0000 UTC m=+0.056146562 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git)
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.399 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.399 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406677.3404841, a6bb5263-b0c7-4282-8e02-3503fd778e6f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.400 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:04:37 np0005466012 neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0[222791]: [NOTICE]   (222816) : New worker (222818) forked
Oct  2 08:04:37 np0005466012 neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0[222791]: [NOTICE]   (222816) : Loading success.
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.425 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.428 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406677.3435163, a6bb5263-b0c7-4282-8e02-3503fd778e6f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.428 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.447 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 5e772b33-6577-4ba1-b187-e4779ef49ed6 in datapath 664b6526-6df1-4024-9bab-37218e6c18bd unbound from our chassis#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.449 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 664b6526-6df1-4024-9bab-37218e6c18bd#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.451 2 INFO nova.compute.manager [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Took 4.32 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.452 2 DEBUG nova.compute.manager [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.453 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.459 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.463 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[084f6c43-4351-4827-9621-1a3676530d0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.464 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap664b6526-61 in ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.465 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap664b6526-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.465 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[21613cab-445d-4f11-93ae-399c4485c74f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.466 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[afbb9381-fc4d-4eb4-ac3b-63d151fd7deb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.477 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3bda76-7c01-42e4-891c-618f91e35432]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.488 2 DEBUG nova.network.neutron [req-346518d7-d8d7-45b0-9e87-4b58a39f7e35 req-c7c87b7b-3b0c-42b4-8800-7b17e37548ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Updated VIF entry in instance network info cache for port 5e772b33-6577-4ba1-b187-e4779ef49ed6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.489 2 DEBUG nova.network.neutron [req-346518d7-d8d7-45b0-9e87-4b58a39f7e35 req-c7c87b7b-3b0c-42b4-8800-7b17e37548ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Updating instance_info_cache with network_info: [{"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.502 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb68e47-65d1-4103-a8f6-3b5ea49dfebf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.518 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.532 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[7013f131-021a-4db2-9569-ef4e5f255552]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.533 2 DEBUG oslo_concurrency.lockutils [req-346518d7-d8d7-45b0-9e87-4b58a39f7e35 req-c7c87b7b-3b0c-42b4-8800-7b17e37548ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-a6bb5263-b0c7-4282-8e02-3503fd778e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.537 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e0564c4e-ac78-4a68-8868-92bba3092dba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 systemd-udevd[222721]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:04:37 np0005466012 NetworkManager[51207]: <info>  [1759406677.5390] manager: (tap664b6526-60): new Veth device (/org/freedesktop/NetworkManager/Devices/44)
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.568 2 INFO nova.compute.manager [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Took 4.90 seconds to build instance.#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.569 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[0020ec7d-3ca6-403f-80ae-850ed7270592]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.571 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[91e9c674-8558-4013-8ca7-8c89ed03d71e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.583 2 DEBUG oslo_concurrency.lockutils [None req-705bebd3-ddf2-4b22-b7de-56fb5e5b2000 5f75195e56504673bd403ce69cbc28ca f7cb78d24d1a4511a59ced45ccc4a1c7 - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:37 np0005466012 NetworkManager[51207]: <info>  [1759406677.5910] device (tap664b6526-60): carrier: link connected
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.594 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[8f79670c-9e99-411e-ab77-3f046a524534]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.611 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[07806a81-5c5a-4d71-bd18-6737ba391f34]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap664b6526-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:8c:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467120, 'reachable_time': 24480, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222837, 'error': None, 'target': 'ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.624 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dd94514e-ac3a-4ccb-9fe4-baaec479405f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:8c2f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467120, 'tstamp': 467120}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222838, 'error': None, 'target': 'ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.637 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[439a2a88-46bd-4b2c-8e95-6bd692036ae1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap664b6526-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:8c:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467120, 'reachable_time': 24480, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222839, 'error': None, 'target': 'ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.666 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ebd32f-3c83-4c2c-9b70-dd0acbdc2931]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.716 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[adcd4b33-8d67-4582-88be-7ac4428c6d2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.717 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap664b6526-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.718 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.718 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap664b6526-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:37 np0005466012 NetworkManager[51207]: <info>  [1759406677.7540] manager: (tap664b6526-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Oct  2 08:04:37 np0005466012 kernel: tap664b6526-60: entered promiscuous mode
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.756 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap664b6526-60, col_values=(('external_ids', {'iface-id': '2f7dc774-b718-4d9e-9655-fbc5ffa141e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:37 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:37Z|00081|binding|INFO|Releasing lport 2f7dc774-b718-4d9e-9655-fbc5ffa141e8 from this chassis (sb_readonly=0)
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.769 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/664b6526-6df1-4024-9bab-37218e6c18bd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/664b6526-6df1-4024-9bab-37218e6c18bd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.770 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4962ee71-4683-4818-b3f3-61f70621e20c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.771 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-664b6526-6df1-4024-9bab-37218e6c18bd
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/664b6526-6df1-4024-9bab-37218e6c18bd.pid.haproxy
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 664b6526-6df1-4024-9bab-37218e6c18bd
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:04:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:37.771 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd', 'env', 'PROCESS_TAG=haproxy-664b6526-6df1-4024-9bab-37218e6c18bd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/664b6526-6df1-4024-9bab-37218e6c18bd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:04:37 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:37Z|00082|binding|INFO|Releasing lport 2f7dc774-b718-4d9e-9655-fbc5ffa141e8 from this chassis (sb_readonly=0)
Oct  2 08:04:37 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:37Z|00083|binding|INFO|Releasing lport efb84ecd-545f-42a1-ad69-585d2998efac from this chassis (sb_readonly=0)
Oct  2 08:04:37 np0005466012 nova_compute[192063]: 2025-10-02 12:04:37.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:38 np0005466012 podman[222871]: 2025-10-02 12:04:38.108586529 +0000 UTC m=+0.054453947 container create cc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:04:38 np0005466012 systemd[1]: Started libpod-conmon-cc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081.scope.
Oct  2 08:04:38 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:04:38 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/37fe0fe9f45b32a20b94318ccce048d283ec139cd08eb2a00a6f0a29201b443d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:04:38 np0005466012 podman[222871]: 2025-10-02 12:04:38.079263355 +0000 UTC m=+0.025130823 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:04:38 np0005466012 podman[222871]: 2025-10-02 12:04:38.173479244 +0000 UTC m=+0.119346682 container init cc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:04:38 np0005466012 podman[222871]: 2025-10-02 12:04:38.178247772 +0000 UTC m=+0.124115200 container start cc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:04:38 np0005466012 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222887]: [NOTICE]   (222891) : New worker (222893) forked
Oct  2 08:04:38 np0005466012 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222887]: [NOTICE]   (222891) : Loading success.
Oct  2 08:04:39 np0005466012 podman[222903]: 2025-10-02 12:04:39.130419847 +0000 UTC m=+0.045589150 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:04:39 np0005466012 podman[222902]: 2025-10-02 12:04:39.155379145 +0000 UTC m=+0.071637707 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:04:39 np0005466012 nova_compute[192063]: 2025-10-02 12:04:39.329 2 DEBUG nova.compute.manager [req-5ce6700b-d2c1-4a30-abd5-965c6a71c8d6 req-2cc8538c-bf24-494b-a090-ea273c551bef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:39 np0005466012 nova_compute[192063]: 2025-10-02 12:04:39.330 2 DEBUG oslo_concurrency.lockutils [req-5ce6700b-d2c1-4a30-abd5-965c6a71c8d6 req-2cc8538c-bf24-494b-a090-ea273c551bef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:39 np0005466012 nova_compute[192063]: 2025-10-02 12:04:39.330 2 DEBUG oslo_concurrency.lockutils [req-5ce6700b-d2c1-4a30-abd5-965c6a71c8d6 req-2cc8538c-bf24-494b-a090-ea273c551bef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:39 np0005466012 nova_compute[192063]: 2025-10-02 12:04:39.331 2 DEBUG oslo_concurrency.lockutils [req-5ce6700b-d2c1-4a30-abd5-965c6a71c8d6 req-2cc8538c-bf24-494b-a090-ea273c551bef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:39 np0005466012 nova_compute[192063]: 2025-10-02 12:04:39.331 2 DEBUG nova.compute.manager [req-5ce6700b-d2c1-4a30-abd5-965c6a71c8d6 req-2cc8538c-bf24-494b-a090-ea273c551bef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] No waiting events found dispatching network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:04:39 np0005466012 nova_compute[192063]: 2025-10-02 12:04:39.331 2 WARNING nova.compute.manager [req-5ce6700b-d2c1-4a30-abd5-965c6a71c8d6 req-2cc8538c-bf24-494b-a090-ea273c551bef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received unexpected event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:04:39 np0005466012 nova_compute[192063]: 2025-10-02 12:04:39.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:40 np0005466012 nova_compute[192063]: 2025-10-02 12:04:40.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:43 np0005466012 nova_compute[192063]: 2025-10-02 12:04:43.658 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406668.6568549, 2f45c912-e983-4648-aad2-9053167e0891 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:43 np0005466012 nova_compute[192063]: 2025-10-02 12:04:43.659 2 INFO nova.compute.manager [-] [instance: 2f45c912-e983-4648-aad2-9053167e0891] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:04:43 np0005466012 nova_compute[192063]: 2025-10-02 12:04:43.871 2 DEBUG nova.compute.manager [None req-b7b96a0a-f12d-45a6-89e7-bb24f761bdd2 - - - - - -] [instance: 2f45c912-e983-4648-aad2-9053167e0891] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:44 np0005466012 nova_compute[192063]: 2025-10-02 12:04:43.999 2 DEBUG nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Check if temp file /var/lib/nova/instances/tmp1_i3q1h5 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Oct  2 08:04:44 np0005466012 nova_compute[192063]: 2025-10-02 12:04:44.011 2 DEBUG oslo_concurrency.processutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:44 np0005466012 nova_compute[192063]: 2025-10-02 12:04:44.086 2 DEBUG oslo_concurrency.processutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:44 np0005466012 nova_compute[192063]: 2025-10-02 12:04:44.089 2 DEBUG oslo_concurrency.processutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:44 np0005466012 nova_compute[192063]: 2025-10-02 12:04:44.162 2 DEBUG oslo_concurrency.processutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:44 np0005466012 nova_compute[192063]: 2025-10-02 12:04:44.164 2 DEBUG nova.compute.manager [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1_i3q1h5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a6bb5263-b0c7-4282-8e02-3503fd778e6f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Oct  2 08:04:44 np0005466012 nova_compute[192063]: 2025-10-02 12:04:44.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:45 np0005466012 nova_compute[192063]: 2025-10-02 12:04:45.176 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406670.175235, 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:45 np0005466012 nova_compute[192063]: 2025-10-02 12:04:45.177 2 INFO nova.compute.manager [-] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:04:45 np0005466012 nova_compute[192063]: 2025-10-02 12:04:45.238 2 DEBUG nova.compute.manager [None req-1b0d2440-8c8a-4709-8971-02865bc49e6f - - - - - -] [instance: 6e6bfda6-0d7c-4526-a660-1cd0f7360e4d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:45 np0005466012 nova_compute[192063]: 2025-10-02 12:04:45.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:48 np0005466012 nova_compute[192063]: 2025-10-02 12:04:48.434 2 DEBUG oslo_concurrency.processutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:48 np0005466012 nova_compute[192063]: 2025-10-02 12:04:48.491 2 DEBUG oslo_concurrency.processutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:48 np0005466012 nova_compute[192063]: 2025-10-02 12:04:48.492 2 DEBUG oslo_concurrency.processutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:04:48 np0005466012 nova_compute[192063]: 2025-10-02 12:04:48.550 2 DEBUG oslo_concurrency.processutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:04:49 np0005466012 nova_compute[192063]: 2025-10-02 12:04:49.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:49 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:49Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:09:b4:ad 10.100.0.5
Oct  2 08:04:49 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:49Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:09:b4:ad 10.100.0.5
Oct  2 08:04:50 np0005466012 nova_compute[192063]: 2025-10-02 12:04:50.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:51 np0005466012 systemd-logind[827]: New session 31 of user nova.
Oct  2 08:04:51 np0005466012 systemd[1]: Created slice User Slice of UID 42436.
Oct  2 08:04:51 np0005466012 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  2 08:04:51 np0005466012 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  2 08:04:51 np0005466012 systemd[1]: Starting User Manager for UID 42436...
Oct  2 08:04:51 np0005466012 systemd[222973]: Queued start job for default target Main User Target.
Oct  2 08:04:51 np0005466012 systemd[222973]: Created slice User Application Slice.
Oct  2 08:04:51 np0005466012 systemd[222973]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:04:51 np0005466012 systemd[222973]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 08:04:51 np0005466012 systemd[222973]: Reached target Paths.
Oct  2 08:04:51 np0005466012 systemd[222973]: Reached target Timers.
Oct  2 08:04:51 np0005466012 systemd[222973]: Starting D-Bus User Message Bus Socket...
Oct  2 08:04:51 np0005466012 systemd[222973]: Starting Create User's Volatile Files and Directories...
Oct  2 08:04:51 np0005466012 systemd[222973]: Finished Create User's Volatile Files and Directories.
Oct  2 08:04:51 np0005466012 systemd[222973]: Listening on D-Bus User Message Bus Socket.
Oct  2 08:04:51 np0005466012 systemd[222973]: Reached target Sockets.
Oct  2 08:04:51 np0005466012 systemd[222973]: Reached target Basic System.
Oct  2 08:04:51 np0005466012 systemd[222973]: Reached target Main User Target.
Oct  2 08:04:51 np0005466012 systemd[222973]: Startup finished in 129ms.
Oct  2 08:04:51 np0005466012 systemd[1]: Started User Manager for UID 42436.
Oct  2 08:04:51 np0005466012 systemd[1]: Started Session 31 of User nova.
Oct  2 08:04:51 np0005466012 systemd[1]: session-31.scope: Deactivated successfully.
Oct  2 08:04:51 np0005466012 systemd-logind[827]: Session 31 logged out. Waiting for processes to exit.
Oct  2 08:04:51 np0005466012 systemd-logind[827]: Removed session 31.
Oct  2 08:04:53 np0005466012 podman[222990]: 2025-10-02 12:04:53.17196607 +0000 UTC m=+0.089722000 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  2 08:04:53 np0005466012 podman[223016]: 2025-10-02 12:04:53.24711176 +0000 UTC m=+0.051148148 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:04:54 np0005466012 nova_compute[192063]: 2025-10-02 12:04:54.476 2 DEBUG nova.compute.manager [req-ca4e401e-6ec4-4ce8-afa7-9264a81c658d req-29f25c08-9cdc-4b01-b50e-20d2d6f513d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-vif-unplugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:54 np0005466012 nova_compute[192063]: 2025-10-02 12:04:54.476 2 DEBUG oslo_concurrency.lockutils [req-ca4e401e-6ec4-4ce8-afa7-9264a81c658d req-29f25c08-9cdc-4b01-b50e-20d2d6f513d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:54 np0005466012 nova_compute[192063]: 2025-10-02 12:04:54.476 2 DEBUG oslo_concurrency.lockutils [req-ca4e401e-6ec4-4ce8-afa7-9264a81c658d req-29f25c08-9cdc-4b01-b50e-20d2d6f513d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:54 np0005466012 nova_compute[192063]: 2025-10-02 12:04:54.476 2 DEBUG oslo_concurrency.lockutils [req-ca4e401e-6ec4-4ce8-afa7-9264a81c658d req-29f25c08-9cdc-4b01-b50e-20d2d6f513d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:54 np0005466012 nova_compute[192063]: 2025-10-02 12:04:54.476 2 DEBUG nova.compute.manager [req-ca4e401e-6ec4-4ce8-afa7-9264a81c658d req-29f25c08-9cdc-4b01-b50e-20d2d6f513d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] No waiting events found dispatching network-vif-unplugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:04:54 np0005466012 nova_compute[192063]: 2025-10-02 12:04:54.477 2 DEBUG nova.compute.manager [req-ca4e401e-6ec4-4ce8-afa7-9264a81c658d req-29f25c08-9cdc-4b01-b50e-20d2d6f513d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-vif-unplugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:04:54 np0005466012 nova_compute[192063]: 2025-10-02 12:04:54.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:55 np0005466012 nova_compute[192063]: 2025-10-02 12:04:55.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.300 2 INFO nova.compute.manager [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Took 7.75 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.301 2 DEBUG nova.compute.manager [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.323 2 DEBUG nova.compute.manager [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1_i3q1h5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a6bb5263-b0c7-4282-8e02-3503fd778e6f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(ec7a55ce-f280-419e-8aeb-e7005e351364),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.347 2 DEBUG nova.objects.instance [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lazy-loading 'migration_context' on Instance uuid a6bb5263-b0c7-4282-8e02-3503fd778e6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.348 2 DEBUG nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.349 2 DEBUG nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.350 2 DEBUG nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.368 2 DEBUG nova.virt.libvirt.vif [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:04:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1169459074',display_name='tempest-LiveMigrationTest-server-1169459074',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1169459074',id=28,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:04:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f7cb78d24d1a4511a59ced45ccc4a1c7',ramdisk_id='',reservation_id='r-b554y970',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1666170212',owner_user_name='tempest-LiveMigrationTest-1666170212-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:04:37Z,user_data=None,user_id='5f75195e56504673bd403ce69cbc28ca',uuid=a6bb5263-b0c7-4282-8e02-3503fd778e6f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.368 2 DEBUG nova.network.os_vif_util [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Converting VIF {"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.369 2 DEBUG nova.network.os_vif_util [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:b4:ad,bridge_name='br-int',has_traffic_filtering=True,id=5e772b33-6577-4ba1-b187-e4779ef49ed6,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5e772b33-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.369 2 DEBUG nova.virt.libvirt.migration [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Updating guest XML with vif config: <interface type="ethernet">
Oct  2 08:04:56 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:09:b4:ad"/>
Oct  2 08:04:56 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:04:56 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:04:56 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:04:56 np0005466012 nova_compute[192063]:  <target dev="tap5e772b33-65"/>
Oct  2 08:04:56 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:04:56 np0005466012 nova_compute[192063]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.370 2 DEBUG nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.586 2 DEBUG nova.compute.manager [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.587 2 DEBUG oslo_concurrency.lockutils [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.588 2 DEBUG oslo_concurrency.lockutils [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.589 2 DEBUG oslo_concurrency.lockutils [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.589 2 DEBUG nova.compute.manager [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] No waiting events found dispatching network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.590 2 WARNING nova.compute.manager [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received unexpected event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.591 2 DEBUG nova.compute.manager [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-changed-5e772b33-6577-4ba1-b187-e4779ef49ed6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.591 2 DEBUG nova.compute.manager [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Refreshing instance network info cache due to event network-changed-5e772b33-6577-4ba1-b187-e4779ef49ed6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.592 2 DEBUG oslo_concurrency.lockutils [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-a6bb5263-b0c7-4282-8e02-3503fd778e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.593 2 DEBUG oslo_concurrency.lockutils [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-a6bb5263-b0c7-4282-8e02-3503fd778e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.593 2 DEBUG nova.network.neutron [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Refreshing network info cache for port 5e772b33-6577-4ba1-b187-e4779ef49ed6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.852 2 DEBUG nova.virt.libvirt.migration [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.852 2 INFO nova.virt.libvirt.migration [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Oct  2 08:04:56 np0005466012 nova_compute[192063]: 2025-10-02 12:04:56.992 2 INFO nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Oct  2 08:04:57 np0005466012 podman[223040]: 2025-10-02 12:04:57.132626046 +0000 UTC m=+0.049170596 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:04:57 np0005466012 nova_compute[192063]: 2025-10-02 12:04:57.495 2 DEBUG nova.virt.libvirt.migration [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:04:57 np0005466012 nova_compute[192063]: 2025-10-02 12:04:57.495 2 DEBUG nova.virt.libvirt.migration [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.000 2 DEBUG nova.virt.libvirt.migration [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.001 2 DEBUG nova.virt.libvirt.migration [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.222 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406698.2220736, a6bb5263-b0c7-4282-8e02-3503fd778e6f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.222 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.248 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.252 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.296 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Oct  2 08:04:58 np0005466012 kernel: tap5e772b33-65 (unregistering): left promiscuous mode
Oct  2 08:04:58 np0005466012 NetworkManager[51207]: <info>  [1759406698.3491] device (tap5e772b33-65): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:04:58 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:58Z|00084|binding|INFO|Releasing lport 5e772b33-6577-4ba1-b187-e4779ef49ed6 from this chassis (sb_readonly=0)
Oct  2 08:04:58 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:58Z|00085|binding|INFO|Setting lport 5e772b33-6577-4ba1-b187-e4779ef49ed6 down in Southbound
Oct  2 08:04:58 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:58Z|00086|binding|INFO|Releasing lport 6ca21a4d-cad8-4eff-bb5a-78e0705eaf1f from this chassis (sb_readonly=0)
Oct  2 08:04:58 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:58Z|00087|binding|INFO|Setting lport 6ca21a4d-cad8-4eff-bb5a-78e0705eaf1f down in Southbound
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:58 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:58Z|00088|binding|INFO|Removing iface tap5e772b33-65 ovn-installed in OVS
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:58 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:58Z|00089|binding|INFO|Releasing lport 2f7dc774-b718-4d9e-9655-fbc5ffa141e8 from this chassis (sb_readonly=0)
Oct  2 08:04:58 np0005466012 ovn_controller[94284]: 2025-10-02T12:04:58Z|00090|binding|INFO|Releasing lport efb84ecd-545f-42a1-ad69-585d2998efac from this chassis (sb_readonly=0)
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.375 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:47:36 19.80.0.218'], port_security=['fa:16:3e:90:47:36 19.80.0.218'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['5e772b33-6577-4ba1-b187-e4779ef49ed6'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-890758388', 'neutron:cidrs': '19.80.0.218/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c91b95f8-b43d-450e-bf75-7418a7f0c3c0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-890758388', 'neutron:project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'a459d514-aab4-4030-9850-e066abdeaccc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=a6f25993-8956-421b-9333-413f987f6201, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6ca21a4d-cad8-4eff-bb5a-78e0705eaf1f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.377 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:b4:ad 10.100.0.5'], port_security=['fa:16:3e:09:b4:ad 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c9f3d658-5c7a-4803-9bbb-01adfb7e88ca'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-983948384', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a6bb5263-b0c7-4282-8e02-3503fd778e6f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-664b6526-6df1-4024-9bab-37218e6c18bd', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-983948384', 'neutron:project_id': 'f7cb78d24d1a4511a59ced45ccc4a1c7', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a459d514-aab4-4030-9850-e066abdeaccc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eddfb51e-1095-4b3d-a2dc-f2557cf13b11, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=5e772b33-6577-4ba1-b187-e4779ef49ed6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.378 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 6ca21a4d-cad8-4eff-bb5a-78e0705eaf1f in datapath c91b95f8-b43d-450e-bf75-7418a7f0c3c0 unbound from our chassis#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.379 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c91b95f8-b43d-450e-bf75-7418a7f0c3c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.380 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[99bb04ae-6459-4ebe-8eff-9d7205908bd5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.380 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0 namespace which is not needed anymore#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:58 np0005466012 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Oct  2 08:04:58 np0005466012 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001c.scope: Consumed 13.302s CPU time.
Oct  2 08:04:58 np0005466012 systemd-machined[152114]: Machine qemu-14-instance-0000001c terminated.
Oct  2 08:04:58 np0005466012 neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0[222791]: [NOTICE]   (222816) : haproxy version is 2.8.14-c23fe91
Oct  2 08:04:58 np0005466012 neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0[222791]: [NOTICE]   (222816) : path to executable is /usr/sbin/haproxy
Oct  2 08:04:58 np0005466012 neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0[222791]: [ALERT]    (222816) : Current worker (222818) exited with code 143 (Terminated)
Oct  2 08:04:58 np0005466012 neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0[222791]: [WARNING]  (222816) : All workers exited. Exiting... (0)
Oct  2 08:04:58 np0005466012 systemd[1]: libpod-6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4.scope: Deactivated successfully.
Oct  2 08:04:58 np0005466012 podman[223090]: 2025-10-02 12:04:58.571233751 +0000 UTC m=+0.100330304 container died 6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.590 2 DEBUG nova.virt.libvirt.guest [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.590 2 INFO nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Migration operation has completed#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.590 2 INFO nova.compute.manager [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] _post_live_migration() is started..#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.593 2 DEBUG nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.593 2 DEBUG nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.594 2 DEBUG nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Oct  2 08:04:58 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4-userdata-shm.mount: Deactivated successfully.
Oct  2 08:04:58 np0005466012 systemd[1]: var-lib-containers-storage-overlay-8b282380c1f7168be294bb4afeb2317d64ac58d8f5013f2ffa5b50ef3177f4f9-merged.mount: Deactivated successfully.
Oct  2 08:04:58 np0005466012 podman[223090]: 2025-10-02 12:04:58.720083492 +0000 UTC m=+0.249180035 container cleanup 6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:04:58 np0005466012 systemd[1]: libpod-conmon-6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4.scope: Deactivated successfully.
Oct  2 08:04:58 np0005466012 podman[223136]: 2025-10-02 12:04:58.78544445 +0000 UTC m=+0.042470277 container remove 6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.790 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b1706e0d-3fd7-4e8e-b6b4-cbded3f7cbf2]: (4, ('Thu Oct  2 12:04:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0 (6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4)\n6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4\nThu Oct  2 12:04:58 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0 (6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4)\n6f87670588b03d99f5925d7fc09a3b975f0276eadeed02fe89421ac6fc3b58e4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.793 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8425647a-d599-4b53-acf0-41f58d5a2a04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.794 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc91b95f8-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:58 np0005466012 kernel: tapc91b95f8-b0: left promiscuous mode
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:58 np0005466012 nova_compute[192063]: 2025-10-02 12:04:58.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.814 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c5542114-8e4d-4bca-a246-3216af984994]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.848 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bcd4ec20-7255-4d56-b258-048642e11906]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.849 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b5d1f8d6-8979-427d-9a19-6dbc05874b9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.863 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3e51017c-99c1-4e53-9455-8040c2c22e5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467034, 'reachable_time': 35273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223157, 'error': None, 'target': 'ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.866 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c91b95f8-b43d-450e-bf75-7418a7f0c3c0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:04:58 np0005466012 systemd[1]: run-netns-ovnmeta\x2dc91b95f8\x2db43d\x2d450e\x2dbf75\x2d7418a7f0c3c0.mount: Deactivated successfully.
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.867 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[e9052f60-e87b-4407-9f62-4f68aa85ed38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.867 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 5e772b33-6577-4ba1-b187-e4779ef49ed6 in datapath 664b6526-6df1-4024-9bab-37218e6c18bd unbound from our chassis#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.868 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 664b6526-6df1-4024-9bab-37218e6c18bd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.869 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8d44dea3-3113-49f5-aefb-64269ee241bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:58.869 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd namespace which is not needed anymore#033[00m
Oct  2 08:04:58 np0005466012 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222887]: [NOTICE]   (222891) : haproxy version is 2.8.14-c23fe91
Oct  2 08:04:58 np0005466012 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222887]: [NOTICE]   (222891) : path to executable is /usr/sbin/haproxy
Oct  2 08:04:58 np0005466012 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222887]: [WARNING]  (222891) : Exiting Master process...
Oct  2 08:04:58 np0005466012 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222887]: [ALERT]    (222891) : Current worker (222893) exited with code 143 (Terminated)
Oct  2 08:04:58 np0005466012 neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd[222887]: [WARNING]  (222891) : All workers exited. Exiting... (0)
Oct  2 08:04:58 np0005466012 systemd[1]: libpod-cc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081.scope: Deactivated successfully.
Oct  2 08:04:58 np0005466012 podman[223176]: 2025-10-02 12:04:58.996296139 +0000 UTC m=+0.042135578 container died cc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:04:59 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081-userdata-shm.mount: Deactivated successfully.
Oct  2 08:04:59 np0005466012 systemd[1]: var-lib-containers-storage-overlay-37fe0fe9f45b32a20b94318ccce048d283ec139cd08eb2a00a6f0a29201b443d-merged.mount: Deactivated successfully.
Oct  2 08:04:59 np0005466012 podman[223176]: 2025-10-02 12:04:59.026923318 +0000 UTC m=+0.072762757 container cleanup cc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:04:59 np0005466012 systemd[1]: libpod-conmon-cc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081.scope: Deactivated successfully.
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.057 2 DEBUG nova.compute.manager [req-fd278f24-939b-4db4-afa3-c7fcf3a6aef4 req-016accac-c3cd-4d68-9b2b-4515819d9305 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-vif-unplugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.057 2 DEBUG oslo_concurrency.lockutils [req-fd278f24-939b-4db4-afa3-c7fcf3a6aef4 req-016accac-c3cd-4d68-9b2b-4515819d9305 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.058 2 DEBUG oslo_concurrency.lockutils [req-fd278f24-939b-4db4-afa3-c7fcf3a6aef4 req-016accac-c3cd-4d68-9b2b-4515819d9305 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.058 2 DEBUG oslo_concurrency.lockutils [req-fd278f24-939b-4db4-afa3-c7fcf3a6aef4 req-016accac-c3cd-4d68-9b2b-4515819d9305 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.058 2 DEBUG nova.compute.manager [req-fd278f24-939b-4db4-afa3-c7fcf3a6aef4 req-016accac-c3cd-4d68-9b2b-4515819d9305 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] No waiting events found dispatching network-vif-unplugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.058 2 DEBUG nova.compute.manager [req-fd278f24-939b-4db4-afa3-c7fcf3a6aef4 req-016accac-c3cd-4d68-9b2b-4515819d9305 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-vif-unplugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:04:59 np0005466012 podman[223204]: 2025-10-02 12:04:59.088366251 +0000 UTC m=+0.040982276 container remove cc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:04:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:59.095 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3dfcbebd-2e53-4f37-be63-ad5040007230]: (4, ('Thu Oct  2 12:04:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd (cc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081)\ncc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081\nThu Oct  2 12:04:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd (cc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081)\ncc48d8bb332fd36fb9fe87e366678ac6ca28f247bb067b039d10e351ac738081\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:59.096 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8df318ba-1311-451a-a813-777dbfdf008d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:59.097 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap664b6526-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:04:59 np0005466012 kernel: tap664b6526-60: left promiscuous mode
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:59.116 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d188ba-a093-4163-877c-a451fbdb4c4f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:59.156 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[62c6ae55-5c60-4f42-a798-1871ea90edef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:59.157 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[025a1db0-eafd-4d85-8f93-338198305281]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:59.173 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[50082075-e01e-4c6c-822d-3e63e8f3afd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467113, 'reachable_time': 25910, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223225, 'error': None, 'target': 'ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:59.175 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-664b6526-6df1-4024-9bab-37218e6c18bd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:04:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:04:59.175 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[d6a80279-56d9-4ab2-aac7-8dd25085a0a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.323 2 DEBUG nova.network.neutron [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Updated VIF entry in instance network info cache for port 5e772b33-6577-4ba1-b187-e4779ef49ed6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.324 2 DEBUG nova.network.neutron [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Updating instance_info_cache with network_info: [{"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.355 2 DEBUG oslo_concurrency.lockutils [req-6d71666d-720e-4d87-a2e7-ee1e568768d5 req-978093d6-1a81-4e9a-9254-08c8571dc7d6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-a6bb5263-b0c7-4282-8e02-3503fd778e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:04:59 np0005466012 systemd[1]: run-netns-ovnmeta\x2d664b6526\x2d6df1\x2d4024\x2d9bab\x2d37218e6c18bd.mount: Deactivated successfully.
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.991 2 DEBUG nova.compute.manager [req-56dc7f29-2bc1-47e3-9d54-1a97469cbd03 req-e96644b2-f50b-4744-8dd3-2024e3619377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-vif-unplugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.992 2 DEBUG oslo_concurrency.lockutils [req-56dc7f29-2bc1-47e3-9d54-1a97469cbd03 req-e96644b2-f50b-4744-8dd3-2024e3619377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.992 2 DEBUG oslo_concurrency.lockutils [req-56dc7f29-2bc1-47e3-9d54-1a97469cbd03 req-e96644b2-f50b-4744-8dd3-2024e3619377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.992 2 DEBUG oslo_concurrency.lockutils [req-56dc7f29-2bc1-47e3-9d54-1a97469cbd03 req-e96644b2-f50b-4744-8dd3-2024e3619377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.992 2 DEBUG nova.compute.manager [req-56dc7f29-2bc1-47e3-9d54-1a97469cbd03 req-e96644b2-f50b-4744-8dd3-2024e3619377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] No waiting events found dispatching network-vif-unplugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:04:59 np0005466012 nova_compute[192063]: 2025-10-02 12:04:59.992 2 DEBUG nova.compute.manager [req-56dc7f29-2bc1-47e3-9d54-1a97469cbd03 req-e96644b2-f50b-4744-8dd3-2024e3619377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-vif-unplugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:05:00 np0005466012 podman[223226]: 2025-10-02 12:05:00.143553182 +0000 UTC m=+0.063297344 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.490 2 DEBUG nova.network.neutron [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Activated binding for port 5e772b33-6577-4ba1-b187-e4779ef49ed6 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.491 2 DEBUG nova.compute.manager [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.491 2 DEBUG nova.virt.libvirt.vif [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:04:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1169459074',display_name='tempest-LiveMigrationTest-server-1169459074',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1169459074',id=28,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:04:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f7cb78d24d1a4511a59ced45ccc4a1c7',ramdisk_id='',reservation_id='r-b554y970',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1666170212',owner_user_name='tempest-LiveMigrationTest-1666170212-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:04:41Z,user_data=None,user_id='5f75195e56504673bd403ce69cbc28ca',uuid=a6bb5263-b0c7-4282-8e02-3503fd778e6f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.492 2 DEBUG nova.network.os_vif_util [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Converting VIF {"id": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "address": "fa:16:3e:09:b4:ad", "network": {"id": "664b6526-6df1-4024-9bab-37218e6c18bd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2017832683-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7cb78d24d1a4511a59ced45ccc4a1c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e772b33-65", "ovs_interfaceid": "5e772b33-6577-4ba1-b187-e4779ef49ed6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.492 2 DEBUG nova.network.os_vif_util [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:b4:ad,bridge_name='br-int',has_traffic_filtering=True,id=5e772b33-6577-4ba1-b187-e4779ef49ed6,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5e772b33-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.493 2 DEBUG os_vif [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:b4:ad,bridge_name='br-int',has_traffic_filtering=True,id=5e772b33-6577-4ba1-b187-e4779ef49ed6,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5e772b33-65') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.494 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e772b33-65, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.499 2 INFO os_vif [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:b4:ad,bridge_name='br-int',has_traffic_filtering=True,id=5e772b33-6577-4ba1-b187-e4779ef49ed6,network=Network(664b6526-6df1-4024-9bab-37218e6c18bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5e772b33-65')#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.499 2 DEBUG oslo_concurrency.lockutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.500 2 DEBUG oslo_concurrency.lockutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.500 2 DEBUG oslo_concurrency.lockutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.500 2 DEBUG nova.compute.manager [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.500 2 INFO nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Deleting instance files /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f_del#033[00m
Oct  2 08:05:00 np0005466012 nova_compute[192063]: 2025-10-02 12:05:00.501 2 INFO nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Deletion of /var/lib/nova/instances/a6bb5263-b0c7-4282-8e02-3503fd778e6f_del complete#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.421 2 DEBUG nova.compute.manager [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.421 2 DEBUG oslo_concurrency.lockutils [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.422 2 DEBUG oslo_concurrency.lockutils [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.422 2 DEBUG oslo_concurrency.lockutils [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.422 2 DEBUG nova.compute.manager [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] No waiting events found dispatching network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.423 2 WARNING nova.compute.manager [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received unexpected event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.423 2 DEBUG nova.compute.manager [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.423 2 DEBUG oslo_concurrency.lockutils [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.424 2 DEBUG oslo_concurrency.lockutils [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.424 2 DEBUG oslo_concurrency.lockutils [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.424 2 DEBUG nova.compute.manager [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] No waiting events found dispatching network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.425 2 WARNING nova.compute.manager [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received unexpected event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.425 2 DEBUG nova.compute.manager [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.425 2 DEBUG oslo_concurrency.lockutils [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.426 2 DEBUG oslo_concurrency.lockutils [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.426 2 DEBUG oslo_concurrency.lockutils [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.426 2 DEBUG nova.compute.manager [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] No waiting events found dispatching network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:05:01 np0005466012 nova_compute[192063]: 2025-10-02 12:05:01.427 2 WARNING nova.compute.manager [req-69009e74-67f3-43fe-924d-b1c4692f0efc req-dbaea575-db9f-4b8f-aeb6-0f463887834e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received unexpected event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:05:01 np0005466012 systemd[1]: Stopping User Manager for UID 42436...
Oct  2 08:05:01 np0005466012 systemd[222973]: Activating special unit Exit the Session...
Oct  2 08:05:01 np0005466012 systemd[222973]: Stopped target Main User Target.
Oct  2 08:05:01 np0005466012 systemd[222973]: Stopped target Basic System.
Oct  2 08:05:01 np0005466012 systemd[222973]: Stopped target Paths.
Oct  2 08:05:01 np0005466012 systemd[222973]: Stopped target Sockets.
Oct  2 08:05:01 np0005466012 systemd[222973]: Stopped target Timers.
Oct  2 08:05:01 np0005466012 systemd[222973]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:05:01 np0005466012 systemd[222973]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 08:05:01 np0005466012 systemd[222973]: Closed D-Bus User Message Bus Socket.
Oct  2 08:05:01 np0005466012 systemd[222973]: Stopped Create User's Volatile Files and Directories.
Oct  2 08:05:01 np0005466012 systemd[222973]: Removed slice User Application Slice.
Oct  2 08:05:01 np0005466012 systemd[222973]: Reached target Shutdown.
Oct  2 08:05:01 np0005466012 systemd[222973]: Finished Exit the Session.
Oct  2 08:05:01 np0005466012 systemd[222973]: Reached target Exit the Session.
Oct  2 08:05:01 np0005466012 systemd[1]: user@42436.service: Deactivated successfully.
Oct  2 08:05:01 np0005466012 systemd[1]: Stopped User Manager for UID 42436.
Oct  2 08:05:01 np0005466012 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  2 08:05:01 np0005466012 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  2 08:05:01 np0005466012 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  2 08:05:01 np0005466012 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  2 08:05:01 np0005466012 systemd[1]: Removed slice User Slice of UID 42436.
Oct  2 08:05:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:02.116 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:02.118 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:02.118 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:03 np0005466012 nova_compute[192063]: 2025-10-02 12:05:03.705 2 DEBUG nova.compute.manager [req-1c2f7be2-f2ac-4b9b-8023-e6e3e1e5ad1f req-87a339ef-eac1-414f-95f4-f3afacaf9beb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:03 np0005466012 nova_compute[192063]: 2025-10-02 12:05:03.705 2 DEBUG oslo_concurrency.lockutils [req-1c2f7be2-f2ac-4b9b-8023-e6e3e1e5ad1f req-87a339ef-eac1-414f-95f4-f3afacaf9beb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:03 np0005466012 nova_compute[192063]: 2025-10-02 12:05:03.705 2 DEBUG oslo_concurrency.lockutils [req-1c2f7be2-f2ac-4b9b-8023-e6e3e1e5ad1f req-87a339ef-eac1-414f-95f4-f3afacaf9beb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:03 np0005466012 nova_compute[192063]: 2025-10-02 12:05:03.706 2 DEBUG oslo_concurrency.lockutils [req-1c2f7be2-f2ac-4b9b-8023-e6e3e1e5ad1f req-87a339ef-eac1-414f-95f4-f3afacaf9beb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:03 np0005466012 nova_compute[192063]: 2025-10-02 12:05:03.706 2 DEBUG nova.compute.manager [req-1c2f7be2-f2ac-4b9b-8023-e6e3e1e5ad1f req-87a339ef-eac1-414f-95f4-f3afacaf9beb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] No waiting events found dispatching network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:05:03 np0005466012 nova_compute[192063]: 2025-10-02 12:05:03.706 2 WARNING nova.compute.manager [req-1c2f7be2-f2ac-4b9b-8023-e6e3e1e5ad1f req-87a339ef-eac1-414f-95f4-f3afacaf9beb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Received unexpected event network-vif-plugged-5e772b33-6577-4ba1-b187-e4779ef49ed6 for instance with vm_state active and task_state migrating.#033[00m
Oct  2 08:05:04 np0005466012 nova_compute[192063]: 2025-10-02 12:05:04.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:05 np0005466012 nova_compute[192063]: 2025-10-02 12:05:05.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:06 np0005466012 podman[223248]: 2025-10-02 12:05:06.143658103 +0000 UTC m=+0.060440168 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.465 2 DEBUG oslo_concurrency.lockutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquiring lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.467 2 DEBUG oslo_concurrency.lockutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.467 2 DEBUG oslo_concurrency.lockutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "a6bb5263-b0c7-4282-8e02-3503fd778e6f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.506 2 DEBUG oslo_concurrency.lockutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.506 2 DEBUG oslo_concurrency.lockutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.507 2 DEBUG oslo_concurrency.lockutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.507 2 DEBUG nova.compute.resource_tracker [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.695 2 WARNING nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.697 2 DEBUG nova.compute.resource_tracker [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5755MB free_disk=73.46621322631836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.697 2 DEBUG oslo_concurrency.lockutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.697 2 DEBUG oslo_concurrency.lockutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.750 2 DEBUG nova.compute.resource_tracker [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Migration for instance a6bb5263-b0c7-4282-8e02-3503fd778e6f refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.798 2 DEBUG nova.compute.resource_tracker [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.964 2 DEBUG nova.compute.resource_tracker [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Migration ec7a55ce-f280-419e-8aeb-e7005e351364 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.964 2 DEBUG nova.compute.resource_tracker [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:05:06 np0005466012 nova_compute[192063]: 2025-10-02 12:05:06.965 2 DEBUG nova.compute.resource_tracker [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:05:07 np0005466012 nova_compute[192063]: 2025-10-02 12:05:07.080 2 DEBUG nova.compute.provider_tree [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:05:08 np0005466012 podman[223269]: 2025-10-02 12:05:08.16636815 +0000 UTC m=+0.075983943 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, version=9.6, architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Oct  2 08:05:09 np0005466012 nova_compute[192063]: 2025-10-02 12:05:09.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:09 np0005466012 nova_compute[192063]: 2025-10-02 12:05:09.746 2 DEBUG nova.scheduler.client.report [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:05:09 np0005466012 nova_compute[192063]: 2025-10-02 12:05:09.789 2 DEBUG nova.compute.resource_tracker [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:05:09 np0005466012 nova_compute[192063]: 2025-10-02 12:05:09.790 2 DEBUG oslo_concurrency.lockutils [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:09 np0005466012 nova_compute[192063]: 2025-10-02 12:05:09.825 2 INFO nova.compute.manager [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Oct  2 08:05:09 np0005466012 nova_compute[192063]: 2025-10-02 12:05:09.842 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:09 np0005466012 nova_compute[192063]: 2025-10-02 12:05:09.843 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:09 np0005466012 nova_compute[192063]: 2025-10-02 12:05:09.843 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:10 np0005466012 podman[223291]: 2025-10-02 12:05:10.14137055 +0000 UTC m=+0.058337891 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251001, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:05:10 np0005466012 podman[223292]: 2025-10-02 12:05:10.171470985 +0000 UTC m=+0.072024977 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:05:10 np0005466012 nova_compute[192063]: 2025-10-02 12:05:10.211 2 INFO nova.scheduler.client.report [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] Deleted allocation for migration ec7a55ce-f280-419e-8aeb-e7005e351364#033[00m
Oct  2 08:05:10 np0005466012 nova_compute[192063]: 2025-10-02 12:05:10.212 2 DEBUG nova.virt.libvirt.driver [None req-3cd747b3-fa97-4d66-8972-f85e23e31784 ba082148882647d48482e0be9e06c582 9ec1cf31f7044579b02c7077aa7d0973 - - default default] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Oct  2 08:05:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:10.522 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:05:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:10.523 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:05:10 np0005466012 nova_compute[192063]: 2025-10-02 12:05:10.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:10 np0005466012 nova_compute[192063]: 2025-10-02 12:05:10.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:10 np0005466012 nova_compute[192063]: 2025-10-02 12:05:10.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:10 np0005466012 nova_compute[192063]: 2025-10-02 12:05:10.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:11 np0005466012 nova_compute[192063]: 2025-10-02 12:05:11.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:11 np0005466012 nova_compute[192063]: 2025-10-02 12:05:11.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:05:12 np0005466012 nova_compute[192063]: 2025-10-02 12:05:12.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:12 np0005466012 nova_compute[192063]: 2025-10-02 12:05:12.842 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:12 np0005466012 nova_compute[192063]: 2025-10-02 12:05:12.843 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:12 np0005466012 nova_compute[192063]: 2025-10-02 12:05:12.844 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:12 np0005466012 nova_compute[192063]: 2025-10-02 12:05:12.844 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:05:13 np0005466012 nova_compute[192063]: 2025-10-02 12:05:13.020 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:05:13 np0005466012 nova_compute[192063]: 2025-10-02 12:05:13.021 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5771MB free_disk=73.46621322631836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:05:13 np0005466012 nova_compute[192063]: 2025-10-02 12:05:13.022 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:13 np0005466012 nova_compute[192063]: 2025-10-02 12:05:13.022 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:13 np0005466012 nova_compute[192063]: 2025-10-02 12:05:13.073 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:05:13 np0005466012 nova_compute[192063]: 2025-10-02 12:05:13.074 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:05:13 np0005466012 nova_compute[192063]: 2025-10-02 12:05:13.137 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:05:13 np0005466012 nova_compute[192063]: 2025-10-02 12:05:13.153 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:05:13 np0005466012 nova_compute[192063]: 2025-10-02 12:05:13.154 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:05:13 np0005466012 nova_compute[192063]: 2025-10-02 12:05:13.154 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:13 np0005466012 nova_compute[192063]: 2025-10-02 12:05:13.590 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406698.5891767, a6bb5263-b0c7-4282-8e02-3503fd778e6f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:05:13 np0005466012 nova_compute[192063]: 2025-10-02 12:05:13.591 2 INFO nova.compute.manager [-] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:05:13 np0005466012 nova_compute[192063]: 2025-10-02 12:05:13.621 2 DEBUG nova.compute.manager [None req-dbed5098-fd49-497b-9c61-c8a721ca8b47 - - - - - -] [instance: a6bb5263-b0c7-4282-8e02-3503fd778e6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:05:14 np0005466012 nova_compute[192063]: 2025-10-02 12:05:14.153 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:14 np0005466012 nova_compute[192063]: 2025-10-02 12:05:14.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:15 np0005466012 nova_compute[192063]: 2025-10-02 12:05:15.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:15 np0005466012 nova_compute[192063]: 2025-10-02 12:05:15.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:05:15 np0005466012 nova_compute[192063]: 2025-10-02 12:05:15.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:05:15 np0005466012 nova_compute[192063]: 2025-10-02 12:05:15.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:05:15 np0005466012 nova_compute[192063]: 2025-10-02 12:05:15.854 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:05:16.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:05:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:19.524 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:19 np0005466012 nova_compute[192063]: 2025-10-02 12:05:19.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:20 np0005466012 nova_compute[192063]: 2025-10-02 12:05:20.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:24 np0005466012 podman[223336]: 2025-10-02 12:05:24.149494882 +0000 UTC m=+0.062231625 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:05:24 np0005466012 podman[223337]: 2025-10-02 12:05:24.202532751 +0000 UTC m=+0.114523524 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:05:24 np0005466012 nova_compute[192063]: 2025-10-02 12:05:24.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:25 np0005466012 nova_compute[192063]: 2025-10-02 12:05:25.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:28 np0005466012 podman[223386]: 2025-10-02 12:05:28.140593093 +0000 UTC m=+0.060018857 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:05:29 np0005466012 nova_compute[192063]: 2025-10-02 12:05:29.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:30 np0005466012 nova_compute[192063]: 2025-10-02 12:05:30.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:31 np0005466012 podman[223406]: 2025-10-02 12:05:31.158079555 +0000 UTC m=+0.068804022 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:05:34 np0005466012 nova_compute[192063]: 2025-10-02 12:05:34.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:35 np0005466012 nova_compute[192063]: 2025-10-02 12:05:35.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:37 np0005466012 podman[223426]: 2025-10-02 12:05:37.154832865 +0000 UTC m=+0.063065619 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd)
Oct  2 08:05:39 np0005466012 podman[223447]: 2025-10-02 12:05:39.167753708 +0000 UTC m=+0.075865310 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, config_id=edpm, maintainer=Red Hat, Inc.)
Oct  2 08:05:39 np0005466012 nova_compute[192063]: 2025-10-02 12:05:39.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:40 np0005466012 nova_compute[192063]: 2025-10-02 12:05:40.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:41 np0005466012 podman[223469]: 2025-10-02 12:05:41.160027521 +0000 UTC m=+0.072155210 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:05:41 np0005466012 podman[223468]: 2025-10-02 12:05:41.165592049 +0000 UTC m=+0.077719718 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid)
Oct  2 08:05:41 np0005466012 nova_compute[192063]: 2025-10-02 12:05:41.740 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:41 np0005466012 nova_compute[192063]: 2025-10-02 12:05:41.741 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:41 np0005466012 nova_compute[192063]: 2025-10-02 12:05:41.770 2 DEBUG nova.compute.manager [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:05:41 np0005466012 nova_compute[192063]: 2025-10-02 12:05:41.949 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:41 np0005466012 nova_compute[192063]: 2025-10-02 12:05:41.950 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:41 np0005466012 nova_compute[192063]: 2025-10-02 12:05:41.961 2 DEBUG nova.virt.hardware [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:05:41 np0005466012 nova_compute[192063]: 2025-10-02 12:05:41.961 2 INFO nova.compute.claims [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.255 2 DEBUG nova.compute.provider_tree [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.284 2 DEBUG nova.scheduler.client.report [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.335 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.385s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.335 2 DEBUG nova.compute.manager [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.454 2 DEBUG nova.compute.manager [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.455 2 DEBUG nova.network.neutron [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.487 2 INFO nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.518 2 DEBUG nova.compute.manager [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.758 2 DEBUG nova.compute.manager [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.760 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.760 2 INFO nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Creating image(s)#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.761 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "/var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.762 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "/var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.763 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "/var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.787 2 DEBUG oslo_concurrency.processutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.848 2 DEBUG oslo_concurrency.processutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.849 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.850 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.870 2 DEBUG oslo_concurrency.processutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.938 2 DEBUG oslo_concurrency.processutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.939 2 DEBUG oslo_concurrency.processutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.978 2 DEBUG oslo_concurrency.processutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.979 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:42 np0005466012 nova_compute[192063]: 2025-10-02 12:05:42.979 2 DEBUG oslo_concurrency.processutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:43 np0005466012 nova_compute[192063]: 2025-10-02 12:05:43.065 2 DEBUG nova.policy [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cdc7ec1af4d8410db0b4592293549806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '87e7399e976c40bc84f320ed0d052ac6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:05:43 np0005466012 nova_compute[192063]: 2025-10-02 12:05:43.068 2 DEBUG oslo_concurrency.processutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:43 np0005466012 nova_compute[192063]: 2025-10-02 12:05:43.069 2 DEBUG nova.virt.disk.api [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Checking if we can resize image /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:05:43 np0005466012 nova_compute[192063]: 2025-10-02 12:05:43.069 2 DEBUG oslo_concurrency.processutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:43 np0005466012 nova_compute[192063]: 2025-10-02 12:05:43.138 2 DEBUG oslo_concurrency.processutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:43 np0005466012 nova_compute[192063]: 2025-10-02 12:05:43.139 2 DEBUG nova.virt.disk.api [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Cannot resize image /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:05:43 np0005466012 nova_compute[192063]: 2025-10-02 12:05:43.139 2 DEBUG nova.objects.instance [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lazy-loading 'migration_context' on Instance uuid 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:05:43 np0005466012 nova_compute[192063]: 2025-10-02 12:05:43.167 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:05:43 np0005466012 nova_compute[192063]: 2025-10-02 12:05:43.168 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Ensure instance console log exists: /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:05:43 np0005466012 nova_compute[192063]: 2025-10-02 12:05:43.168 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:43 np0005466012 nova_compute[192063]: 2025-10-02 12:05:43.169 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:43 np0005466012 nova_compute[192063]: 2025-10-02 12:05:43.169 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:44 np0005466012 nova_compute[192063]: 2025-10-02 12:05:44.222 2 DEBUG nova.network.neutron [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Successfully created port: dad72ac3-1117-4c05-9056-6371bf8ee649 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:05:44 np0005466012 nova_compute[192063]: 2025-10-02 12:05:44.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:45 np0005466012 nova_compute[192063]: 2025-10-02 12:05:45.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:47 np0005466012 nova_compute[192063]: 2025-10-02 12:05:47.810 2 DEBUG nova.network.neutron [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Successfully updated port: dad72ac3-1117-4c05-9056-6371bf8ee649 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:05:47 np0005466012 nova_compute[192063]: 2025-10-02 12:05:47.842 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "refresh_cache-3ef96c40-f041-4cfe-a0e0-26fe8e44f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:05:47 np0005466012 nova_compute[192063]: 2025-10-02 12:05:47.842 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquired lock "refresh_cache-3ef96c40-f041-4cfe-a0e0-26fe8e44f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:05:47 np0005466012 nova_compute[192063]: 2025-10-02 12:05:47.843 2 DEBUG nova.network.neutron [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:05:48 np0005466012 nova_compute[192063]: 2025-10-02 12:05:48.147 2 DEBUG nova.compute.manager [req-b241fe25-5b11-4107-9d61-0251cd3072eb req-1d21509a-528b-4f00-be9f-46f1d5536463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Received event network-changed-dad72ac3-1117-4c05-9056-6371bf8ee649 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:48 np0005466012 nova_compute[192063]: 2025-10-02 12:05:48.147 2 DEBUG nova.compute.manager [req-b241fe25-5b11-4107-9d61-0251cd3072eb req-1d21509a-528b-4f00-be9f-46f1d5536463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Refreshing instance network info cache due to event network-changed-dad72ac3-1117-4c05-9056-6371bf8ee649. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:05:48 np0005466012 nova_compute[192063]: 2025-10-02 12:05:48.148 2 DEBUG oslo_concurrency.lockutils [req-b241fe25-5b11-4107-9d61-0251cd3072eb req-1d21509a-528b-4f00-be9f-46f1d5536463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-3ef96c40-f041-4cfe-a0e0-26fe8e44f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:05:48 np0005466012 nova_compute[192063]: 2025-10-02 12:05:48.372 2 DEBUG nova.network.neutron [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:05:49 np0005466012 nova_compute[192063]: 2025-10-02 12:05:49.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:50 np0005466012 nova_compute[192063]: 2025-10-02 12:05:50.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.398 2 DEBUG nova.network.neutron [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Updating instance_info_cache with network_info: [{"id": "dad72ac3-1117-4c05-9056-6371bf8ee649", "address": "fa:16:3e:1d:c4:de", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdad72ac3-11", "ovs_interfaceid": "dad72ac3-1117-4c05-9056-6371bf8ee649", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.463 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Releasing lock "refresh_cache-3ef96c40-f041-4cfe-a0e0-26fe8e44f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.464 2 DEBUG nova.compute.manager [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Instance network_info: |[{"id": "dad72ac3-1117-4c05-9056-6371bf8ee649", "address": "fa:16:3e:1d:c4:de", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdad72ac3-11", "ovs_interfaceid": "dad72ac3-1117-4c05-9056-6371bf8ee649", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.464 2 DEBUG oslo_concurrency.lockutils [req-b241fe25-5b11-4107-9d61-0251cd3072eb req-1d21509a-528b-4f00-be9f-46f1d5536463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-3ef96c40-f041-4cfe-a0e0-26fe8e44f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.465 2 DEBUG nova.network.neutron [req-b241fe25-5b11-4107-9d61-0251cd3072eb req-1d21509a-528b-4f00-be9f-46f1d5536463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Refreshing network info cache for port dad72ac3-1117-4c05-9056-6371bf8ee649 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.471 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Start _get_guest_xml network_info=[{"id": "dad72ac3-1117-4c05-9056-6371bf8ee649", "address": "fa:16:3e:1d:c4:de", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdad72ac3-11", "ovs_interfaceid": "dad72ac3-1117-4c05-9056-6371bf8ee649", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.478 2 WARNING nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.492 2 DEBUG nova.virt.libvirt.host [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.493 2 DEBUG nova.virt.libvirt.host [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.497 2 DEBUG nova.virt.libvirt.host [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.498 2 DEBUG nova.virt.libvirt.host [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.500 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.500 2 DEBUG nova.virt.hardware [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.501 2 DEBUG nova.virt.hardware [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.501 2 DEBUG nova.virt.hardware [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.502 2 DEBUG nova.virt.hardware [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.502 2 DEBUG nova.virt.hardware [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.503 2 DEBUG nova.virt.hardware [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.503 2 DEBUG nova.virt.hardware [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.504 2 DEBUG nova.virt.hardware [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.505 2 DEBUG nova.virt.hardware [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.505 2 DEBUG nova.virt.hardware [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.506 2 DEBUG nova.virt.hardware [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.513 2 DEBUG nova.virt.libvirt.vif [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:05:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1612577643',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1612577643',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1612577643',id=30,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='87e7399e976c40bc84f320ed0d052ac6',ramdisk_id='',reservation_id='r-ekqyotg8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-507683469',owner_user
_name='tempest-ImagesOneServerNegativeTestJSON-507683469-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:05:42Z,user_data=None,user_id='cdc7ec1af4d8410db0b4592293549806',uuid=3ef96c40-f041-4cfe-a0e0-26fe8e44f5af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dad72ac3-1117-4c05-9056-6371bf8ee649", "address": "fa:16:3e:1d:c4:de", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdad72ac3-11", "ovs_interfaceid": "dad72ac3-1117-4c05-9056-6371bf8ee649", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.513 2 DEBUG nova.network.os_vif_util [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converting VIF {"id": "dad72ac3-1117-4c05-9056-6371bf8ee649", "address": "fa:16:3e:1d:c4:de", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdad72ac3-11", "ovs_interfaceid": "dad72ac3-1117-4c05-9056-6371bf8ee649", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.514 2 DEBUG nova.network.os_vif_util [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:c4:de,bridge_name='br-int',has_traffic_filtering=True,id=dad72ac3-1117-4c05-9056-6371bf8ee649,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdad72ac3-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.516 2 DEBUG nova.objects.instance [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.552 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  <uuid>3ef96c40-f041-4cfe-a0e0-26fe8e44f5af</uuid>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  <name>instance-0000001e</name>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-1612577643</nova:name>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:05:51</nova:creationTime>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:05:51 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:        <nova:user uuid="cdc7ec1af4d8410db0b4592293549806">tempest-ImagesOneServerNegativeTestJSON-507683469-project-member</nova:user>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:        <nova:project uuid="87e7399e976c40bc84f320ed0d052ac6">tempest-ImagesOneServerNegativeTestJSON-507683469</nova:project>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:        <nova:port uuid="dad72ac3-1117-4c05-9056-6371bf8ee649">
Oct  2 08:05:51 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <entry name="serial">3ef96c40-f041-4cfe-a0e0-26fe8e44f5af</entry>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <entry name="uuid">3ef96c40-f041-4cfe-a0e0-26fe8e44f5af</entry>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk.config"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:1d:c4:de"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <target dev="tapdad72ac3-11"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/console.log" append="off"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:05:51 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:05:51 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:05:51 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:05:51 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.554 2 DEBUG nova.compute.manager [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Preparing to wait for external event network-vif-plugged-dad72ac3-1117-4c05-9056-6371bf8ee649 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.554 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.554 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.555 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.555 2 DEBUG nova.virt.libvirt.vif [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:05:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1612577643',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1612577643',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1612577643',id=30,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='87e7399e976c40bc84f320ed0d052ac6',ramdisk_id='',reservation_id='r-ekqyotg8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-507683469',
owner_user_name='tempest-ImagesOneServerNegativeTestJSON-507683469-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:05:42Z,user_data=None,user_id='cdc7ec1af4d8410db0b4592293549806',uuid=3ef96c40-f041-4cfe-a0e0-26fe8e44f5af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dad72ac3-1117-4c05-9056-6371bf8ee649", "address": "fa:16:3e:1d:c4:de", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdad72ac3-11", "ovs_interfaceid": "dad72ac3-1117-4c05-9056-6371bf8ee649", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.555 2 DEBUG nova.network.os_vif_util [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converting VIF {"id": "dad72ac3-1117-4c05-9056-6371bf8ee649", "address": "fa:16:3e:1d:c4:de", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdad72ac3-11", "ovs_interfaceid": "dad72ac3-1117-4c05-9056-6371bf8ee649", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.556 2 DEBUG nova.network.os_vif_util [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:c4:de,bridge_name='br-int',has_traffic_filtering=True,id=dad72ac3-1117-4c05-9056-6371bf8ee649,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdad72ac3-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.556 2 DEBUG os_vif [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:c4:de,bridge_name='br-int',has_traffic_filtering=True,id=dad72ac3-1117-4c05-9056-6371bf8ee649,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdad72ac3-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.557 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.558 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.560 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdad72ac3-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.560 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdad72ac3-11, col_values=(('external_ids', {'iface-id': 'dad72ac3-1117-4c05-9056-6371bf8ee649', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1d:c4:de', 'vm-uuid': '3ef96c40-f041-4cfe-a0e0-26fe8e44f5af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:51 np0005466012 NetworkManager[51207]: <info>  [1759406751.6031] manager: (tapdad72ac3-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.609 2 INFO os_vif [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:c4:de,bridge_name='br-int',has_traffic_filtering=True,id=dad72ac3-1117-4c05-9056-6371bf8ee649,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdad72ac3-11')#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.883 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.884 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.885 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] No VIF found with MAC fa:16:3e:1d:c4:de, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:05:51 np0005466012 nova_compute[192063]: 2025-10-02 12:05:51.886 2 INFO nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Using config drive#033[00m
Oct  2 08:05:52 np0005466012 nova_compute[192063]: 2025-10-02 12:05:52.597 2 INFO nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Creating config drive at /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk.config#033[00m
Oct  2 08:05:52 np0005466012 nova_compute[192063]: 2025-10-02 12:05:52.602 2 DEBUG oslo_concurrency.processutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt0rpv3lo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:05:52 np0005466012 nova_compute[192063]: 2025-10-02 12:05:52.726 2 DEBUG oslo_concurrency.processutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt0rpv3lo" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:05:52 np0005466012 kernel: tapdad72ac3-11: entered promiscuous mode
Oct  2 08:05:52 np0005466012 NetworkManager[51207]: <info>  [1759406752.7808] manager: (tapdad72ac3-11): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Oct  2 08:05:52 np0005466012 systemd-udevd[223543]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:05:52 np0005466012 nova_compute[192063]: 2025-10-02 12:05:52.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:52 np0005466012 nova_compute[192063]: 2025-10-02 12:05:52.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:52 np0005466012 ovn_controller[94284]: 2025-10-02T12:05:52Z|00091|binding|INFO|Claiming lport dad72ac3-1117-4c05-9056-6371bf8ee649 for this chassis.
Oct  2 08:05:52 np0005466012 ovn_controller[94284]: 2025-10-02T12:05:52Z|00092|binding|INFO|dad72ac3-1117-4c05-9056-6371bf8ee649: Claiming fa:16:3e:1d:c4:de 10.100.0.8
Oct  2 08:05:52 np0005466012 NetworkManager[51207]: <info>  [1759406752.8361] device (tapdad72ac3-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:05:52 np0005466012 NetworkManager[51207]: <info>  [1759406752.8374] device (tapdad72ac3-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.844 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:c4:de 10.100.0.8'], port_security=['fa:16:3e:1d:c4:de 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3ef96c40-f041-4cfe-a0e0-26fe8e44f5af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-982b406e-0686-44db-8945-39e0f57e4781', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87e7399e976c40bc84f320ed0d052ac6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '410de2f3-62e2-482c-a480-7655c2811e48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f46c1e1f-04ef-471b-85c6-c4415ad3e6bb, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=dad72ac3-1117-4c05-9056-6371bf8ee649) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.847 103246 INFO neutron.agent.ovn.metadata.agent [-] Port dad72ac3-1117-4c05-9056-6371bf8ee649 in datapath 982b406e-0686-44db-8945-39e0f57e4781 bound to our chassis#033[00m
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.849 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 982b406e-0686-44db-8945-39e0f57e4781#033[00m
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.862 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[26d625b1-2e08-4bb3-b255-ca1eab7e8709]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.864 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap982b406e-01 in ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.866 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap982b406e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.866 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb575f2-4abf-45e0-b730-7487aabfca0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.867 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6c5e10c4-8b0a-42af-ad6f-9033ca2684e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:52 np0005466012 systemd-machined[152114]: New machine qemu-15-instance-0000001e.
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.876 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[95b38345-e4b7-4a69-8740-7d3defd2db8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:52 np0005466012 ovn_controller[94284]: 2025-10-02T12:05:52Z|00093|binding|INFO|Setting lport dad72ac3-1117-4c05-9056-6371bf8ee649 ovn-installed in OVS
Oct  2 08:05:52 np0005466012 ovn_controller[94284]: 2025-10-02T12:05:52Z|00094|binding|INFO|Setting lport dad72ac3-1117-4c05-9056-6371bf8ee649 up in Southbound
Oct  2 08:05:52 np0005466012 nova_compute[192063]: 2025-10-02 12:05:52.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:52 np0005466012 systemd[1]: Started Virtual Machine qemu-15-instance-0000001e.
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.898 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d1f8da4b-c360-45ee-9642-136b1e01860d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.926 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[64724593-aeff-4974-96b9-7398b4f10ec4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.932 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[79748287-0de5-4570-9259-9aef1a65a67b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:52 np0005466012 systemd-udevd[223545]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:05:52 np0005466012 NetworkManager[51207]: <info>  [1759406752.9345] manager: (tap982b406e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/48)
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.964 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[5a4734b1-7519-47f8-aab3-093259ac0ec2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.971 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[9720478f-8b67-4b54-a14d-2fdffa55ca22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:52 np0005466012 NetworkManager[51207]: <info>  [1759406752.9904] device (tap982b406e-00): carrier: link connected
Oct  2 08:05:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:52.994 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[bf541123-cea7-4c6f-94f0-2698d08a697d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:53.008 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cf764ed5-75bb-46aa-8f0c-92dfe1a9eae8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap982b406e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:e2:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474660, 'reachable_time': 35653, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223579, 'error': None, 'target': 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:53.023 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f3199b2a-dc8d-436d-a643-84b67da679aa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe92:e21f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474660, 'tstamp': 474660}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223580, 'error': None, 'target': 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:53.039 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[99d35d0f-9428-43f8-8e2d-fa1e45c2d172]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap982b406e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:e2:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474660, 'reachable_time': 35653, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223581, 'error': None, 'target': 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:53.066 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[22ecd027-595c-40f6-a830-125cfc8f96d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:53.143 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0050c1cc-c4b4-4bba-ab39-64f53dd6d318]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:53.145 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap982b406e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:53.145 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:53.146 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap982b406e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:53 np0005466012 NetworkManager[51207]: <info>  [1759406753.1498] manager: (tap982b406e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:53 np0005466012 kernel: tap982b406e-00: entered promiscuous mode
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:53.156 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap982b406e-00, col_values=(('external_ids', {'iface-id': 'e7c44940-f7d8-482e-a63d-10c99ba9de76'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:05:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:05:53Z|00095|binding|INFO|Releasing lport e7c44940-f7d8-482e-a63d-10c99ba9de76 from this chassis (sb_readonly=0)
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:53.186 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/982b406e-0686-44db-8945-39e0f57e4781.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/982b406e-0686-44db-8945-39e0f57e4781.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:53.187 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[75098bd9-c9ce-4170-81be-a0e21a5e309e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:53.187 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-982b406e-0686-44db-8945-39e0f57e4781
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/982b406e-0686-44db-8945-39e0f57e4781.pid.haproxy
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 982b406e-0686-44db-8945-39e0f57e4781
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:05:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:05:53.188 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'env', 'PROCESS_TAG=haproxy-982b406e-0686-44db-8945-39e0f57e4781', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/982b406e-0686-44db-8945-39e0f57e4781.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:05:53 np0005466012 podman[223620]: 2025-10-02 12:05:53.534078329 +0000 UTC m=+0.045792875 container create 2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:05:53 np0005466012 systemd[1]: Started libpod-conmon-2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021.scope.
Oct  2 08:05:53 np0005466012 podman[223620]: 2025-10-02 12:05:53.511478275 +0000 UTC m=+0.023192841 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:05:53 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:05:53 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44993b66c02431e768a736fb1d55e7e6e772948e3a2e1bd8320a33b4f991f28d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:05:53 np0005466012 podman[223620]: 2025-10-02 12:05:53.628151805 +0000 UTC m=+0.139866371 container init 2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:05:53 np0005466012 podman[223620]: 2025-10-02 12:05:53.635441721 +0000 UTC m=+0.147156277 container start 2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:05:53 np0005466012 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223635]: [NOTICE]   (223639) : New worker (223641) forked
Oct  2 08:05:53 np0005466012 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223635]: [NOTICE]   (223639) : Loading success.
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.733 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406753.7334607, 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.734 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] VM Started (Lifecycle Event)#033[00m
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.784 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.789 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406753.7357855, 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.790 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.887 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.890 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.906 2 DEBUG nova.network.neutron [req-b241fe25-5b11-4107-9d61-0251cd3072eb req-1d21509a-528b-4f00-be9f-46f1d5536463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Updated VIF entry in instance network info cache for port dad72ac3-1117-4c05-9056-6371bf8ee649. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.907 2 DEBUG nova.network.neutron [req-b241fe25-5b11-4107-9d61-0251cd3072eb req-1d21509a-528b-4f00-be9f-46f1d5536463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Updating instance_info_cache with network_info: [{"id": "dad72ac3-1117-4c05-9056-6371bf8ee649", "address": "fa:16:3e:1d:c4:de", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdad72ac3-11", "ovs_interfaceid": "dad72ac3-1117-4c05-9056-6371bf8ee649", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.935 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:05:53 np0005466012 nova_compute[192063]: 2025-10-02 12:05:53.939 2 DEBUG oslo_concurrency.lockutils [req-b241fe25-5b11-4107-9d61-0251cd3072eb req-1d21509a-528b-4f00-be9f-46f1d5536463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-3ef96c40-f041-4cfe-a0e0-26fe8e44f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:05:54 np0005466012 nova_compute[192063]: 2025-10-02 12:05:54.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:55 np0005466012 podman[223650]: 2025-10-02 12:05:55.18901704 +0000 UTC m=+0.092836293 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:05:55 np0005466012 podman[223651]: 2025-10-02 12:05:55.214619685 +0000 UTC m=+0.114994736 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:05:56 np0005466012 nova_compute[192063]: 2025-10-02 12:05:56.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.153 2 DEBUG nova.compute.manager [req-20d152f4-1204-4061-be7d-3346449b7440 req-425a9daf-c335-43d5-9216-345efbcbd2ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Received event network-vif-plugged-dad72ac3-1117-4c05-9056-6371bf8ee649 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.153 2 DEBUG oslo_concurrency.lockutils [req-20d152f4-1204-4061-be7d-3346449b7440 req-425a9daf-c335-43d5-9216-345efbcbd2ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.154 2 DEBUG oslo_concurrency.lockutils [req-20d152f4-1204-4061-be7d-3346449b7440 req-425a9daf-c335-43d5-9216-345efbcbd2ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.154 2 DEBUG oslo_concurrency.lockutils [req-20d152f4-1204-4061-be7d-3346449b7440 req-425a9daf-c335-43d5-9216-345efbcbd2ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.155 2 DEBUG nova.compute.manager [req-20d152f4-1204-4061-be7d-3346449b7440 req-425a9daf-c335-43d5-9216-345efbcbd2ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Processing event network-vif-plugged-dad72ac3-1117-4c05-9056-6371bf8ee649 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.156 2 DEBUG nova.compute.manager [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.160 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406758.1598873, 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.160 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.163 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.168 2 INFO nova.virt.libvirt.driver [-] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Instance spawned successfully.#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.169 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.188 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.195 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.199 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.200 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.200 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.201 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.201 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.202 2 DEBUG nova.virt.libvirt.driver [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.230 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.301 2 INFO nova.compute.manager [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Took 15.54 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.302 2 DEBUG nova.compute.manager [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.408 2 INFO nova.compute.manager [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Took 16.54 seconds to build instance.#033[00m
Oct  2 08:05:58 np0005466012 nova_compute[192063]: 2025-10-02 12:05:58.436 2 DEBUG oslo_concurrency.lockutils [None req-ac1d354e-4e57-46d6-8e15-b2b078905a4b cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:05:59 np0005466012 podman[223699]: 2025-10-02 12:05:59.203459135 +0000 UTC m=+0.112934181 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct  2 08:05:59 np0005466012 nova_compute[192063]: 2025-10-02 12:05:59.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:00 np0005466012 nova_compute[192063]: 2025-10-02 12:06:00.897 2 DEBUG nova.compute.manager [req-c8f29d8f-7c79-44bb-afae-ae2d6ff560bc req-2d19e25e-08fd-43a7-acbe-d142868961ed 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Received event network-vif-plugged-dad72ac3-1117-4c05-9056-6371bf8ee649 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:00 np0005466012 nova_compute[192063]: 2025-10-02 12:06:00.898 2 DEBUG oslo_concurrency.lockutils [req-c8f29d8f-7c79-44bb-afae-ae2d6ff560bc req-2d19e25e-08fd-43a7-acbe-d142868961ed 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:00 np0005466012 nova_compute[192063]: 2025-10-02 12:06:00.899 2 DEBUG oslo_concurrency.lockutils [req-c8f29d8f-7c79-44bb-afae-ae2d6ff560bc req-2d19e25e-08fd-43a7-acbe-d142868961ed 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:00 np0005466012 nova_compute[192063]: 2025-10-02 12:06:00.899 2 DEBUG oslo_concurrency.lockutils [req-c8f29d8f-7c79-44bb-afae-ae2d6ff560bc req-2d19e25e-08fd-43a7-acbe-d142868961ed 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:00 np0005466012 nova_compute[192063]: 2025-10-02 12:06:00.899 2 DEBUG nova.compute.manager [req-c8f29d8f-7c79-44bb-afae-ae2d6ff560bc req-2d19e25e-08fd-43a7-acbe-d142868961ed 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] No waiting events found dispatching network-vif-plugged-dad72ac3-1117-4c05-9056-6371bf8ee649 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:00 np0005466012 nova_compute[192063]: 2025-10-02 12:06:00.900 2 WARNING nova.compute.manager [req-c8f29d8f-7c79-44bb-afae-ae2d6ff560bc req-2d19e25e-08fd-43a7-acbe-d142868961ed 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Received unexpected event network-vif-plugged-dad72ac3-1117-4c05-9056-6371bf8ee649 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:06:01 np0005466012 nova_compute[192063]: 2025-10-02 12:06:01.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:01 np0005466012 nova_compute[192063]: 2025-10-02 12:06:01.620 2 DEBUG nova.compute.manager [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:01 np0005466012 nova_compute[192063]: 2025-10-02 12:06:01.694 2 INFO nova.compute.manager [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] instance snapshotting#033[00m
Oct  2 08:06:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:02.117 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:02.118 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:02.118 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:02 np0005466012 podman[223718]: 2025-10-02 12:06:02.153712888 +0000 UTC m=+0.072361415 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:06:02 np0005466012 nova_compute[192063]: 2025-10-02 12:06:02.472 2 INFO nova.virt.libvirt.driver [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Beginning live snapshot process#033[00m
Oct  2 08:06:02 np0005466012 virtqemud[191783]: invalid argument: disk vda does not have an active block job
Oct  2 08:06:02 np0005466012 nova_compute[192063]: 2025-10-02 12:06:02.836 2 DEBUG oslo_concurrency.processutils [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:02 np0005466012 nova_compute[192063]: 2025-10-02 12:06:02.939 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:02 np0005466012 nova_compute[192063]: 2025-10-02 12:06:02.940 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:02 np0005466012 nova_compute[192063]: 2025-10-02 12:06:02.943 2 DEBUG oslo_concurrency.processutils [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk --force-share --output=json -f qcow2" returned: 0 in 0.107s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:02 np0005466012 nova_compute[192063]: 2025-10-02 12:06:02.945 2 DEBUG oslo_concurrency.processutils [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:02 np0005466012 nova_compute[192063]: 2025-10-02 12:06:02.975 2 DEBUG nova.compute.manager [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.023 2 DEBUG oslo_concurrency.processutils [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af/disk --force-share --output=json -f qcow2" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.040 2 DEBUG oslo_concurrency.processutils [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.092 2 DEBUG oslo_concurrency.processutils [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.093 2 DEBUG oslo_concurrency.processutils [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpsq2beig9/a1be2bb3f6d244c48f8a49078c6d9b0f.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.153 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.154 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.165 2 DEBUG nova.virt.hardware [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.165 2 INFO nova.compute.claims [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.279 2 DEBUG nova.scheduler.client.report [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.333 2 DEBUG nova.scheduler.client.report [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.334 2 DEBUG nova.compute.provider_tree [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.338 2 DEBUG oslo_concurrency.processutils [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpsq2beig9/a1be2bb3f6d244c48f8a49078c6d9b0f.delta 1073741824" returned: 0 in 0.246s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.340 2 INFO nova.virt.libvirt.driver [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.358 2 DEBUG nova.scheduler.client.report [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.389 2 DEBUG nova.scheduler.client.report [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.406 2 DEBUG nova.virt.libvirt.guest [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] COPY block job progress, current cursor: 0 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.496 2 DEBUG nova.compute.provider_tree [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.528 2 DEBUG nova.scheduler.client.report [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.572 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.573 2 DEBUG nova.compute.manager [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.661 2 DEBUG nova.compute.manager [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.662 2 DEBUG nova.network.neutron [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.688 2 INFO nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.707 2 DEBUG nova.compute.manager [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.909 2 DEBUG nova.virt.libvirt.guest [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:06:03 np0005466012 nova_compute[192063]: 2025-10-02 12:06:03.912 2 INFO nova.virt.libvirt.driver [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.034 2 DEBUG nova.privsep.utils [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.034 2 DEBUG oslo_concurrency.processutils [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpsq2beig9/a1be2bb3f6d244c48f8a49078c6d9b0f.delta /var/lib/nova/instances/snapshots/tmpsq2beig9/a1be2bb3f6d244c48f8a49078c6d9b0f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.084 2 DEBUG nova.compute.manager [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.085 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.086 2 INFO nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Creating image(s)#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.086 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.087 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.087 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.099 2 DEBUG oslo_concurrency.processutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.168 2 DEBUG oslo_concurrency.processutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.170 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.172 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.196 2 DEBUG oslo_concurrency.processutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.235 2 DEBUG nova.policy [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.249 2 DEBUG oslo_concurrency.processutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.250 2 DEBUG oslo_concurrency.processutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.590 2 DEBUG oslo_concurrency.processutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk 1073741824" returned: 0 in 0.340s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.592 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.593 2 DEBUG oslo_concurrency.processutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.682 2 DEBUG oslo_concurrency.processutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.683 2 DEBUG nova.virt.disk.api [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Checking if we can resize image /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.684 2 DEBUG oslo_concurrency.processutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.735 2 DEBUG oslo_concurrency.processutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.736 2 DEBUG nova.virt.disk.api [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Cannot resize image /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:06:04 np0005466012 nova_compute[192063]: 2025-10-02 12:06:04.736 2 DEBUG nova.objects.instance [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'migration_context' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:05 np0005466012 nova_compute[192063]: 2025-10-02 12:06:05.516 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:06:05 np0005466012 nova_compute[192063]: 2025-10-02 12:06:05.517 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Ensure instance console log exists: /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:06:05 np0005466012 nova_compute[192063]: 2025-10-02 12:06:05.518 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:05 np0005466012 nova_compute[192063]: 2025-10-02 12:06:05.519 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:05 np0005466012 nova_compute[192063]: 2025-10-02 12:06:05.519 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:05 np0005466012 nova_compute[192063]: 2025-10-02 12:06:05.529 2 DEBUG oslo_concurrency.processutils [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpsq2beig9/a1be2bb3f6d244c48f8a49078c6d9b0f.delta /var/lib/nova/instances/snapshots/tmpsq2beig9/a1be2bb3f6d244c48f8a49078c6d9b0f" returned: 0 in 1.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:05 np0005466012 nova_compute[192063]: 2025-10-02 12:06:05.530 2 INFO nova.virt.libvirt.driver [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:06:06 np0005466012 nova_compute[192063]: 2025-10-02 12:06:06.048 2 WARNING nova.compute.manager [None req-0d7243b5-7f22-4e53-97b8-f35a8b1be01e cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Image not found during snapshot: nova.exception.ImageNotFound: Image f77b6579-2012-4716-a1bf-e11ed39fefb3 could not be found.#033[00m
Oct  2 08:06:06 np0005466012 nova_compute[192063]: 2025-10-02 12:06:06.244 2 DEBUG nova.network.neutron [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Successfully created port: 68972cf3-172b-44a5-b096-f87fe9193518 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:06:06 np0005466012 nova_compute[192063]: 2025-10-02 12:06:06.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:08 np0005466012 podman[223779]: 2025-10-02 12:06:08.155611447 +0000 UTC m=+0.073990931 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=multipathd)
Oct  2 08:06:08 np0005466012 nova_compute[192063]: 2025-10-02 12:06:08.676 2 DEBUG nova.network.neutron [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Successfully updated port: 68972cf3-172b-44a5-b096-f87fe9193518 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.738 2 DEBUG nova.compute.manager [req-eba2617c-ec3c-4f06-a8fc-a5fa3003391a req-b9ca506a-05c2-4f69-b893-d4042174a611 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-changed-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.739 2 DEBUG nova.compute.manager [req-eba2617c-ec3c-4f06-a8fc-a5fa3003391a req-b9ca506a-05c2-4f69-b893-d4042174a611 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Refreshing instance network info cache due to event network-changed-68972cf3-172b-44a5-b096-f87fe9193518. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.739 2 DEBUG oslo_concurrency.lockutils [req-eba2617c-ec3c-4f06-a8fc-a5fa3003391a req-b9ca506a-05c2-4f69-b893-d4042174a611 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.739 2 DEBUG oslo_concurrency.lockutils [req-eba2617c-ec3c-4f06-a8fc-a5fa3003391a req-b9ca506a-05c2-4f69-b893-d4042174a611 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.739 2 DEBUG nova.network.neutron [req-eba2617c-ec3c-4f06-a8fc-a5fa3003391a req-b9ca506a-05c2-4f69-b893-d4042174a611 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Refreshing network info cache for port 68972cf3-172b-44a5-b096-f87fe9193518 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.765 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "refresh_cache-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.780 2 DEBUG oslo_concurrency.lockutils [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.781 2 DEBUG oslo_concurrency.lockutils [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.781 2 DEBUG oslo_concurrency.lockutils [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.781 2 DEBUG oslo_concurrency.lockutils [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.782 2 DEBUG oslo_concurrency.lockutils [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.803 2 INFO nova.compute.manager [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Terminating instance#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.819 2 DEBUG nova.compute.manager [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.849 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:09 np0005466012 kernel: tapdad72ac3-11 (unregistering): left promiscuous mode
Oct  2 08:06:09 np0005466012 NetworkManager[51207]: <info>  [1759406769.8706] device (tapdad72ac3-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:06:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:09Z|00096|binding|INFO|Releasing lport dad72ac3-1117-4c05-9056-6371bf8ee649 from this chassis (sb_readonly=0)
Oct  2 08:06:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:09Z|00097|binding|INFO|Setting lport dad72ac3-1117-4c05-9056-6371bf8ee649 down in Southbound
Oct  2 08:06:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:09Z|00098|binding|INFO|Removing iface tapdad72ac3-11 ovn-installed in OVS
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:09.893 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:c4:de 10.100.0.8'], port_security=['fa:16:3e:1d:c4:de 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3ef96c40-f041-4cfe-a0e0-26fe8e44f5af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-982b406e-0686-44db-8945-39e0f57e4781', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87e7399e976c40bc84f320ed0d052ac6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '410de2f3-62e2-482c-a480-7655c2811e48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f46c1e1f-04ef-471b-85c6-c4415ad3e6bb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=dad72ac3-1117-4c05-9056-6371bf8ee649) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:09.895 103246 INFO neutron.agent.ovn.metadata.agent [-] Port dad72ac3-1117-4c05-9056-6371bf8ee649 in datapath 982b406e-0686-44db-8945-39e0f57e4781 unbound from our chassis#033[00m
Oct  2 08:06:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:09.897 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 982b406e-0686-44db-8945-39e0f57e4781, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:06:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:09.898 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2bd3d8ee-9a0f-4b39-81d6-ab51c70dd060]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:09.899 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 namespace which is not needed anymore#033[00m
Oct  2 08:06:09 np0005466012 nova_compute[192063]: 2025-10-02 12:06:09.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:09 np0005466012 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Oct  2 08:06:09 np0005466012 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001e.scope: Consumed 12.359s CPU time.
Oct  2 08:06:09 np0005466012 systemd-machined[152114]: Machine qemu-15-instance-0000001e terminated.
Oct  2 08:06:09 np0005466012 podman[223802]: 2025-10-02 12:06:09.977546153 +0000 UTC m=+0.082152868 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, config_id=edpm, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.049 2 DEBUG nova.network.neutron [req-eba2617c-ec3c-4f06-a8fc-a5fa3003391a req-b9ca506a-05c2-4f69-b893-d4042174a611 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.074 2 INFO nova.virt.libvirt.driver [-] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Instance destroyed successfully.#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.074 2 DEBUG nova.objects.instance [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lazy-loading 'resources' on Instance uuid 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.088 2 DEBUG nova.virt.libvirt.vif [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:05:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1612577643',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1612577643',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1612577643',id=30,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:05:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='87e7399e976c40bc84f320ed0d052ac6',ramdisk_id='',reservation_id='r-ekqyotg8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-507683469',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-507683469-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:06:06Z,user_data=None,user_id='cdc7ec1af4d8410db0b4592293549806',uuid=3ef96c40-f041-4cfe-a0e0-26fe8e44f5af,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dad72ac3-1117-4c05-9056-6371bf8ee649", "address": "fa:16:3e:1d:c4:de", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdad72ac3-11", "ovs_interfaceid": "dad72ac3-1117-4c05-9056-6371bf8ee649", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.089 2 DEBUG nova.network.os_vif_util [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converting VIF {"id": "dad72ac3-1117-4c05-9056-6371bf8ee649", "address": "fa:16:3e:1d:c4:de", "network": {"id": "982b406e-0686-44db-8945-39e0f57e4781", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-993027464-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "87e7399e976c40bc84f320ed0d052ac6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdad72ac3-11", "ovs_interfaceid": "dad72ac3-1117-4c05-9056-6371bf8ee649", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.089 2 DEBUG nova.network.os_vif_util [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:c4:de,bridge_name='br-int',has_traffic_filtering=True,id=dad72ac3-1117-4c05-9056-6371bf8ee649,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdad72ac3-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.090 2 DEBUG os_vif [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:c4:de,bridge_name='br-int',has_traffic_filtering=True,id=dad72ac3-1117-4c05-9056-6371bf8ee649,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdad72ac3-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.092 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdad72ac3-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.098 2 INFO os_vif [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:c4:de,bridge_name='br-int',has_traffic_filtering=True,id=dad72ac3-1117-4c05-9056-6371bf8ee649,network=Network(982b406e-0686-44db-8945-39e0f57e4781),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdad72ac3-11')#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.099 2 INFO nova.virt.libvirt.driver [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Deleting instance files /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af_del#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.099 2 INFO nova.virt.libvirt.driver [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Deletion of /var/lib/nova/instances/3ef96c40-f041-4cfe-a0e0-26fe8e44f5af_del complete#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.182 2 INFO nova.compute.manager [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.183 2 DEBUG oslo.service.loopingcall [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.184 2 DEBUG nova.compute.manager [-] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.184 2 DEBUG nova.network.neutron [-] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:06:10 np0005466012 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223635]: [NOTICE]   (223639) : haproxy version is 2.8.14-c23fe91
Oct  2 08:06:10 np0005466012 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223635]: [NOTICE]   (223639) : path to executable is /usr/sbin/haproxy
Oct  2 08:06:10 np0005466012 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223635]: [WARNING]  (223639) : Exiting Master process...
Oct  2 08:06:10 np0005466012 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223635]: [ALERT]    (223639) : Current worker (223641) exited with code 143 (Terminated)
Oct  2 08:06:10 np0005466012 neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781[223635]: [WARNING]  (223639) : All workers exited. Exiting... (0)
Oct  2 08:06:10 np0005466012 systemd[1]: libpod-2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021.scope: Deactivated successfully.
Oct  2 08:06:10 np0005466012 podman[223846]: 2025-10-02 12:06:10.278908973 +0000 UTC m=+0.272120828 container died 2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:06:10 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021-userdata-shm.mount: Deactivated successfully.
Oct  2 08:06:10 np0005466012 systemd[1]: var-lib-containers-storage-overlay-44993b66c02431e768a736fb1d55e7e6e772948e3a2e1bd8320a33b4f991f28d-merged.mount: Deactivated successfully.
Oct  2 08:06:10 np0005466012 podman[223846]: 2025-10-02 12:06:10.743547081 +0000 UTC m=+0.736758946 container cleanup 2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:06:10 np0005466012 systemd[1]: libpod-conmon-2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021.scope: Deactivated successfully.
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:10 np0005466012 nova_compute[192063]: 2025-10-02 12:06:10.986 2 DEBUG nova.network.neutron [req-eba2617c-ec3c-4f06-a8fc-a5fa3003391a req-b9ca506a-05c2-4f69-b893-d4042174a611 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:11 np0005466012 podman[223894]: 2025-10-02 12:06:11.029854387 +0000 UTC m=+0.253506271 container remove 2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:06:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:11.036 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c143c029-1ae7-4402-b1ea-ad17eb94fe33]: (4, ('Thu Oct  2 12:06:09 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 (2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021)\n2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021\nThu Oct  2 12:06:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 (2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021)\n2df248381b878603946560c1bf5d4b84ed42e4ad14426d78a4876a98eaeed021\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:11.038 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8f83e4b3-2989-476f-a70f-b735c9894e2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:11 np0005466012 kernel: tap982b406e-00: left promiscuous mode
Oct  2 08:06:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:11.039 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap982b406e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:11 np0005466012 nova_compute[192063]: 2025-10-02 12:06:11.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:11 np0005466012 nova_compute[192063]: 2025-10-02 12:06:11.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:11.070 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f273b89c-819a-4089-9e15-7345a7ad67e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:11.098 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[708d6ccd-5954-4106-8ec6-8be5ba25eab9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:11.099 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0421a953-78de-46ee-8875-2a49efb76567]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:11.118 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d0496cc9-b804-4fff-b58b-6e091668a169]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474653, 'reachable_time': 33038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223910, 'error': None, 'target': 'ovnmeta-982b406e-0686-44db-8945-39e0f57e4781', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:11.121 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-982b406e-0686-44db-8945-39e0f57e4781 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:06:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:11.121 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[ab4d9a6a-1ef0-4763-9613-d8fce6a7427f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:11 np0005466012 systemd[1]: run-netns-ovnmeta\x2d982b406e\x2d0686\x2d44db\x2d8945\x2d39e0f57e4781.mount: Deactivated successfully.
Oct  2 08:06:11 np0005466012 nova_compute[192063]: 2025-10-02 12:06:11.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:11 np0005466012 nova_compute[192063]: 2025-10-02 12:06:11.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:12 np0005466012 podman[223911]: 2025-10-02 12:06:12.138558419 +0000 UTC m=+0.057920261 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 08:06:12 np0005466012 podman[223912]: 2025-10-02 12:06:12.142585706 +0000 UTC m=+0.055854875 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:06:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:12.366 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:12.367 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:06:12 np0005466012 nova_compute[192063]: 2025-10-02 12:06:12.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:12.752 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:12.753 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:06:12 np0005466012 nova_compute[192063]: 2025-10-02 12:06:12.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:12 np0005466012 nova_compute[192063]: 2025-10-02 12:06:12.759 2 DEBUG oslo_concurrency.lockutils [req-eba2617c-ec3c-4f06-a8fc-a5fa3003391a req-b9ca506a-05c2-4f69-b893-d4042174a611 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:12 np0005466012 nova_compute[192063]: 2025-10-02 12:06:12.760 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquired lock "refresh_cache-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:12 np0005466012 nova_compute[192063]: 2025-10-02 12:06:12.760 2 DEBUG nova.network.neutron [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:06:12 np0005466012 nova_compute[192063]: 2025-10-02 12:06:12.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:12 np0005466012 nova_compute[192063]: 2025-10-02 12:06:12.850 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.294 2 DEBUG nova.network.neutron [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.516 2 DEBUG nova.network.neutron [-] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.531 2 DEBUG nova.compute.manager [req-7490e173-893b-4f0f-bbbf-f76cf6b3e76c req-c732730d-abe9-4d00-a112-505d786f6588 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Received event network-vif-unplugged-dad72ac3-1117-4c05-9056-6371bf8ee649 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.531 2 DEBUG oslo_concurrency.lockutils [req-7490e173-893b-4f0f-bbbf-f76cf6b3e76c req-c732730d-abe9-4d00-a112-505d786f6588 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.532 2 DEBUG oslo_concurrency.lockutils [req-7490e173-893b-4f0f-bbbf-f76cf6b3e76c req-c732730d-abe9-4d00-a112-505d786f6588 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.532 2 DEBUG oslo_concurrency.lockutils [req-7490e173-893b-4f0f-bbbf-f76cf6b3e76c req-c732730d-abe9-4d00-a112-505d786f6588 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.532 2 DEBUG nova.compute.manager [req-7490e173-893b-4f0f-bbbf-f76cf6b3e76c req-c732730d-abe9-4d00-a112-505d786f6588 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] No waiting events found dispatching network-vif-unplugged-dad72ac3-1117-4c05-9056-6371bf8ee649 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.533 2 DEBUG nova.compute.manager [req-7490e173-893b-4f0f-bbbf-f76cf6b3e76c req-c732730d-abe9-4d00-a112-505d786f6588 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Received event network-vif-unplugged-dad72ac3-1117-4c05-9056-6371bf8ee649 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.549 2 INFO nova.compute.manager [-] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Took 3.36 seconds to deallocate network for instance.#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.641 2 DEBUG oslo_concurrency.lockutils [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.641 2 DEBUG oslo_concurrency.lockutils [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.745 2 DEBUG nova.compute.provider_tree [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.761 2 DEBUG nova.scheduler.client.report [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.789 2 DEBUG oslo_concurrency.lockutils [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.817 2 INFO nova.scheduler.client.report [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Deleted allocations for instance 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.821 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.846 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.847 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.848 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.848 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:06:13 np0005466012 nova_compute[192063]: 2025-10-02 12:06:13.926 2 DEBUG oslo_concurrency.lockutils [None req-5b999ea9-5cf0-443f-bf5a-8928379d24c9 cdc7ec1af4d8410db0b4592293549806 87e7399e976c40bc84f320ed0d052ac6 - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.023 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.024 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5739MB free_disk=73.46610641479492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.024 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.024 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.093 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.093 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.093 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.162 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.176 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.205 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.205 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.639 2 DEBUG nova.network.neutron [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Updating instance_info_cache with network_info: [{"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.662 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Releasing lock "refresh_cache-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.663 2 DEBUG nova.compute.manager [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance network_info: |[{"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.666 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Start _get_guest_xml network_info=[{"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.671 2 WARNING nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.681 2 DEBUG nova.virt.libvirt.host [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.681 2 DEBUG nova.virt.libvirt.host [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.789 2 DEBUG nova.virt.libvirt.host [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.790 2 DEBUG nova.virt.libvirt.host [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.791 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.791 2 DEBUG nova.virt.hardware [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.792 2 DEBUG nova.virt.hardware [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.792 2 DEBUG nova.virt.hardware [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.792 2 DEBUG nova.virt.hardware [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.792 2 DEBUG nova.virt.hardware [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.793 2 DEBUG nova.virt.hardware [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.793 2 DEBUG nova.virt.hardware [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.793 2 DEBUG nova.virt.hardware [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.793 2 DEBUG nova.virt.hardware [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.793 2 DEBUG nova.virt.hardware [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.794 2 DEBUG nova.virt.hardware [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.797 2 DEBUG nova.virt.libvirt.vif [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:06:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1814602736',display_name='tempest-ServersAdminTestJSON-server-1814602736',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1814602736',id=33,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-q2z1ng2k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJSON-1782354
187-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:06:03Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.797 2 DEBUG nova.network.os_vif_util [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.797 2 DEBUG nova.network.os_vif_util [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.798 2 DEBUG nova.objects.instance [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.813 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  <uuid>4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80</uuid>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  <name>instance-00000021</name>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServersAdminTestJSON-server-1814602736</nova:name>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:06:14</nova:creationTime>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:06:14 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:        <nova:user uuid="9258efa4511c4bb3813eca27b75b1008">tempest-ServersAdminTestJSON-1782354187-project-member</nova:user>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:        <nova:project uuid="db3f04a20fd740c1af3139196dc928d2">tempest-ServersAdminTestJSON-1782354187</nova:project>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:        <nova:port uuid="68972cf3-172b-44a5-b096-f87fe9193518">
Oct  2 08:06:14 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <entry name="serial">4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80</entry>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <entry name="uuid">4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80</entry>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.config"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:da:05:83"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <target dev="tap68972cf3-17"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/console.log" append="off"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:06:14 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:06:14 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:06:14 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:06:14 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.814 2 DEBUG nova.compute.manager [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Preparing to wait for external event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.814 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.815 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.815 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.815 2 DEBUG nova.virt.libvirt.vif [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:06:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1814602736',display_name='tempest-ServersAdminTestJSON-server-1814602736',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1814602736',id=33,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-q2z1ng2k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJS
ON-1782354187-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:06:03Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.816 2 DEBUG nova.network.os_vif_util [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.816 2 DEBUG nova.network.os_vif_util [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.816 2 DEBUG os_vif [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.817 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.817 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.820 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap68972cf3-17, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.821 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap68972cf3-17, col_values=(('external_ids', {'iface-id': '68972cf3-172b-44a5-b096-f87fe9193518', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:05:83', 'vm-uuid': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:14 np0005466012 NetworkManager[51207]: <info>  [1759406774.8229] manager: (tap68972cf3-17): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:14 np0005466012 nova_compute[192063]: 2025-10-02 12:06:14.829 2 INFO os_vif [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17')#033[00m
Oct  2 08:06:15 np0005466012 nova_compute[192063]: 2025-10-02 12:06:15.018 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:06:15 np0005466012 nova_compute[192063]: 2025-10-02 12:06:15.018 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:06:15 np0005466012 nova_compute[192063]: 2025-10-02 12:06:15.018 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No VIF found with MAC fa:16:3e:da:05:83, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:06:15 np0005466012 nova_compute[192063]: 2025-10-02 12:06:15.019 2 INFO nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Using config drive#033[00m
Oct  2 08:06:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:15.756 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.210 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.470 2 INFO nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Creating config drive at /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.config#033[00m
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.475 2 DEBUG oslo_concurrency.processutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpywkmrz3c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.620 2 DEBUG oslo_concurrency.processutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpywkmrz3c" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:16 np0005466012 kernel: tap68972cf3-17: entered promiscuous mode
Oct  2 08:06:16 np0005466012 NetworkManager[51207]: <info>  [1759406776.6941] manager: (tap68972cf3-17): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Oct  2 08:06:16 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:16Z|00099|binding|INFO|Claiming lport 68972cf3-172b-44a5-b096-f87fe9193518 for this chassis.
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:16 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:16Z|00100|binding|INFO|68972cf3-172b-44a5-b096-f87fe9193518: Claiming fa:16:3e:da:05:83 10.100.0.7
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.712 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:05:83 10.100.0.7'], port_security=['fa:16:3e:da:05:83 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db3f04a20fd740c1af3139196dc928d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c69e6497-c2d4-4cc0-a1d9-2c5055cc5d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dc739b2-072d-4dd4-b9d2-9724145d12f5, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=68972cf3-172b-44a5-b096-f87fe9193518) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.712 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 68972cf3-172b-44a5-b096-f87fe9193518 in datapath 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e bound to our chassis#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.714 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e#033[00m
Oct  2 08:06:16 np0005466012 systemd-udevd[223978]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:06:16 np0005466012 systemd-machined[152114]: New machine qemu-16-instance-00000021.
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.724 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2b02be70-3abc-443d-87f3-b43e74ed2d3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.725 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap66b5a7c3-f1 in ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.727 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap66b5a7c3-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.727 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e74f368c-872a-4115-a7c1-759b44df194e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.729 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4387ca4f-773b-43f5-806f-76dd1a4a1d1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 NetworkManager[51207]: <info>  [1759406776.7318] device (tap68972cf3-17): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:06:16 np0005466012 NetworkManager[51207]: <info>  [1759406776.7328] device (tap68972cf3-17): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.740 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[75015e6f-1f8f-40d9-8e8c-1746bfba3f57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 systemd[1]: Started Virtual Machine qemu-16-instance-00000021.
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:16 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:16Z|00101|binding|INFO|Setting lport 68972cf3-172b-44a5-b096-f87fe9193518 ovn-installed in OVS
Oct  2 08:06:16 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:16Z|00102|binding|INFO|Setting lport 68972cf3-172b-44a5-b096-f87fe9193518 up in Southbound
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.764 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[485e8103-b378-4457-81fd-d0f8905a9e32]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.790 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[091fd3fc-49ad-47b3-8b18-8355eeff5f26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.795 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[721ea20d-f24f-4aa5-9310-f980bf50370c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 systemd-udevd[223984]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:06:16 np0005466012 NetworkManager[51207]: <info>  [1759406776.7968] manager: (tap66b5a7c3-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.833 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[255aa7e8-d563-401b-b1a0-06b236c93261]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.836 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[357e164d-2500-4b95-b5c4-f8e387364202]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 NetworkManager[51207]: <info>  [1759406776.8596] device (tap66b5a7c3-f0): carrier: link connected
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.868 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[1a742728-a454-47a4-b983-9641ffd1d379]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.887 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[de5b0329-1afe-4d6f-8e05-177ab8c3bd31]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66b5a7c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:7b:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477047, 'reachable_time': 40640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224014, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.903 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[36c12883-1668-430c-bd87-a34ba39d2676]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:7b77'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477047, 'tstamp': 477047}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224015, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.922 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[55258579-3012-4789-bd1b-1c713e5b0810]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66b5a7c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:7b:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477047, 'reachable_time': 40640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224016, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:16.957 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2fbf05d6-4668-4b08-9ba9-9b6c7a6aa81a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.983 2 DEBUG nova.compute.manager [req-bfaf753c-3762-47a7-aded-38f9aa61bcda req-00280bd6-6ab3-41c0-ada1-f6a8133a2377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Received event network-vif-plugged-dad72ac3-1117-4c05-9056-6371bf8ee649 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.983 2 DEBUG oslo_concurrency.lockutils [req-bfaf753c-3762-47a7-aded-38f9aa61bcda req-00280bd6-6ab3-41c0-ada1-f6a8133a2377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.983 2 DEBUG oslo_concurrency.lockutils [req-bfaf753c-3762-47a7-aded-38f9aa61bcda req-00280bd6-6ab3-41c0-ada1-f6a8133a2377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.984 2 DEBUG oslo_concurrency.lockutils [req-bfaf753c-3762-47a7-aded-38f9aa61bcda req-00280bd6-6ab3-41c0-ada1-f6a8133a2377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "3ef96c40-f041-4cfe-a0e0-26fe8e44f5af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.984 2 DEBUG nova.compute.manager [req-bfaf753c-3762-47a7-aded-38f9aa61bcda req-00280bd6-6ab3-41c0-ada1-f6a8133a2377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] No waiting events found dispatching network-vif-plugged-dad72ac3-1117-4c05-9056-6371bf8ee649 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:16 np0005466012 nova_compute[192063]: 2025-10-02 12:06:16.984 2 WARNING nova.compute.manager [req-bfaf753c-3762-47a7-aded-38f9aa61bcda req-00280bd6-6ab3-41c0-ada1-f6a8133a2377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Received unexpected event network-vif-plugged-dad72ac3-1117-4c05-9056-6371bf8ee649 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:17.027 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9dcd9088-106f-4672-b951-bee40bd2d6ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:17.030 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66b5a7c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:17.030 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:17.030 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66b5a7c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:17 np0005466012 kernel: tap66b5a7c3-f0: entered promiscuous mode
Oct  2 08:06:17 np0005466012 NetworkManager[51207]: <info>  [1759406777.0339] manager: (tap66b5a7c3-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:17.036 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66b5a7c3-f0, col_values=(('external_ids', {'iface-id': 'a0163170-212d-4aba-9028-3d5fb4d45c5b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:17 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:17Z|00103|binding|INFO|Releasing lport a0163170-212d-4aba-9028-3d5fb4d45c5b from this chassis (sb_readonly=0)
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:17.038 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/66b5a7c3-fe3e-42b0-aea6-19534bca6e0e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/66b5a7c3-fe3e-42b0-aea6-19534bca6e0e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:17.039 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1ffda679-4c73-4fed-bea4-0fe9fb1d39f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:17.040 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/66b5a7c3-fe3e-42b0-aea6-19534bca6e0e.pid.haproxy
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:06:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:17.040 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'env', 'PROCESS_TAG=haproxy-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/66b5a7c3-fe3e-42b0-aea6-19534bca6e0e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.180 2 DEBUG nova.compute.manager [req-46c148e5-98c9-469f-8d9c-d8ce816f9a70 req-8f4a7469-813f-453a-a639-3790fb2c6aef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.181 2 DEBUG oslo_concurrency.lockutils [req-46c148e5-98c9-469f-8d9c-d8ce816f9a70 req-8f4a7469-813f-453a-a639-3790fb2c6aef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.181 2 DEBUG oslo_concurrency.lockutils [req-46c148e5-98c9-469f-8d9c-d8ce816f9a70 req-8f4a7469-813f-453a-a639-3790fb2c6aef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.181 2 DEBUG oslo_concurrency.lockutils [req-46c148e5-98c9-469f-8d9c-d8ce816f9a70 req-8f4a7469-813f-453a-a639-3790fb2c6aef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.181 2 DEBUG nova.compute.manager [req-46c148e5-98c9-469f-8d9c-d8ce816f9a70 req-8f4a7469-813f-453a-a639-3790fb2c6aef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Processing event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.257 2 DEBUG nova.compute.manager [req-2afde44d-6350-46a8-b33e-2ff0998d1d69 req-1ad65517-a2ea-48cf-8f81-7997f14e6275 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Received event network-vif-deleted-dad72ac3-1117-4c05-9056-6371bf8ee649 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:17 np0005466012 podman[224056]: 2025-10-02 12:06:17.427012648 +0000 UTC m=+0.075199482 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.652 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406777.6524198, 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.653 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] VM Started (Lifecycle Event)#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.655 2 DEBUG nova.compute.manager [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.659 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.662 2 INFO nova.virt.libvirt.driver [-] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance spawned successfully.#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.662 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:06:17 np0005466012 podman[224056]: 2025-10-02 12:06:17.675133514 +0000 UTC m=+0.323320318 container create a6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.687 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.690 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.696 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.696 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.697 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.697 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.697 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.697 2 DEBUG nova.virt.libvirt.driver [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.770 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.770 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406777.6527116, 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.771 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.814 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.818 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406777.658381, 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.819 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:06:17 np0005466012 systemd[1]: Started libpod-conmon-a6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0.scope.
Oct  2 08:06:17 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:06:17 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f47385ad72212b6b2eab994461dcb053e7b499e47fd6c36349c32c4eaca8e7a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.903 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.903 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.913 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.915 2 INFO nova.compute.manager [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Took 13.83 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.915 2 DEBUG nova.compute.manager [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.919 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:06:17 np0005466012 nova_compute[192063]: 2025-10-02 12:06:17.963 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:06:17 np0005466012 podman[224056]: 2025-10-02 12:06:17.975092726 +0000 UTC m=+0.623279580 container init a6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:06:17 np0005466012 podman[224056]: 2025-10-02 12:06:17.982372192 +0000 UTC m=+0.630559016 container start a6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:06:18 np0005466012 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224072]: [NOTICE]   (224076) : New worker (224078) forked
Oct  2 08:06:18 np0005466012 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224072]: [NOTICE]   (224076) : Loading success.
Oct  2 08:06:18 np0005466012 nova_compute[192063]: 2025-10-02 12:06:18.035 2 INFO nova.compute.manager [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Took 14.96 seconds to build instance.#033[00m
Oct  2 08:06:18 np0005466012 nova_compute[192063]: 2025-10-02 12:06:18.073 2 DEBUG oslo_concurrency.lockutils [None req-7e59694d-29fa-453b-8515-0ba97b54c392 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:19 np0005466012 nova_compute[192063]: 2025-10-02 12:06:19.315 2 DEBUG nova.compute.manager [req-72741d72-034f-401d-836b-b72079cf3029 req-4dd45b2d-e932-4191-8b1f-596cbbe3020b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:19 np0005466012 nova_compute[192063]: 2025-10-02 12:06:19.315 2 DEBUG oslo_concurrency.lockutils [req-72741d72-034f-401d-836b-b72079cf3029 req-4dd45b2d-e932-4191-8b1f-596cbbe3020b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:19 np0005466012 nova_compute[192063]: 2025-10-02 12:06:19.316 2 DEBUG oslo_concurrency.lockutils [req-72741d72-034f-401d-836b-b72079cf3029 req-4dd45b2d-e932-4191-8b1f-596cbbe3020b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:19 np0005466012 nova_compute[192063]: 2025-10-02 12:06:19.316 2 DEBUG oslo_concurrency.lockutils [req-72741d72-034f-401d-836b-b72079cf3029 req-4dd45b2d-e932-4191-8b1f-596cbbe3020b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:19 np0005466012 nova_compute[192063]: 2025-10-02 12:06:19.316 2 DEBUG nova.compute.manager [req-72741d72-034f-401d-836b-b72079cf3029 req-4dd45b2d-e932-4191-8b1f-596cbbe3020b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] No waiting events found dispatching network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:19 np0005466012 nova_compute[192063]: 2025-10-02 12:06:19.316 2 WARNING nova.compute.manager [req-72741d72-034f-401d-836b-b72079cf3029 req-4dd45b2d-e932-4191-8b1f-596cbbe3020b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received unexpected event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:06:19 np0005466012 nova_compute[192063]: 2025-10-02 12:06:19.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:19 np0005466012 nova_compute[192063]: 2025-10-02 12:06:19.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:20.368 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:24 np0005466012 nova_compute[192063]: 2025-10-02 12:06:24.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:24 np0005466012 nova_compute[192063]: 2025-10-02 12:06:24.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:25 np0005466012 nova_compute[192063]: 2025-10-02 12:06:25.073 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406770.072149, 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:25 np0005466012 nova_compute[192063]: 2025-10-02 12:06:25.074 2 INFO nova.compute.manager [-] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:06:25 np0005466012 nova_compute[192063]: 2025-10-02 12:06:25.112 2 DEBUG nova.compute.manager [None req-6f693266-3c76-49e3-afbf-64d4fda0e198 - - - - - -] [instance: 3ef96c40-f041-4cfe-a0e0-26fe8e44f5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:26 np0005466012 podman[224092]: 2025-10-02 12:06:26.138849422 +0000 UTC m=+0.058703371 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:06:26 np0005466012 podman[224093]: 2025-10-02 12:06:26.217582728 +0000 UTC m=+0.125561738 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.353 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "661e118d-4849-4ccd-a03f-a0edb70948ea" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.354 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.383 2 DEBUG nova.compute.manager [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.525 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.527 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.534 2 DEBUG nova.virt.hardware [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.534 2 INFO nova.compute.claims [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.750 2 DEBUG nova.compute.provider_tree [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.764 2 DEBUG nova.scheduler.client.report [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.794 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.795 2 DEBUG nova.compute.manager [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.903 2 DEBUG nova.compute.manager [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.904 2 DEBUG nova.network.neutron [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.932 2 INFO nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:06:27 np0005466012 nova_compute[192063]: 2025-10-02 12:06:27.953 2 DEBUG nova.compute.manager [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.148 2 DEBUG nova.compute.manager [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.149 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.150 2 INFO nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Creating image(s)#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.150 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "/var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.151 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "/var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.152 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "/var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.167 2 DEBUG oslo_concurrency.processutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.245 2 DEBUG oslo_concurrency.processutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.246 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.248 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.262 2 DEBUG oslo_concurrency.processutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.328 2 DEBUG oslo_concurrency.processutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.329 2 DEBUG oslo_concurrency.processutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.422 2 DEBUG nova.policy [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.520 2 DEBUG oslo_concurrency.processutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk 1073741824" returned: 0 in 0.191s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.522 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.522 2 DEBUG oslo_concurrency.processutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.582 2 DEBUG oslo_concurrency.processutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.583 2 DEBUG nova.virt.disk.api [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Checking if we can resize image /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.584 2 DEBUG oslo_concurrency.processutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.641 2 DEBUG oslo_concurrency.processutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.642 2 DEBUG nova.virt.disk.api [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Cannot resize image /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.642 2 DEBUG nova.objects.instance [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'migration_context' on Instance uuid 661e118d-4849-4ccd-a03f-a0edb70948ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.670 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.670 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Ensure instance console log exists: /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.671 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.671 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:28 np0005466012 nova_compute[192063]: 2025-10-02 12:06:28.671 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:29 np0005466012 nova_compute[192063]: 2025-10-02 12:06:29.617 2 DEBUG nova.network.neutron [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Successfully created port: bfbbaed4-8dd4-4ee3-bc31-049983ebccab _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:06:29 np0005466012 nova_compute[192063]: 2025-10-02 12:06:29.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:29 np0005466012 nova_compute[192063]: 2025-10-02 12:06:29.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:30 np0005466012 podman[224171]: 2025-10-02 12:06:30.147472731 +0000 UTC m=+0.057911609 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:06:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:30Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:da:05:83 10.100.0.7
Oct  2 08:06:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:30Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:da:05:83 10.100.0.7
Oct  2 08:06:30 np0005466012 nova_compute[192063]: 2025-10-02 12:06:30.933 2 DEBUG nova.network.neutron [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Successfully updated port: bfbbaed4-8dd4-4ee3-bc31-049983ebccab _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:06:30 np0005466012 nova_compute[192063]: 2025-10-02 12:06:30.977 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "refresh_cache-661e118d-4849-4ccd-a03f-a0edb70948ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:30 np0005466012 nova_compute[192063]: 2025-10-02 12:06:30.977 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquired lock "refresh_cache-661e118d-4849-4ccd-a03f-a0edb70948ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:30 np0005466012 nova_compute[192063]: 2025-10-02 12:06:30.978 2 DEBUG nova.network.neutron [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:06:31 np0005466012 nova_compute[192063]: 2025-10-02 12:06:31.234 2 DEBUG nova.compute.manager [req-53d4012e-3557-41c3-9fc6-0e21fd695f4f req-eaddf261-3193-4de5-89c0-9d482d814f93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Received event network-changed-bfbbaed4-8dd4-4ee3-bc31-049983ebccab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:31 np0005466012 nova_compute[192063]: 2025-10-02 12:06:31.234 2 DEBUG nova.compute.manager [req-53d4012e-3557-41c3-9fc6-0e21fd695f4f req-eaddf261-3193-4de5-89c0-9d482d814f93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Refreshing instance network info cache due to event network-changed-bfbbaed4-8dd4-4ee3-bc31-049983ebccab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:06:31 np0005466012 nova_compute[192063]: 2025-10-02 12:06:31.235 2 DEBUG oslo_concurrency.lockutils [req-53d4012e-3557-41c3-9fc6-0e21fd695f4f req-eaddf261-3193-4de5-89c0-9d482d814f93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-661e118d-4849-4ccd-a03f-a0edb70948ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:06:31 np0005466012 nova_compute[192063]: 2025-10-02 12:06:31.483 2 DEBUG nova.network.neutron [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:06:33 np0005466012 podman[224192]: 2025-10-02 12:06:33.166506175 +0000 UTC m=+0.071404311 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.415 2 DEBUG nova.network.neutron [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Updating instance_info_cache with network_info: [{"id": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "address": "fa:16:3e:e1:11:35", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfbbaed4-8d", "ovs_interfaceid": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.435 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Releasing lock "refresh_cache-661e118d-4849-4ccd-a03f-a0edb70948ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.435 2 DEBUG nova.compute.manager [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Instance network_info: |[{"id": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "address": "fa:16:3e:e1:11:35", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfbbaed4-8d", "ovs_interfaceid": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.436 2 DEBUG oslo_concurrency.lockutils [req-53d4012e-3557-41c3-9fc6-0e21fd695f4f req-eaddf261-3193-4de5-89c0-9d482d814f93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-661e118d-4849-4ccd-a03f-a0edb70948ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.436 2 DEBUG nova.network.neutron [req-53d4012e-3557-41c3-9fc6-0e21fd695f4f req-eaddf261-3193-4de5-89c0-9d482d814f93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Refreshing network info cache for port bfbbaed4-8dd4-4ee3-bc31-049983ebccab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.439 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Start _get_guest_xml network_info=[{"id": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "address": "fa:16:3e:e1:11:35", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfbbaed4-8d", "ovs_interfaceid": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.443 2 WARNING nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.556 2 DEBUG nova.virt.libvirt.host [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.557 2 DEBUG nova.virt.libvirt.host [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.561 2 DEBUG nova.virt.libvirt.host [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.562 2 DEBUG nova.virt.libvirt.host [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.563 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.563 2 DEBUG nova.virt.hardware [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.564 2 DEBUG nova.virt.hardware [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.564 2 DEBUG nova.virt.hardware [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.564 2 DEBUG nova.virt.hardware [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.564 2 DEBUG nova.virt.hardware [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.565 2 DEBUG nova.virt.hardware [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.565 2 DEBUG nova.virt.hardware [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.565 2 DEBUG nova.virt.hardware [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.566 2 DEBUG nova.virt.hardware [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.566 2 DEBUG nova.virt.hardware [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.566 2 DEBUG nova.virt.hardware [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.569 2 DEBUG nova.virt.libvirt.vif [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:06:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-64695182',display_name='tempest-ServersAdminTestJSON-server-64695182',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-64695182',id=36,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-kv2dupf6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJSON-1782354187-pr
oject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:06:28Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=661e118d-4849-4ccd-a03f-a0edb70948ea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "address": "fa:16:3e:e1:11:35", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfbbaed4-8d", "ovs_interfaceid": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.570 2 DEBUG nova.network.os_vif_util [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "address": "fa:16:3e:e1:11:35", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfbbaed4-8d", "ovs_interfaceid": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.570 2 DEBUG nova.network.os_vif_util [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:11:35,bridge_name='br-int',has_traffic_filtering=True,id=bfbbaed4-8dd4-4ee3-bc31-049983ebccab,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfbbaed4-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.571 2 DEBUG nova.objects.instance [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 661e118d-4849-4ccd-a03f-a0edb70948ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.585 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  <uuid>661e118d-4849-4ccd-a03f-a0edb70948ea</uuid>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  <name>instance-00000024</name>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServersAdminTestJSON-server-64695182</nova:name>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:06:34</nova:creationTime>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:06:34 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:        <nova:user uuid="9258efa4511c4bb3813eca27b75b1008">tempest-ServersAdminTestJSON-1782354187-project-member</nova:user>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:        <nova:project uuid="db3f04a20fd740c1af3139196dc928d2">tempest-ServersAdminTestJSON-1782354187</nova:project>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:        <nova:port uuid="bfbbaed4-8dd4-4ee3-bc31-049983ebccab">
Oct  2 08:06:34 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <entry name="serial">661e118d-4849-4ccd-a03f-a0edb70948ea</entry>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <entry name="uuid">661e118d-4849-4ccd-a03f-a0edb70948ea</entry>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk.config"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:e1:11:35"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <target dev="tapbfbbaed4-8d"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/console.log" append="off"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:06:34 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:06:34 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:06:34 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:06:34 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.587 2 DEBUG nova.compute.manager [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Preparing to wait for external event network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.587 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.587 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.588 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.588 2 DEBUG nova.virt.libvirt.vif [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:06:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-64695182',display_name='tempest-ServersAdminTestJSON-server-64695182',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-64695182',id=36,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-kv2dupf6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJSON-178
2354187-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:06:28Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=661e118d-4849-4ccd-a03f-a0edb70948ea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "address": "fa:16:3e:e1:11:35", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfbbaed4-8d", "ovs_interfaceid": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.588 2 DEBUG nova.network.os_vif_util [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "address": "fa:16:3e:e1:11:35", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfbbaed4-8d", "ovs_interfaceid": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.589 2 DEBUG nova.network.os_vif_util [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:11:35,bridge_name='br-int',has_traffic_filtering=True,id=bfbbaed4-8dd4-4ee3-bc31-049983ebccab,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfbbaed4-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.589 2 DEBUG os_vif [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:11:35,bridge_name='br-int',has_traffic_filtering=True,id=bfbbaed4-8dd4-4ee3-bc31-049983ebccab,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfbbaed4-8d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.590 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.590 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.594 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbfbbaed4-8d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.594 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbfbbaed4-8d, col_values=(('external_ids', {'iface-id': 'bfbbaed4-8dd4-4ee3-bc31-049983ebccab', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e1:11:35', 'vm-uuid': '661e118d-4849-4ccd-a03f-a0edb70948ea'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:34 np0005466012 NetworkManager[51207]: <info>  [1759406794.5971] manager: (tapbfbbaed4-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.604 2 INFO os_vif [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:11:35,bridge_name='br-int',has_traffic_filtering=True,id=bfbbaed4-8dd4-4ee3-bc31-049983ebccab,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfbbaed4-8d')#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.665 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.665 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.665 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No VIF found with MAC fa:16:3e:e1:11:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.666 2 INFO nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Using config drive#033[00m
Oct  2 08:06:34 np0005466012 nova_compute[192063]: 2025-10-02 12:06:34.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:35 np0005466012 nova_compute[192063]: 2025-10-02 12:06:35.731 2 INFO nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Creating config drive at /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk.config#033[00m
Oct  2 08:06:35 np0005466012 nova_compute[192063]: 2025-10-02 12:06:35.738 2 DEBUG oslo_concurrency.processutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9apb978p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:35 np0005466012 nova_compute[192063]: 2025-10-02 12:06:35.864 2 DEBUG oslo_concurrency.processutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9apb978p" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:35 np0005466012 NetworkManager[51207]: <info>  [1759406795.9167] manager: (tapbfbbaed4-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Oct  2 08:06:35 np0005466012 kernel: tapbfbbaed4-8d: entered promiscuous mode
Oct  2 08:06:35 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:35Z|00104|binding|INFO|Claiming lport bfbbaed4-8dd4-4ee3-bc31-049983ebccab for this chassis.
Oct  2 08:06:35 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:35Z|00105|binding|INFO|bfbbaed4-8dd4-4ee3-bc31-049983ebccab: Claiming fa:16:3e:e1:11:35 10.100.0.4
Oct  2 08:06:35 np0005466012 nova_compute[192063]: 2025-10-02 12:06:35.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:35.925 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:11:35 10.100.0.4'], port_security=['fa:16:3e:e1:11:35 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db3f04a20fd740c1af3139196dc928d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c69e6497-c2d4-4cc0-a1d9-2c5055cc5d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dc739b2-072d-4dd4-b9d2-9724145d12f5, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=bfbbaed4-8dd4-4ee3-bc31-049983ebccab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:35.927 103246 INFO neutron.agent.ovn.metadata.agent [-] Port bfbbaed4-8dd4-4ee3-bc31-049983ebccab in datapath 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e bound to our chassis#033[00m
Oct  2 08:06:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:35.929 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e#033[00m
Oct  2 08:06:35 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:35Z|00106|binding|INFO|Setting lport bfbbaed4-8dd4-4ee3-bc31-049983ebccab ovn-installed in OVS
Oct  2 08:06:35 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:35Z|00107|binding|INFO|Setting lport bfbbaed4-8dd4-4ee3-bc31-049983ebccab up in Southbound
Oct  2 08:06:35 np0005466012 nova_compute[192063]: 2025-10-02 12:06:35.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:35 np0005466012 nova_compute[192063]: 2025-10-02 12:06:35.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:35.944 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1ce16c-8379-470f-9be1-9dc0cc04dc67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:35 np0005466012 systemd-udevd[224234]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:06:35 np0005466012 systemd-machined[152114]: New machine qemu-17-instance-00000024.
Oct  2 08:06:35 np0005466012 NetworkManager[51207]: <info>  [1759406795.9593] device (tapbfbbaed4-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:06:35 np0005466012 NetworkManager[51207]: <info>  [1759406795.9607] device (tapbfbbaed4-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:06:35 np0005466012 systemd[1]: Started Virtual Machine qemu-17-instance-00000024.
Oct  2 08:06:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:35.975 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[3e324093-45b6-4894-8ace-2e9f0a494493]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:35.978 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[bb0f2821-8c67-4c7f-9e87-f1241b497968]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:36.006 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[2b6f5157-381a-4316-a019-0e9cdb8c9ae0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:36.024 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[01b946ff-d840-49ea-a9d0-78cf371d677c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66b5a7c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:7b:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 5, 'rx_bytes': 874, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 5, 'rx_bytes': 874, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477047, 'reachable_time': 40640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224244, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:36.039 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc69a65-6218-4693-af25-bdcbce85c20e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477059, 'tstamp': 477059}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224248, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477063, 'tstamp': 477063}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224248, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:36.041 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66b5a7c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:36.075 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66b5a7c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:36.076 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:36.076 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66b5a7c3-f0, col_values=(('external_ids', {'iface-id': 'a0163170-212d-4aba-9028-3d5fb4d45c5b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:36.077 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.260 2 DEBUG nova.compute.manager [req-40156f20-0e37-40d9-be18-67f802ea28d9 req-1d08ee4a-a066-4628-a7d3-8af38ab20b49 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Received event network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.261 2 DEBUG oslo_concurrency.lockutils [req-40156f20-0e37-40d9-be18-67f802ea28d9 req-1d08ee4a-a066-4628-a7d3-8af38ab20b49 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.261 2 DEBUG oslo_concurrency.lockutils [req-40156f20-0e37-40d9-be18-67f802ea28d9 req-1d08ee4a-a066-4628-a7d3-8af38ab20b49 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.262 2 DEBUG oslo_concurrency.lockutils [req-40156f20-0e37-40d9-be18-67f802ea28d9 req-1d08ee4a-a066-4628-a7d3-8af38ab20b49 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.262 2 DEBUG nova.compute.manager [req-40156f20-0e37-40d9-be18-67f802ea28d9 req-1d08ee4a-a066-4628-a7d3-8af38ab20b49 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Processing event network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.697 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406796.6971185, 661e118d-4849-4ccd-a03f-a0edb70948ea => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.697 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] VM Started (Lifecycle Event)#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.699 2 DEBUG nova.compute.manager [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.704 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.707 2 INFO nova.virt.libvirt.driver [-] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Instance spawned successfully.#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.707 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.738 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.741 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.741 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.742 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.742 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.742 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.743 2 DEBUG nova.virt.libvirt.driver [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.746 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.804 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.805 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406796.6972544, 661e118d-4849-4ccd-a03f-a0edb70948ea => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.805 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.841 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.844 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406796.7018507, 661e118d-4849-4ccd-a03f-a0edb70948ea => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.844 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.872 2 INFO nova.compute.manager [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Took 8.72 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.873 2 DEBUG nova.compute.manager [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.914 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.916 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.952 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:06:36 np0005466012 nova_compute[192063]: 2025-10-02 12:06:36.984 2 INFO nova.compute.manager [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Took 9.53 seconds to build instance.#033[00m
Oct  2 08:06:37 np0005466012 nova_compute[192063]: 2025-10-02 12:06:37.002 2 DEBUG oslo_concurrency.lockutils [None req-1aceb28a-c842-495e-8ade-5015d732fd91 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.648s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:37 np0005466012 nova_compute[192063]: 2025-10-02 12:06:37.152 2 DEBUG nova.network.neutron [req-53d4012e-3557-41c3-9fc6-0e21fd695f4f req-eaddf261-3193-4de5-89c0-9d482d814f93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Updated VIF entry in instance network info cache for port bfbbaed4-8dd4-4ee3-bc31-049983ebccab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:06:37 np0005466012 nova_compute[192063]: 2025-10-02 12:06:37.152 2 DEBUG nova.network.neutron [req-53d4012e-3557-41c3-9fc6-0e21fd695f4f req-eaddf261-3193-4de5-89c0-9d482d814f93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Updating instance_info_cache with network_info: [{"id": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "address": "fa:16:3e:e1:11:35", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfbbaed4-8d", "ovs_interfaceid": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:06:37 np0005466012 nova_compute[192063]: 2025-10-02 12:06:37.175 2 DEBUG oslo_concurrency.lockutils [req-53d4012e-3557-41c3-9fc6-0e21fd695f4f req-eaddf261-3193-4de5-89c0-9d482d814f93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-661e118d-4849-4ccd-a03f-a0edb70948ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:06:38 np0005466012 nova_compute[192063]: 2025-10-02 12:06:38.518 2 DEBUG nova.compute.manager [req-b5d036c2-c756-494b-8b5d-f9bc2f57b2c8 req-800115f6-7e1e-4dcb-85e1-f7d884e9bd1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Received event network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:38 np0005466012 nova_compute[192063]: 2025-10-02 12:06:38.519 2 DEBUG oslo_concurrency.lockutils [req-b5d036c2-c756-494b-8b5d-f9bc2f57b2c8 req-800115f6-7e1e-4dcb-85e1-f7d884e9bd1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:38 np0005466012 nova_compute[192063]: 2025-10-02 12:06:38.519 2 DEBUG oslo_concurrency.lockutils [req-b5d036c2-c756-494b-8b5d-f9bc2f57b2c8 req-800115f6-7e1e-4dcb-85e1-f7d884e9bd1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:38 np0005466012 nova_compute[192063]: 2025-10-02 12:06:38.520 2 DEBUG oslo_concurrency.lockutils [req-b5d036c2-c756-494b-8b5d-f9bc2f57b2c8 req-800115f6-7e1e-4dcb-85e1-f7d884e9bd1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:38 np0005466012 nova_compute[192063]: 2025-10-02 12:06:38.520 2 DEBUG nova.compute.manager [req-b5d036c2-c756-494b-8b5d-f9bc2f57b2c8 req-800115f6-7e1e-4dcb-85e1-f7d884e9bd1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] No waiting events found dispatching network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:38 np0005466012 nova_compute[192063]: 2025-10-02 12:06:38.520 2 WARNING nova.compute.manager [req-b5d036c2-c756-494b-8b5d-f9bc2f57b2c8 req-800115f6-7e1e-4dcb-85e1-f7d884e9bd1f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Received unexpected event network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab for instance with vm_state active and task_state None.#033[00m
Oct  2 08:06:39 np0005466012 podman[224257]: 2025-10-02 12:06:39.15259555 +0000 UTC m=+0.065479112 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, org.label-schema.build-date=20251001)
Oct  2 08:06:39 np0005466012 nova_compute[192063]: 2025-10-02 12:06:39.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:39 np0005466012 nova_compute[192063]: 2025-10-02 12:06:39.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:40 np0005466012 podman[224277]: 2025-10-02 12:06:40.137286275 +0000 UTC m=+0.049600487 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, architecture=x86_64)
Oct  2 08:06:41 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:41Z|00108|binding|INFO|Releasing lport a0163170-212d-4aba-9028-3d5fb4d45c5b from this chassis (sb_readonly=0)
Oct  2 08:06:41 np0005466012 nova_compute[192063]: 2025-10-02 12:06:41.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:43 np0005466012 podman[224299]: 2025-10-02 12:06:43.138481491 +0000 UTC m=+0.056200805 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:06:43 np0005466012 podman[224298]: 2025-10-02 12:06:43.166986204 +0000 UTC m=+0.082067437 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:06:44 np0005466012 nova_compute[192063]: 2025-10-02 12:06:44.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:44 np0005466012 nova_compute[192063]: 2025-10-02 12:06:44.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:48 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:48Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e1:11:35 10.100.0.4
Oct  2 08:06:48 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:48Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e1:11:35 10.100.0.4
Oct  2 08:06:49 np0005466012 nova_compute[192063]: 2025-10-02 12:06:49.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:49 np0005466012 nova_compute[192063]: 2025-10-02 12:06:49.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:54 np0005466012 nova_compute[192063]: 2025-10-02 12:06:54.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:54 np0005466012 nova_compute[192063]: 2025-10-02 12:06:54.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:56 np0005466012 nova_compute[192063]: 2025-10-02 12:06:56.414 2 INFO nova.compute.manager [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Rebuilding instance#033[00m
Oct  2 08:06:56 np0005466012 nova_compute[192063]: 2025-10-02 12:06:56.797 2 DEBUG nova.compute.manager [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:06:56 np0005466012 nova_compute[192063]: 2025-10-02 12:06:56.873 2 DEBUG nova.objects.instance [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'pci_requests' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:56 np0005466012 nova_compute[192063]: 2025-10-02 12:06:56.891 2 DEBUG nova.objects.instance [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:56 np0005466012 nova_compute[192063]: 2025-10-02 12:06:56.905 2 DEBUG nova.objects.instance [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'resources' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:56 np0005466012 nova_compute[192063]: 2025-10-02 12:06:56.917 2 DEBUG nova.objects.instance [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'migration_context' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:56 np0005466012 nova_compute[192063]: 2025-10-02 12:06:56.930 2 DEBUG nova.objects.instance [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:06:56 np0005466012 nova_compute[192063]: 2025-10-02 12:06:56.935 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:06:57 np0005466012 podman[224349]: 2025-10-02 12:06:57.139486701 +0000 UTC m=+0.058987448 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:06:57 np0005466012 podman[224350]: 2025-10-02 12:06:57.181894806 +0000 UTC m=+0.098768643 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.031 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Acquiring lock "365f39ba-80db-4de0-ad55-45b007ea1c04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.031 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "365f39ba-80db-4de0-ad55-45b007ea1c04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.050 2 DEBUG nova.compute.manager [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:06:59 np0005466012 kernel: tap68972cf3-17 (unregistering): left promiscuous mode
Oct  2 08:06:59 np0005466012 NetworkManager[51207]: <info>  [1759406819.0890] device (tap68972cf3-17): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:59Z|00109|binding|INFO|Releasing lport 68972cf3-172b-44a5-b096-f87fe9193518 from this chassis (sb_readonly=0)
Oct  2 08:06:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:59Z|00110|binding|INFO|Setting lport 68972cf3-172b-44a5-b096-f87fe9193518 down in Southbound
Oct  2 08:06:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:06:59Z|00111|binding|INFO|Removing iface tap68972cf3-17 ovn-installed in OVS
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.112 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:05:83 10.100.0.7'], port_security=['fa:16:3e:da:05:83 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db3f04a20fd740c1af3139196dc928d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c69e6497-c2d4-4cc0-a1d9-2c5055cc5d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dc739b2-072d-4dd4-b9d2-9724145d12f5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=68972cf3-172b-44a5-b096-f87fe9193518) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.114 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 68972cf3-172b-44a5-b096-f87fe9193518 in datapath 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e unbound from our chassis#033[00m
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.115 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.133 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1b14a3ed-546a-4c6b-837e-2e50b017b065]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.137 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.137 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.144 2 DEBUG nova.virt.hardware [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.144 2 INFO nova.compute.claims [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.165 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[f946bf6e-c7f6-4988-b27e-66218ee7988e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.168 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[a48445dc-883f-4c22-ad9f-3ca1aa88dcbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:59 np0005466012 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000021.scope: Deactivated successfully.
Oct  2 08:06:59 np0005466012 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000021.scope: Consumed 14.609s CPU time.
Oct  2 08:06:59 np0005466012 systemd-machined[152114]: Machine qemu-16-instance-00000021 terminated.
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.198 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[dbba16e1-2d9f-4c7f-aedb-2f6223a8511b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.217 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[21eaf420-8182-4cf9-b081-abd23629d860]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66b5a7c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:7b:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477047, 'reachable_time': 40640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224409, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.232 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[060430f1-fb1e-4d76-81b9-51b0cec329f3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477059, 'tstamp': 477059}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224410, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477063, 'tstamp': 477063}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224410, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.234 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66b5a7c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.241 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66b5a7c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.241 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.242 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66b5a7c3-f0, col_values=(('external_ids', {'iface-id': 'a0163170-212d-4aba-9028-3d5fb4d45c5b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:06:59.242 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.317 2 DEBUG nova.compute.provider_tree [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.350 2 DEBUG nova.scheduler.client.report [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.379 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.380 2 DEBUG nova.compute.manager [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.395 2 DEBUG nova.compute.manager [req-61cf4510-3685-4620-8884-5761f03745b6 req-30613b00-9e2a-411b-8aa2-e08113316463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-unplugged-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.395 2 DEBUG oslo_concurrency.lockutils [req-61cf4510-3685-4620-8884-5761f03745b6 req-30613b00-9e2a-411b-8aa2-e08113316463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.395 2 DEBUG oslo_concurrency.lockutils [req-61cf4510-3685-4620-8884-5761f03745b6 req-30613b00-9e2a-411b-8aa2-e08113316463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.396 2 DEBUG oslo_concurrency.lockutils [req-61cf4510-3685-4620-8884-5761f03745b6 req-30613b00-9e2a-411b-8aa2-e08113316463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.396 2 DEBUG nova.compute.manager [req-61cf4510-3685-4620-8884-5761f03745b6 req-30613b00-9e2a-411b-8aa2-e08113316463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] No waiting events found dispatching network-vif-unplugged-68972cf3-172b-44a5-b096-f87fe9193518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.396 2 WARNING nova.compute.manager [req-61cf4510-3685-4620-8884-5761f03745b6 req-30613b00-9e2a-411b-8aa2-e08113316463 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received unexpected event network-vif-unplugged-68972cf3-172b-44a5-b096-f87fe9193518 for instance with vm_state error and task_state rebuilding.#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.429 2 DEBUG nova.compute.manager [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.430 2 DEBUG nova.network.neutron [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.447 2 INFO nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.466 2 DEBUG nova.compute.manager [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.586 2 DEBUG nova.compute.manager [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.587 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.587 2 INFO nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Creating image(s)#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.588 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Acquiring lock "/var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.588 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "/var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.589 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "/var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.601 2 DEBUG oslo_concurrency.processutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.662 2 DEBUG oslo_concurrency.processutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.663 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.664 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.685 2 DEBUG oslo_concurrency.processutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.742 2 DEBUG oslo_concurrency.processutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.743 2 DEBUG oslo_concurrency.processutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.786 2 DEBUG oslo_concurrency.processutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.787 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.787 2 DEBUG oslo_concurrency.processutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.855 2 DEBUG oslo_concurrency.processutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.856 2 DEBUG nova.virt.disk.api [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Checking if we can resize image /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.856 2 DEBUG oslo_concurrency.processutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.881 2 DEBUG nova.network.neutron [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.882 2 DEBUG nova.compute.manager [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.908 2 DEBUG oslo_concurrency.processutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.909 2 DEBUG nova.virt.disk.api [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Cannot resize image /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.910 2 DEBUG nova.objects.instance [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lazy-loading 'migration_context' on Instance uuid 365f39ba-80db-4de0-ad55-45b007ea1c04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.942 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.943 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Ensure instance console log exists: /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.943 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.943 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.944 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.945 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.951 2 WARNING nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.955 2 INFO nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.956 2 DEBUG nova.virt.libvirt.host [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.956 2 DEBUG nova.virt.libvirt.host [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.960 2 DEBUG nova.virt.libvirt.host [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.961 2 DEBUG nova.virt.libvirt.host [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.963 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.964 2 DEBUG nova.virt.hardware [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.964 2 DEBUG nova.virt.hardware [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.965 2 DEBUG nova.virt.hardware [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.965 2 DEBUG nova.virt.hardware [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.965 2 DEBUG nova.virt.hardware [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.966 2 DEBUG nova.virt.hardware [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.966 2 DEBUG nova.virt.hardware [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.966 2 DEBUG nova.virt.hardware [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.966 2 DEBUG nova.virt.hardware [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.967 2 DEBUG nova.virt.hardware [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.967 2 DEBUG nova.virt.hardware [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.972 2 DEBUG nova.objects.instance [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lazy-loading 'pci_devices' on Instance uuid 365f39ba-80db-4de0-ad55-45b007ea1c04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.975 2 INFO nova.virt.libvirt.driver [-] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance destroyed successfully.#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.981 2 INFO nova.virt.libvirt.driver [-] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance destroyed successfully.#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.981 2 DEBUG nova.virt.libvirt.vif [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:06:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1814602736',display_name='tempest-ServersAdminTestJSON-server-1814602736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1814602736',id=33,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:06:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-q2z1ng2k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJSON-1782354187-project-member'},tags=<?>,task_
state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:06:55Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.982 2 DEBUG nova.network.os_vif_util [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.983 2 DEBUG nova.network.os_vif_util [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.983 2 DEBUG os_vif [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.986 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68972cf3-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.989 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  <uuid>365f39ba-80db-4de0-ad55-45b007ea1c04</uuid>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  <name>instance-00000028</name>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <nova:name>tempest-ListImageFiltersTestJSON-server-1187507786</nova:name>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:06:59</nova:creationTime>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:06:59 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:        <nova:user uuid="c6a7a530a085472d8ace0b41fc888e26">tempest-ListImageFiltersTestJSON-730197453-project-member</nova:user>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:        <nova:project uuid="8993ff2640584165964db6af518beb94">tempest-ListImageFiltersTestJSON-730197453</nova:project>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <nova:ports/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <entry name="serial">365f39ba-80db-4de0-ad55-45b007ea1c04</entry>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <entry name="uuid">365f39ba-80db-4de0-ad55-45b007ea1c04</entry>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk.config"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/console.log" append="off"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:06:59 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:06:59 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:06:59 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:06:59 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.993 2 INFO os_vif [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17')#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.994 2 INFO nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Deleting instance files /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80_del#033[00m
Oct  2 08:06:59 np0005466012 nova_compute[192063]: 2025-10-02 12:06:59.994 2 INFO nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Deletion of /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80_del complete#033[00m
Oct  2 08:07:00 np0005466012 nova_compute[192063]: 2025-10-02 12:07:00.081 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:00 np0005466012 nova_compute[192063]: 2025-10-02 12:07:00.082 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:00 np0005466012 nova_compute[192063]: 2025-10-02 12:07:00.082 2 INFO nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Using config drive#033[00m
Oct  2 08:07:00 np0005466012 nova_compute[192063]: 2025-10-02 12:07:00.255 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:07:00 np0005466012 nova_compute[192063]: 2025-10-02 12:07:00.256 2 INFO nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Creating image(s)#033[00m
Oct  2 08:07:00 np0005466012 nova_compute[192063]: 2025-10-02 12:07:00.257 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:00 np0005466012 nova_compute[192063]: 2025-10-02 12:07:00.258 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:00 np0005466012 nova_compute[192063]: 2025-10-02 12:07:00.259 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:00 np0005466012 nova_compute[192063]: 2025-10-02 12:07:00.259 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "d7f074efa852dc950deac120296f6eecf48a40d2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:00 np0005466012 nova_compute[192063]: 2025-10-02 12:07:00.260 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:00 np0005466012 nova_compute[192063]: 2025-10-02 12:07:00.569 2 INFO nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Creating config drive at /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk.config#033[00m
Oct  2 08:07:00 np0005466012 nova_compute[192063]: 2025-10-02 12:07:00.578 2 DEBUG oslo_concurrency.processutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9h99w77j execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:00 np0005466012 nova_compute[192063]: 2025-10-02 12:07:00.721 2 DEBUG oslo_concurrency.processutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9h99w77j" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:00 np0005466012 systemd-machined[152114]: New machine qemu-18-instance-00000028.
Oct  2 08:07:00 np0005466012 systemd[1]: Started Virtual Machine qemu-18-instance-00000028.
Oct  2 08:07:00 np0005466012 podman[224455]: 2025-10-02 12:07:00.855915356 +0000 UTC m=+0.073003833 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.618 2 DEBUG nova.compute.manager [req-06ef8aaf-1ff5-45be-98c0-64d88c6b936e req-1793670a-31ee-4acb-a479-18ca492cdfbe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.618 2 DEBUG oslo_concurrency.lockutils [req-06ef8aaf-1ff5-45be-98c0-64d88c6b936e req-1793670a-31ee-4acb-a479-18ca492cdfbe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.619 2 DEBUG oslo_concurrency.lockutils [req-06ef8aaf-1ff5-45be-98c0-64d88c6b936e req-1793670a-31ee-4acb-a479-18ca492cdfbe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.619 2 DEBUG oslo_concurrency.lockutils [req-06ef8aaf-1ff5-45be-98c0-64d88c6b936e req-1793670a-31ee-4acb-a479-18ca492cdfbe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.619 2 DEBUG nova.compute.manager [req-06ef8aaf-1ff5-45be-98c0-64d88c6b936e req-1793670a-31ee-4acb-a479-18ca492cdfbe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] No waiting events found dispatching network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.619 2 WARNING nova.compute.manager [req-06ef8aaf-1ff5-45be-98c0-64d88c6b936e req-1793670a-31ee-4acb-a479-18ca492cdfbe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received unexpected event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 for instance with vm_state error and task_state rebuild_spawning.#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.662 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406821.6620758, 365f39ba-80db-4de0-ad55-45b007ea1c04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.663 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.665 2 DEBUG nova.compute.manager [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.665 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.668 2 INFO nova.virt.libvirt.driver [-] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Instance spawned successfully.#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.668 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.685 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.689 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.690 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.690 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.690 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.691 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.691 2 DEBUG nova.virt.libvirt.driver [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.694 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.723 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.724 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406821.6646426, 365f39ba-80db-4de0-ad55-45b007ea1c04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.724 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] VM Started (Lifecycle Event)#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.748 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.751 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.782 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.784 2 INFO nova.compute.manager [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Took 2.20 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.785 2 DEBUG nova.compute.manager [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.864 2 INFO nova.compute.manager [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Took 2.76 seconds to build instance.#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.907 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:01 np0005466012 nova_compute[192063]: 2025-10-02 12:07:01.936 2 DEBUG oslo_concurrency.lockutils [None req-3352e995-df01-4acf-9f33-65f28f4b88e0 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "365f39ba-80db-4de0-ad55-45b007ea1c04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.005 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.part --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.006 2 DEBUG nova.virt.images [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] 062d9f80-76b6-42ce-bee7-0fb82a008353 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.007 2 DEBUG nova.privsep.utils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.008 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.part /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:02.118 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:02.118 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:02.119 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.255 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.part /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.converted" returned: 0 in 0.247s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.261 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.325 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2.converted --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.326 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.338 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.389 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.390 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "d7f074efa852dc950deac120296f6eecf48a40d2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.391 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.401 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.456 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.457 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.490 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.491 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.492 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.546 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.547 2 DEBUG nova.virt.disk.api [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Checking if we can resize image /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.548 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.614 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.615 2 DEBUG nova.virt.disk.api [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Cannot resize image /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.615 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.615 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Ensure instance console log exists: /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.616 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.616 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.616 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.618 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Start _get_guest_xml network_info=[{"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.623 2 WARNING nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.629 2 DEBUG nova.virt.libvirt.host [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.629 2 DEBUG nova.virt.libvirt.host [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.632 2 DEBUG nova.virt.libvirt.host [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.632 2 DEBUG nova.virt.libvirt.host [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.633 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.633 2 DEBUG nova.virt.hardware [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.634 2 DEBUG nova.virt.hardware [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.634 2 DEBUG nova.virt.hardware [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.634 2 DEBUG nova.virt.hardware [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.634 2 DEBUG nova.virt.hardware [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.635 2 DEBUG nova.virt.hardware [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.635 2 DEBUG nova.virt.hardware [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.635 2 DEBUG nova.virt.hardware [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.635 2 DEBUG nova.virt.hardware [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.635 2 DEBUG nova.virt.hardware [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.636 2 DEBUG nova.virt.hardware [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.636 2 DEBUG nova.objects.instance [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.653 2 DEBUG nova.virt.libvirt.vif [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:06:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1814602736',display_name='tempest-ServersAdminTestJSON-server-1814602736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1814602736',id=33,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:06:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-q2z1ng2k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJ
SON-1782354187-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:07:00Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.653 2 DEBUG nova.network.os_vif_util [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.654 2 DEBUG nova.network.os_vif_util [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.655 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  <uuid>4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80</uuid>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  <name>instance-00000021</name>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServersAdminTestJSON-server-1814602736</nova:name>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:07:02</nova:creationTime>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:07:02 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:        <nova:user uuid="9258efa4511c4bb3813eca27b75b1008">tempest-ServersAdminTestJSON-1782354187-project-member</nova:user>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:        <nova:project uuid="db3f04a20fd740c1af3139196dc928d2">tempest-ServersAdminTestJSON-1782354187</nova:project>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="062d9f80-76b6-42ce-bee7-0fb82a008353"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:        <nova:port uuid="68972cf3-172b-44a5-b096-f87fe9193518">
Oct  2 08:07:02 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <entry name="serial">4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80</entry>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <entry name="uuid">4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80</entry>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.config"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:da:05:83"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <target dev="tap68972cf3-17"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/console.log" append="off"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:07:02 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:07:02 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:07:02 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:07:02 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.656 2 DEBUG nova.compute.manager [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Preparing to wait for external event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.656 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.657 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.657 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.658 2 DEBUG nova.virt.libvirt.vif [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:06:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1814602736',display_name='tempest-ServersAdminTestJSON-server-1814602736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1814602736',id=33,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:06:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-q2z1ng2k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJ
SON-1782354187-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:07:00Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.658 2 DEBUG nova.network.os_vif_util [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.658 2 DEBUG nova.network.os_vif_util [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.659 2 DEBUG os_vif [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.660 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.660 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.662 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap68972cf3-17, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.663 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap68972cf3-17, col_values=(('external_ids', {'iface-id': '68972cf3-172b-44a5-b096-f87fe9193518', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:05:83', 'vm-uuid': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:02 np0005466012 NetworkManager[51207]: <info>  [1759406822.6656] manager: (tap68972cf3-17): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.670 2 INFO os_vif [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17')#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.719 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.719 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.719 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No VIF found with MAC fa:16:3e:da:05:83, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.720 2 INFO nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Using config drive#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.735 2 DEBUG nova.objects.instance [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:02 np0005466012 nova_compute[192063]: 2025-10-02 12:07:02.766 2 DEBUG nova.objects.instance [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'keypairs' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:03 np0005466012 nova_compute[192063]: 2025-10-02 12:07:03.693 2 INFO nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Creating config drive at /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.config#033[00m
Oct  2 08:07:03 np0005466012 nova_compute[192063]: 2025-10-02 12:07:03.703 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpydy9et_g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:03 np0005466012 nova_compute[192063]: 2025-10-02 12:07:03.850 2 DEBUG oslo_concurrency.processutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpydy9et_g" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:03 np0005466012 kernel: tap68972cf3-17: entered promiscuous mode
Oct  2 08:07:03 np0005466012 NetworkManager[51207]: <info>  [1759406823.9507] manager: (tap68972cf3-17): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Oct  2 08:07:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:03Z|00112|binding|INFO|Claiming lport 68972cf3-172b-44a5-b096-f87fe9193518 for this chassis.
Oct  2 08:07:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:03Z|00113|binding|INFO|68972cf3-172b-44a5-b096-f87fe9193518: Claiming fa:16:3e:da:05:83 10.100.0.7
Oct  2 08:07:03 np0005466012 nova_compute[192063]: 2025-10-02 12:07:03.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:03.972 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:05:83 10.100.0.7'], port_security=['fa:16:3e:da:05:83 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db3f04a20fd740c1af3139196dc928d2', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c69e6497-c2d4-4cc0-a1d9-2c5055cc5d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dc739b2-072d-4dd4-b9d2-9724145d12f5, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=68972cf3-172b-44a5-b096-f87fe9193518) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:03.974 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 68972cf3-172b-44a5-b096-f87fe9193518 in datapath 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e bound to our chassis#033[00m
Oct  2 08:07:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:03.977 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e#033[00m
Oct  2 08:07:03 np0005466012 systemd-udevd[224548]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:07:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:03Z|00114|binding|INFO|Setting lport 68972cf3-172b-44a5-b096-f87fe9193518 ovn-installed in OVS
Oct  2 08:07:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:03Z|00115|binding|INFO|Setting lport 68972cf3-172b-44a5-b096-f87fe9193518 up in Southbound
Oct  2 08:07:03 np0005466012 nova_compute[192063]: 2025-10-02 12:07:03.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:03 np0005466012 nova_compute[192063]: 2025-10-02 12:07:03.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:03 np0005466012 NetworkManager[51207]: <info>  [1759406823.9951] device (tap68972cf3-17): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:07:03 np0005466012 NetworkManager[51207]: <info>  [1759406823.9970] device (tap68972cf3-17): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:07:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:03.995 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9c0eae1d-3d62-4684-9cfe-c92a902df32b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:04 np0005466012 systemd-machined[152114]: New machine qemu-19-instance-00000021.
Oct  2 08:07:04 np0005466012 systemd[1]: Started Virtual Machine qemu-19-instance-00000021.
Oct  2 08:07:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:04.026 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[fe89168b-f0e7-4575-879b-92ffd3f3e96a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:04.031 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[be78212e-ef3b-4ef2-ae89-b8a4d1210def]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:04 np0005466012 podman[224532]: 2025-10-02 12:07:04.046385064 +0000 UTC m=+0.110932938 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:07:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:04.063 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[90d009a1-ccf9-4cb6-8abe-c57b46f26e99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:04.083 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[85d963a9-6e91-4a01-ae75-f01f3ecdf83b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66b5a7c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:7b:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 1000, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 1000, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477047, 'reachable_time': 40640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224571, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:04.105 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[937c0ff2-37bc-4624-a9c7-a76d12678184]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477059, 'tstamp': 477059}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224573, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477063, 'tstamp': 477063}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224573, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:04.108 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66b5a7c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:04.114 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66b5a7c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:04.114 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:04.115 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66b5a7c3-f0, col_values=(('external_ids', {'iface-id': 'a0163170-212d-4aba-9028-3d5fb4d45c5b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:04.116 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.293 2 DEBUG nova.compute.manager [req-ba530b0f-2f48-4f54-989b-1f0e82f44897 req-e2d9a1f6-b6a0-41b8-8b59-b445071d1017 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.294 2 DEBUG oslo_concurrency.lockutils [req-ba530b0f-2f48-4f54-989b-1f0e82f44897 req-e2d9a1f6-b6a0-41b8-8b59-b445071d1017 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.294 2 DEBUG oslo_concurrency.lockutils [req-ba530b0f-2f48-4f54-989b-1f0e82f44897 req-e2d9a1f6-b6a0-41b8-8b59-b445071d1017 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.294 2 DEBUG oslo_concurrency.lockutils [req-ba530b0f-2f48-4f54-989b-1f0e82f44897 req-e2d9a1f6-b6a0-41b8-8b59-b445071d1017 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.294 2 DEBUG nova.compute.manager [req-ba530b0f-2f48-4f54-989b-1f0e82f44897 req-e2d9a1f6-b6a0-41b8-8b59-b445071d1017 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Processing event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.918 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.918 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406824.9176056, 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.919 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] VM Started (Lifecycle Event)#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.922 2 DEBUG nova.compute.manager [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.940 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.948 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.958 2 INFO nova.virt.libvirt.driver [-] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance spawned successfully.#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.959 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.963 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Synchronizing instance power state after lifecycle event "Started"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.984 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.985 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406824.9177728, 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.985 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.992 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.992 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.996 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.997 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.997 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:04 np0005466012 nova_compute[192063]: 2025-10-02 12:07:04.998 2 DEBUG nova.virt.libvirt.driver [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:05 np0005466012 nova_compute[192063]: 2025-10-02 12:07:05.006 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:05 np0005466012 nova_compute[192063]: 2025-10-02 12:07:05.009 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406824.9253829, 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:05 np0005466012 nova_compute[192063]: 2025-10-02 12:07:05.009 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:07:05 np0005466012 nova_compute[192063]: 2025-10-02 12:07:05.044 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:05 np0005466012 nova_compute[192063]: 2025-10-02 12:07:05.049 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:05 np0005466012 nova_compute[192063]: 2025-10-02 12:07:05.073 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:07:05 np0005466012 nova_compute[192063]: 2025-10-02 12:07:05.091 2 DEBUG nova.compute.manager [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:05 np0005466012 nova_compute[192063]: 2025-10-02 12:07:05.178 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:05 np0005466012 nova_compute[192063]: 2025-10-02 12:07:05.179 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:05 np0005466012 nova_compute[192063]: 2025-10-02 12:07:05.179 2 DEBUG nova.objects.instance [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:07:05 np0005466012 nova_compute[192063]: 2025-10-02 12:07:05.264 2 DEBUG oslo_concurrency.lockutils [None req-dd7a6786-db57-4c17-845c-4501525b31a9 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:06 np0005466012 nova_compute[192063]: 2025-10-02 12:07:06.449 2 DEBUG nova.compute.manager [req-53467e47-6e9b-43fc-a281-a3843386657c req-e3bc608b-fc99-4b4c-b6b5-cf8a9f9699d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:06 np0005466012 nova_compute[192063]: 2025-10-02 12:07:06.449 2 DEBUG oslo_concurrency.lockutils [req-53467e47-6e9b-43fc-a281-a3843386657c req-e3bc608b-fc99-4b4c-b6b5-cf8a9f9699d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:06 np0005466012 nova_compute[192063]: 2025-10-02 12:07:06.449 2 DEBUG oslo_concurrency.lockutils [req-53467e47-6e9b-43fc-a281-a3843386657c req-e3bc608b-fc99-4b4c-b6b5-cf8a9f9699d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:06 np0005466012 nova_compute[192063]: 2025-10-02 12:07:06.450 2 DEBUG oslo_concurrency.lockutils [req-53467e47-6e9b-43fc-a281-a3843386657c req-e3bc608b-fc99-4b4c-b6b5-cf8a9f9699d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:06 np0005466012 nova_compute[192063]: 2025-10-02 12:07:06.450 2 DEBUG nova.compute.manager [req-53467e47-6e9b-43fc-a281-a3843386657c req-e3bc608b-fc99-4b4c-b6b5-cf8a9f9699d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] No waiting events found dispatching network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:06 np0005466012 nova_compute[192063]: 2025-10-02 12:07:06.450 2 WARNING nova.compute.manager [req-53467e47-6e9b-43fc-a281-a3843386657c req-e3bc608b-fc99-4b4c-b6b5-cf8a9f9699d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received unexpected event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:07:07 np0005466012 nova_compute[192063]: 2025-10-02 12:07:07.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:08 np0005466012 nova_compute[192063]: 2025-10-02 12:07:08.626 2 INFO nova.compute.manager [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Rebuilding instance#033[00m
Oct  2 08:07:08 np0005466012 nova_compute[192063]: 2025-10-02 12:07:08.938 2 DEBUG nova.compute.manager [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:09 np0005466012 nova_compute[192063]: 2025-10-02 12:07:09.013 2 DEBUG nova.objects.instance [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'pci_requests' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:09 np0005466012 nova_compute[192063]: 2025-10-02 12:07:09.028 2 DEBUG nova.objects.instance [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:09 np0005466012 nova_compute[192063]: 2025-10-02 12:07:09.047 2 DEBUG nova.objects.instance [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'resources' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:09 np0005466012 nova_compute[192063]: 2025-10-02 12:07:09.066 2 DEBUG nova.objects.instance [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'migration_context' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:09 np0005466012 nova_compute[192063]: 2025-10-02 12:07:09.079 2 DEBUG nova.objects.instance [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:07:09 np0005466012 nova_compute[192063]: 2025-10-02 12:07:09.082 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:07:09 np0005466012 nova_compute[192063]: 2025-10-02 12:07:09.818 2 DEBUG nova.compute.manager [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:09 np0005466012 nova_compute[192063]: 2025-10-02 12:07:09.882 2 INFO nova.compute.manager [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] instance snapshotting#033[00m
Oct  2 08:07:09 np0005466012 nova_compute[192063]: 2025-10-02 12:07:09.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:10 np0005466012 podman[224581]: 2025-10-02 12:07:10.149732846 +0000 UTC m=+0.060893930 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.165 2 INFO nova.virt.libvirt.driver [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Beginning live snapshot process#033[00m
Oct  2 08:07:10 np0005466012 podman[224601]: 2025-10-02 12:07:10.229719305 +0000 UTC m=+0.058324221 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Oct  2 08:07:10 np0005466012 virtqemud[191783]: invalid argument: disk vda does not have an active block job
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.509 2 DEBUG oslo_concurrency.processutils [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.590 2 DEBUG oslo_concurrency.processutils [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk --force-share --output=json -f qcow2" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.592 2 DEBUG oslo_concurrency.processutils [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.649 2 DEBUG oslo_concurrency.processutils [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk --force-share --output=json -f qcow2" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.662 2 DEBUG oslo_concurrency.processutils [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.718 2 DEBUG oslo_concurrency.processutils [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.719 2 DEBUG oslo_concurrency.processutils [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpybhz6t_p/55da049e5388403681e05cf4ff5cf96f.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.751 2 DEBUG oslo_concurrency.processutils [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpybhz6t_p/55da049e5388403681e05cf4ff5cf96f.delta 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.752 2 INFO nova.virt.libvirt.driver [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.796 2 DEBUG nova.virt.libvirt.guest [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.799 2 INFO nova.virt.libvirt.driver [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.832 2 DEBUG nova.privsep.utils [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:07:10 np0005466012 nova_compute[192063]: 2025-10-02 12:07:10.833 2 DEBUG oslo_concurrency.processutils [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpybhz6t_p/55da049e5388403681e05cf4ff5cf96f.delta /var/lib/nova/instances/snapshots/tmpybhz6t_p/55da049e5388403681e05cf4ff5cf96f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:11 np0005466012 nova_compute[192063]: 2025-10-02 12:07:11.095 2 DEBUG oslo_concurrency.processutils [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpybhz6t_p/55da049e5388403681e05cf4ff5cf96f.delta /var/lib/nova/instances/snapshots/tmpybhz6t_p/55da049e5388403681e05cf4ff5cf96f" returned: 0 in 0.262s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:11 np0005466012 nova_compute[192063]: 2025-10-02 12:07:11.096 2 INFO nova.virt.libvirt.driver [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:07:11 np0005466012 nova_compute[192063]: 2025-10-02 12:07:11.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:11 np0005466012 nova_compute[192063]: 2025-10-02 12:07:11.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:12 np0005466012 nova_compute[192063]: 2025-10-02 12:07:12.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:12 np0005466012 nova_compute[192063]: 2025-10-02 12:07:12.918 2 INFO nova.virt.libvirt.driver [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Snapshot image upload complete#033[00m
Oct  2 08:07:12 np0005466012 nova_compute[192063]: 2025-10-02 12:07:12.919 2 INFO nova.compute.manager [None req-11a84195-60f4-4704-82b9-2e4321efec46 c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Took 3.02 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:07:13 np0005466012 nova_compute[192063]: 2025-10-02 12:07:13.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:13 np0005466012 nova_compute[192063]: 2025-10-02 12:07:13.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:14 np0005466012 podman[224661]: 2025-10-02 12:07:14.143135287 +0000 UTC m=+0.051955920 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:07:14 np0005466012 podman[224660]: 2025-10-02 12:07:14.14324567 +0000 UTC m=+0.059205844 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:07:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:14.450 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:14.451 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:07:14 np0005466012 nova_compute[192063]: 2025-10-02 12:07:14.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:14 np0005466012 nova_compute[192063]: 2025-10-02 12:07:14.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:14 np0005466012 nova_compute[192063]: 2025-10-02 12:07:14.840 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:14 np0005466012 nova_compute[192063]: 2025-10-02 12:07:14.840 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:14 np0005466012 nova_compute[192063]: 2025-10-02 12:07:14.841 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:14 np0005466012 nova_compute[192063]: 2025-10-02 12:07:14.841 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:07:14 np0005466012 nova_compute[192063]: 2025-10-02 12:07:14.918 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:14 np0005466012 nova_compute[192063]: 2025-10-02 12:07:14.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:14 np0005466012 nova_compute[192063]: 2025-10-02 12:07:14.974 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:14 np0005466012 nova_compute[192063]: 2025-10-02 12:07:14.975 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.032 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.037 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.098 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.100 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.152 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.157 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.210 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.211 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.266 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.417 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.419 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5219MB free_disk=73.37276840209961GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.419 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.420 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.562 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.562 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 661e118d-4849-4ccd-a03f-a0edb70948ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.563 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 365f39ba-80db-4de0-ad55-45b007ea1c04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.563 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.563 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.637 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.653 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.679 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:07:15 np0005466012 nova_compute[192063]: 2025-10-02 12:07:15.679 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:16 np0005466012 nova_compute[192063]: 2025-10-02 12:07:16.680 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:16 np0005466012 nova_compute[192063]: 2025-10-02 12:07:16.682 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:16 np0005466012 nova_compute[192063]: 2025-10-02 12:07:16.682 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.920 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'name': 'tempest-ServersAdminTestJSON-server-64695182', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000024', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'db3f04a20fd740c1af3139196dc928d2', 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'hostId': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.923 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000028', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8993ff2640584165964db6af518beb94', 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'hostId': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.925 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'name': 'tempest-ServersAdminTestJSON-server-1814602736', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000021', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'db3f04a20fd740c1af3139196dc928d2', 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'hostId': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.943 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.write.bytes volume: 72888320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.943 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.964 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.write.bytes volume: 72695808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.965 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.985 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.write.bytes volume: 25645056 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.986 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fa372c5-f062-4d75-8e44-efec2e2c6908', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72888320, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-vda', 'timestamp': '2025-10-02T12:07:16.926526', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56d1b206-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.602900179, 'message_signature': 'e785025b12ae0a725ab4427317a720971d1b846624147640115b3cc76fb1cba4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-sda', 'timestamp': '2025-10-02T12:07:16.926526', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56d1c05c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.602900179, 'message_signature': 'c01ffdd1f094d6044879333226a34065ba1398ceae4d69c4631aeddf37980f62'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72695808, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-vda', 'timestamp': '2025-10-02T12:07:16.926526', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56d5000a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.620585702, 'message_signature': 'f0c6f3b52549c7aab7dbef9b5c828fb7aa04285c27d121fdd31d464c40506089'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-sda', 'timestamp': '2025-10-02T12:07:16.926526', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56d50e4c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.620585702, 'message_signature': 'e54aae952133aaf3a0cf43680a50a2e05325a0c496453a7c068913231406bbbf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25645056, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': 
None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-vda', 'timestamp': '2025-10-02T12:07:16.926526', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56d8307c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.642253502, 'message_signature': '93d2b3bebf8f772064e146098a42df2051c7194d3423eba4bd63a8b6e522290e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-sda', 'timestamp': '2025-10-02T12:07:16.926526', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'archi
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: : 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56d83f0e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.642253502, 'message_signature': 'd9c8f93f6e98256658e752f41ab72d3b7b6cb7c35de9feebdb10f190c33d5d6a'}]}, 'timestamp': '2025-10-02 12:07:16.986847', '_unique_id': 'b9c48830a61440afb1414493bb6b6aa0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.989 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.992 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 661e118d-4849-4ccd-a03f-a0edb70948ea / tapbfbbaed4-8d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.992 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/network.incoming.bytes volume: 2024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.995 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 / tap68972cf3-17 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.995 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/network.incoming.bytes volume: 1000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4371e12-8547-439c-b5d6-e808f8a525e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2024, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000024-661e118d-4849-4ccd-a03f-a0edb70948ea-tapbfbbaed4-8d', 'timestamp': '2025-10-02T12:07:16.989638', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'tapbfbbaed4-8d', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:11:35', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfbbaed4-8d'}, 'message_id': '56d9321a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.666063598, 'message_signature': '5cec4e335c0ac58b86452385ff472562a9923ccdfdd05951a93b3945e131f341'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1000, 'user_id': 
'9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000021-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-tap68972cf3-17', 'timestamp': '2025-10-02T12:07:16.989638', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'tap68972cf3-17', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:05:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68972cf3-17'}, 'message_id': '56d9a358-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.670449836, 'message_signature': '6a4c4b3257f2e20ed1185f8c94bf5e6d661bd6dcf0fd02b3ef51c8eec2e6d2b3'}]}, 'timestamp': '2025-10-02 12:07:16.995888', '_unique_id': '9cf7877e85a84e9f94b264ea953c5482'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.996 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.997 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.997 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.997 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a37d377b-2fe1-46d6-b8db-aabfebb3e336', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000024-661e118d-4849-4ccd-a03f-a0edb70948ea-tapbfbbaed4-8d', 'timestamp': '2025-10-02T12:07:16.997532', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'tapbfbbaed4-8d', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:11:35', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfbbaed4-8d'}, 'message_id': '56d9ece6-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.666063598, 'message_signature': '1b08f29632ef545f77b6a40e51f518c0c7f793810afcbb600cf3c903d2000795'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000021-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-tap68972cf3-17', 'timestamp': '2025-10-02T12:07:16.997532', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'tap68972cf3-17', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:05:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68972cf3-17'}, 'message_id': '56d9f61e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.670449836, 'message_signature': '6b264e056e93c49fa006df9187ab4373d6093237d2258e870317aee79ff6143b'}]}, 'timestamp': '2025-10-02 12:07:16.998015', '_unique_id': '9fb683355edb49d4a14879efc3b0884f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.998 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.999 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.999 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.999 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-64695182>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1187507786>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1814602736>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-64695182>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1187507786>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1814602736>]
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:16.999 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.013 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.014 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.022 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.023 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.033 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.033 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03829609-1bf0-4242-9922-d64c94bf04b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-vda', 'timestamp': '2025-10-02T12:07:16.999558', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56dc6f5c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.675980214, 'message_signature': '01864c6d775be81f7f2185e1fcdcddcb41369e438c3c1c040c586796c5e312c8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 
'661e118d-4849-4ccd-a03f-a0edb70948ea-sda', 'timestamp': '2025-10-02T12:07:16.999558', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56dc7cfe-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.675980214, 'message_signature': '31907973c3fe3dc313cde2ba831b91b2dcdf5473e1e7dd912d64b51aa69347ec'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-vda', 'timestamp': '2025-10-02T12:07:16.999558', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56ddd716-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.690955384, 'message_signature': '1b90af905ace78b2f8911e19ff7397d65fcb5bf6a83e7f7a8d881f2e6412324b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-sda', 'timestamp': '2025-10-02T12:07:16.999558', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56dde1f2-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.690955384, 'message_signature': '8f0ad926f29c914f030e5a50eaf932f8a76071ceba3a172dba738fea7e79bcbd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 
'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-vda', 'timestamp': '2025-10-02T12:07:16.999558', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56df6612-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.700262723, 'message_signature': 'a049790147dcca16d7969b2a17e2f337784cdd63cf1e80f8799b1c146d46aebd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-sda', 'timestamp': '2025-10-02T12:07:16.999558', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, '
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: ': '56df718e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.700262723, 'message_signature': 'ec6d5366c4e240532b20b5a731ec41b1377108374a2bdc0e54b93518f537fee2'}]}, 'timestamp': '2025-10-02 12:07:17.033935', '_unique_id': 'd55ce357a39b4ca38000604acd88140e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.036 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.036 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.036 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32c39e23-d4fe-487e-b599-601800066f58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000024-661e118d-4849-4ccd-a03f-a0edb70948ea-tapbfbbaed4-8d', 'timestamp': '2025-10-02T12:07:17.036134', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'tapbfbbaed4-8d', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:11:35', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfbbaed4-8d'}, 'message_id': '56dfd142-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.666063598, 'message_signature': '41a5b792781148d599c3d4f8e5203d7003eb99f368b38c7cdbfdfd8870b42b61'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000021-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-tap68972cf3-17', 'timestamp': '2025-10-02T12:07:17.036134', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'tap68972cf3-17', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:05:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68972cf3-17'}, 'message_id': '56dfd9a8-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.670449836, 'message_signature': 'e0dd3100faf1e4e039e27727743e57270683fba717587aa58ad570877388d8cd'}]}, 'timestamp': '2025-10-02 12:07:17.036591', '_unique_id': '84ea5b383e38408fa8956006c844d92d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.read.latency volume: 678487341 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.037 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.read.latency volume: 34267029 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.038 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.read.latency volume: 591974283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.038 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.read.latency volume: 125140963 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.038 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.read.latency volume: 698298503 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.038 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.read.latency volume: 225221566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63a71dc7-69e9-4669-b7e5-4cf0fa3776d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 678487341, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-vda', 'timestamp': '2025-10-02T12:07:17.037781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56e010e4-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.602900179, 'message_signature': '67ad8cbaed15e5ab8b2ae26f5b8ced20357169848b79051f27cf12aad795357b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 34267029, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': 
None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-sda', 'timestamp': '2025-10-02T12:07:17.037781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56e018f0-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.602900179, 'message_signature': '8d14d6197e835916e1f82b39478290148aed0b4ce9feea17ce492211cd9eee89'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 591974283, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-vda', 'timestamp': '2025-10-02T12:07:17.037781', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56e02084-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.620585702, 'message_signature': 'a70d248334c3dbf0d3243e6a017384b857529c800fddf4dcf3342e3431e9de1e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 125140963, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-sda', 'timestamp': '2025-10-02T12:07:17.037781', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56e027d2-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.620585702, 'message_signature': '7e9879cf8e1042e1287ad47d206d505d14bd2bf0f8ee933dd5179874737e311f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 698298503, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 
'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-vda', 'timestamp': '2025-10-02T12:07:17.037781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56e02f2a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.642253502, 'message_signature': 'e139a70bb8709487a6fc4551f4c2fff9d7991d7f1a1f24fbe4d9354d07ecceef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 225221566, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-sda', 'timestamp': '2025-10-02T12:07:17.037781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56e0374a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.642253502, 'message_signature': 'bb578166aed942747318bec4c1bbc57de58885a79358a9284cf6a733a25d1333'}]}, 'timestamp': '2025-10-02 12:07:17.038969', '_unique_id': 'fdba4aa19ab34a709f031eb6b786e0cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.040 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.040 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.040 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-64695182>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1187507786>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1814602736>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-64695182>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1187507786>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1814602736>]
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.040 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.040 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.040 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-64695182>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1187507786>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1814602736>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-64695182>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1187507786>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1814602736>]
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.040 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.055 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/memory.usage volume: 42.4453125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.071 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/memory.usage volume: 40.36328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.086 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/memory.usage volume: 40.41015625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55a262a3-140c-460f-9229-0865ebd883d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.4453125, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'timestamp': '2025-10-02T12:07:17.040783', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '56e2d040-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.731801777, 'message_signature': '3e1657c503410a575860bf2ad29799942482df2a0e51906d9ff445a36ec5a2a9'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.36328125, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'timestamp': '2025-10-02T12:07:17.040783', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '56e53150-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.747299551, 'message_signature': '96f71327ed1c00ffa9e7a8eb2ae0b0661e46ca8e80a0161c0d56050e2ccc0d24'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.41015625, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'timestamp': '2025-10-02T12:07:17.040783', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '56e798aa-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.762908208, 'message_signature': 'e3e969e7b83b3af98ed3f460e50817b19db6046c27e0eab4874b656ec3acdbb7'}]}, 'timestamp': '2025-10-02 12:07:17.087554', '_unique_id': '9477ccb676cd4a26ba9c45ae5eee583e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.089 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.090 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.090 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.090 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/network.outgoing.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b65a61c-9418-4ca4-b936-fb74f4874566', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000024-661e118d-4849-4ccd-a03f-a0edb70948ea-tapbfbbaed4-8d', 'timestamp': '2025-10-02T12:07:17.090186', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'tapbfbbaed4-8d', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:11:35', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfbbaed4-8d'}, 'message_id': '56e81294-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.666063598, 'message_signature': '4a820289dc39760a9eadec3594cb65e57308c32dd2df9bc8cbf67a4dd79e1d5a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000021-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-tap68972cf3-17', 'timestamp': '2025-10-02T12:07:17.090186', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'tap68972cf3-17', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:05:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68972cf3-17'}, 'message_id': '56e81e4c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.670449836, 'message_signature': 'd817218740dbe788b73b05e31d001388b0282625c439b9b6c7bcf68fd6d453dc'}]}, 'timestamp': '2025-10-02 12:07:17.090851', '_unique_id': '3ea67aeccd3842e99cd3c5a47f9cd22c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.091 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.092 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.092 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84ff25fa-2dfb-41ca-9252-14a58f41317b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000024-661e118d-4849-4ccd-a03f-a0edb70948ea-tapbfbbaed4-8d', 'timestamp': '2025-10-02T12:07:17.092547', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'tapbfbbaed4-8d', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:11:35', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfbbaed4-8d'}, 'message_id': '56e86e2e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.666063598, 'message_signature': '440ddcf1df67c06aa80cadaa37e0647377ab1d15d9f97e359fa131c47dbe83c3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000021-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-tap68972cf3-17', 'timestamp': '2025-10-02T12:07:17.092547', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'tap68972cf3-17', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:05:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68972cf3-17'}, 'message_id': '56e876bc-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.670449836, 'message_signature': '886efc345338b0235b3007f3041a2a5e9b3789c44aeb7ddfffc7093e9dcd169d'}]}, 'timestamp': '2025-10-02 12:07:17.093055', '_unique_id': 'fb1cebccc3fa41ae883884a80409bc96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.093 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.094 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.094 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.094 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/network.outgoing.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '349078fd-5fed-40aa-a2c0-b3af8d697b18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000024-661e118d-4849-4ccd-a03f-a0edb70948ea-tapbfbbaed4-8d', 'timestamp': '2025-10-02T12:07:17.094223', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'tapbfbbaed4-8d', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:11:35', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfbbaed4-8d'}, 'message_id': '56e8add0-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.666063598, 'message_signature': 'b76851240c09d96e896807b7e5a1a2dc04fe802f4cff689d5bad895e99ce5373'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000021-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-tap68972cf3-17', 'timestamp': '2025-10-02T12:07:17.094223', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'tap68972cf3-17', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:05:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68972cf3-17'}, 'message_id': '56e8b762-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.670449836, 'message_signature': '719a0ea179fc30161ac6a2604ac52099091d378112b8c4ef0de4f07b989e2d75'}]}, 'timestamp': '2025-10-02 12:07:17.094712', '_unique_id': 'be96afb19c2948ad9e3bfc3e409833c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.096 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.096 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2be18058-f78e-447d-8c3c-a2c63cd3865d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000024-661e118d-4849-4ccd-a03f-a0edb70948ea-tapbfbbaed4-8d', 'timestamp': '2025-10-02T12:07:17.096000', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'tapbfbbaed4-8d', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:11:35', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfbbaed4-8d'}, 'message_id': '56e8f434-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.666063598, 'message_signature': '102193a21567698049cbc266a85aa1640c7584c76a45d22c7f7b2165d4235419'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000021-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-tap68972cf3-17', 'timestamp': '2025-10-02T12:07:17.096000', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'tap68972cf3-17', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:05:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68972cf3-17'}, 'message_id': '56e9014a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.670449836, 'message_signature': '7d714b26cfd003f3d47fa24fced97408e500022dbff2326779742ff5114e7fdc'}]}, 'timestamp': '2025-10-02 12:07:17.096613', '_unique_id': '5dd6dd88ce5b40e5965707e5a76638c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.097 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/network.incoming.packets volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.098 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26815b1f-9f21-4bd4-803f-48bb5efd068b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 21, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000024-661e118d-4849-4ccd-a03f-a0edb70948ea-tapbfbbaed4-8d', 'timestamp': '2025-10-02T12:07:17.097943', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'tapbfbbaed4-8d', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:11:35', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfbbaed4-8d'}, 'message_id': '56e93f48-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.666063598, 'message_signature': '7e08b1e4c52c34f79b5443020480f68197667f5b77c431e3e3d3266fd3cfe98f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 
'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000021-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-tap68972cf3-17', 'timestamp': '2025-10-02T12:07:17.097943', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'tap68972cf3-17', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:05:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68972cf3-17'}, 'message_id': '56e947c2-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.670449836, 'message_signature': '1c2fcbcaae2951ba724fd52fed297f9aab864822741ea0c039f1241cb636fda1'}]}, 'timestamp': '2025-10-02 12:07:17.098428', '_unique_id': '357efa4499a0413a84d8cbd1cf9bfc03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.099 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/cpu volume: 11840000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.100 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/cpu volume: 11220000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.100 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/cpu volume: 10970000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ea05942-9bb6-4f83-892c-881b34139bb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11840000000, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'timestamp': '2025-10-02T12:07:17.099744', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '56e98782-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.731801777, 'message_signature': '58fb837354fac7e18f94dc3fcf038fe9e5d718b70a7f58e768113e160cf2fff8'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11220000000, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': 
'365f39ba-80db-4de0-ad55-45b007ea1c04', 'timestamp': '2025-10-02T12:07:17.099744', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '56e99006-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.747299551, 'message_signature': '8e7a71097a3ba1b7a3f6f181dcd6096b906a7464df2ce59e435760e682644301'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10970000000, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'timestamp': '2025-10-02T12:07:17.099744', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 
'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '56e99790-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.762908208, 'message_signature': 'a205e4d755ecf3e413fdf3083f7b081072ccc3ce305d0896f77c719848cd96f3'}]}, 'timestamp': '2025-10-02 12:07:17.100421', '_unique_id': '74ba9eb197784b09bf5b647d2273a2bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.write.latency volume: 3138145458 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.101 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.102 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.write.latency volume: 2488814091 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.102 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.102 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.write.latency volume: 2880171873 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.102 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '813d6ce8-b82f-462e-a5aa-83f40a6fc163', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3138145458, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-vda', 'timestamp': '2025-10-02T12:07:17.101677', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56e9d318-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.602900179, 'message_signature': 'e1cee96bf1aa36dc03336b37f51b5ff5316625330f703364b138a072e77d2507'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 
'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-sda', 'timestamp': '2025-10-02T12:07:17.101677', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56e9dde0-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.602900179, 'message_signature': '8245bde947a38eab5937a2a837a3c97ba1e1dfdcb69cacd94e8002b86edad15f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2488814091, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-vda', 'timestamp': '2025-10-02T12:07:17.101677', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56e9e5e2-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.620585702, 'message_signature': '525bea97c15d64aa2bf595df4998f623ca3dff726572bceaa3512daad1c18087'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-sda', 'timestamp': '2025-10-02T12:07:17.101677', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56e9ed44-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.620585702, 'message_signature': '00e73a904f17a1fb3cf0dbf0d8e14328d229a45763002534a9f7bc93d29a1aaa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2880171873, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 
'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-vda', 'timestamp': '2025-10-02T12:07:17.101677', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56e9f64a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.642253502, 'message_signature': 'afbf7b33bd86de03dec9da325d20f73b56f57eebb6a099409ee41b58e47496ed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-sda', 'timestamp': '2025-10-02T12:07:17.101677', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'imag
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: k_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56e9ff32-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.642253502, 'message_signature': 'd0e8073055dc1ee0fb4af17f560b58591403ff4e1af126f977504917219aab2e'}]}, 'timestamp': '2025-10-02 12:07:17.103098', '_unique_id': 'b36f893cfc4f42269fb7f3b6e86d6532'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.104 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.read.requests volume: 1132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.104 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.104 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.read.requests volume: 1072 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.105 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.105 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.read.requests volume: 975 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.105 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.read.requests volume: 95 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e8f3286-9328-4940-bdb2-a66ccfbdb1d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1132, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-vda', 'timestamp': '2025-10-02T12:07:17.104495', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56ea3f2e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.602900179, 'message_signature': 'f8cfbceaf61fd1c014302072a887d67be36b0670cb175cf758c19a2af7aa6a3a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': 
None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-sda', 'timestamp': '2025-10-02T12:07:17.104495', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56ea4956-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.602900179, 'message_signature': '71818ab8ac27b0901c749d3f81bf3cb9b9d4b17b71595723d843cd394f4904b9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1072, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-vda', 'timestamp': '2025-10-02T12:07:17.104495', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56ea51da-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.620585702, 'message_signature': '1e46327e7f168dc77e6b5497bac07cdb55ec22b43ccbb619da8bc6e04d12f51b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-sda', 'timestamp': '2025-10-02T12:07:17.104495', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56ea5aae-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.620585702, 'message_signature': '6f8b0ec217096a013d23904477a789910356c0873859b77d57f99f471a7d98a9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 975, 'user_id': 
'9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-vda', 'timestamp': '2025-10-02T12:07:17.104495', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56ea621a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.642253502, 'message_signature': '87b82ceb17d09d57729107fb317395c69a681684a359492d696393f8de2ba65f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 95, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-sda', 'timestamp': '2025-10-02T12:07:17.104495', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 
128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb8
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: y_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56ea6a44-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.642253502, 'message_signature': '46004232e8fa004e16145a91af5ed4ae3d2376d8a264b0f4be8592a16654b490'}]}, 'timestamp': '2025-10-02 12:07:17.105828', '_unique_id': 'f59fe5f80a85485995bccdc175ca6937'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.107 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.write.requests volume: 290 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.107 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.107 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.write.requests volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.107 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.108 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.write.requests volume: 234 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.108 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'caa37f2b-438b-4370-8793-3345781855c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 290, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-vda', 'timestamp': '2025-10-02T12:07:17.107179', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56eaa9b4-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.602900179, 'message_signature': '9c930ba1a46bbe57881cba59fe21f7c4f0939b3778335ad003262fcd503eac66'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': 
None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-sda', 'timestamp': '2025-10-02T12:07:17.107179', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56eab184-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.602900179, 'message_signature': 'fde53c0ecf8bd4dc2240016dcb959a90d411757ce4093c250fb9fe8b52999efe'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 308, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-vda', 'timestamp': '2025-10-02T12:07:17.107179', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56eab9f4-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.620585702, 'message_signature': '63c0c47161b48da39b45666baa61470252d93d30773fd5d7676a478cb0fdba90'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-sda', 'timestamp': '2025-10-02T12:07:17.107179', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56eac156-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.620585702, 'message_signature': '02a74aaac43aa6a86c47cabf335cd3d446596f6b28df46fed3c9e6d3cba2c38b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 234, 'user_id': 
'9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-vda', 'timestamp': '2025-10-02T12:07:17.107179', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56eac89a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.642253502, 'message_signature': '17ae8a0dc25d5c79e7b8d547012e12a410f00efefac8b1d8443f9cadcff2a4a5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-sda', 'timestamp': '2025-10-02T12:07:17.107179', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 
128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: _mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56ead100-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.642253502, 'message_signature': '46bb9a04e032396742263d2ab2ccfb05f31a0c648b824cbfb36665b8632387e4'}]}, 'timestamp': '2025-10-02 12:07:17.108450', '_unique_id': '0e5d56802a4046cc926c72771f4621b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.109 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-64695182>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1187507786>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1814602736>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-64695182>, <NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1187507786>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1814602736>]
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.110 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.110 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78415332-da3e-4cc7-904e-2b5f356e7650', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000024-661e118d-4849-4ccd-a03f-a0edb70948ea-tapbfbbaed4-8d', 'timestamp': '2025-10-02T12:07:17.110168', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'tapbfbbaed4-8d', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:11:35', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfbbaed4-8d'}, 'message_id': '56eb1cb4-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.666063598, 'message_signature': '71fe84e41ba655209f14b19220ff94a601de5023e37a5552546112296b16b01d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000021-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-tap68972cf3-17', 'timestamp': '2025-10-02T12:07:17.110168', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'tap68972cf3-17', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:05:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68972cf3-17'}, 'message_id': '56eb26d2-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.670449836, 'message_signature': '2134c12fc8a8329b08c95ed19ace363d8df8a21b57dd9e2b680f4086a815a0d8'}]}, 'timestamp': '2025-10-02 12:07:17.110653', '_unique_id': '065a48c616864890a61532a26bf1dbf4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.111 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.112 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.112 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.112 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.112 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.allocation volume: 28450816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.112 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f97c21c-58c5-4e0d-b9bc-6bef15fb3db6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-vda', 'timestamp': '2025-10-02T12:07:17.111873', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56eb5f12-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.675980214, 'message_signature': '25082b7773c8cd50024c4cdfd4e405ab5f4ecf28cf2320efb08af7ea954114a5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 
'661e118d-4849-4ccd-a03f-a0edb70948ea-sda', 'timestamp': '2025-10-02T12:07:17.111873', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56eb669c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.675980214, 'message_signature': '41b1bf0e38ccd26e833ac6688d0b6433f81c115ad3dd2fa2f595a8e646cbfcda'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-vda', 'timestamp': '2025-10-02T12:07:17.111873', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56eb6e1c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.690955384, 'message_signature': '40b02e85452ec8b818fa9eb4407b3db1b69c54107eeeff6890a5b4ae9b90147d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-sda', 'timestamp': '2025-10-02T12:07:17.111873', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56eb770e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.690955384, 'message_signature': '48c10290f66142a594f731b01f7e5241c270495d59d5c21aff5565b5ad86872b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28450816, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 
'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-vda', 'timestamp': '2025-10-02T12:07:17.111873', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56eb7f92-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.700262723, 'message_signature': '3f2c294f78e7aef4fa47b7c21353ad1688316bbf58785fbeae7cbc4182ad335d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-sda', 'timestamp': '2025-10-02T12:07:17.111873', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', '
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: isk_name': 'sda'}, 'message_id': '56eb86d6-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.700262723, 'message_signature': '64c50af1b67259d45814afdeb08839b72fd2b91027b499b82fbfb1bfdd81c5d5'}]}, 'timestamp': '2025-10-02 12:07:17.113098', '_unique_id': '38c13d83c76c4eb3aa392f982713de01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.114 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.114 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.114 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.114 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.114 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.115 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.115 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d618661-531f-438a-a016-ccb2c4daff04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-vda', 'timestamp': '2025-10-02T12:07:17.114293', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56ebbd7c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.675980214, 'message_signature': 'f61df488cca94f5fc942901f6565e7183ab5c906a89cd9df6b83078dd0c74a90'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 
'661e118d-4849-4ccd-a03f-a0edb70948ea-sda', 'timestamp': '2025-10-02T12:07:17.114293', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56ebc556-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.675980214, 'message_signature': '859d09f4ba08c7179cee00530c40c3909f1d1a2b3b3f7aecefde34290a5d8fc0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-vda', 'timestamp': '2025-10-02T12:07:17.114293', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56ebced4-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.690955384, 'message_signature': '5ef4ae7a8bacb96306cb31f2c9225a6fb963f523b310006b8c3c75f0f1469dd1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-sda', 'timestamp': '2025-10-02T12:07:17.114293', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56ebd636-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.690955384, 'message_signature': 'bd8f27e2d98e100b717b205a2d0bf075db95155c0da0f78084bb9b192bc30cdf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 
'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-vda', 'timestamp': '2025-10-02T12:07:17.114293', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56ebde38-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.700262723, 'message_signature': 'baba2a061fdbc0c78b49e7b7ae0dffba9da8b0dedf1593cd4de6dfa097cf1c1b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-sda', 'timestamp': '2025-10-02T12:07:17.114293', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_typ
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: me': 'sda'}, 'message_id': '56ebe5a4-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.700262723, 'message_signature': '11ccfe3d38cea30bccc4ae64e6c50767a9ac209e44dfc5b58219c44c9809e1b4'}]}, 'timestamp': '2025-10-02 12:07:17.115525', '_unique_id': '76f84a40709640f68e25cccacecb42af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.117 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.117 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6416837-afdd-416a-aa75-e49ab95ae8fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000024-661e118d-4849-4ccd-a03f-a0edb70948ea-tapbfbbaed4-8d', 'timestamp': '2025-10-02T12:07:17.117315', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'tapbfbbaed4-8d', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:11:35', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfbbaed4-8d'}, 'message_id': '56ec341e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.666063598, 'message_signature': 'c55d191cbb79d5afb0185e94d1dce4a1dfd20831fc798112d81f83e58b175cc7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': 'instance-00000021-4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-tap68972cf3-17', 'timestamp': '2025-10-02T12:07:17.117315', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'tap68972cf3-17', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:05:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68972cf3-17'}, 'message_id': '56ec3ca2-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.670449836, 'message_signature': '2f93fc674b5c34743eb4ab913c186479f7e52a2e648d118acc1a3ce21a013aff'}]}, 'timestamp': '2025-10-02 12:07:17.117823', '_unique_id': '2dc29db6aed546e98c285a0ae4c4ab5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.119 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.read.bytes volume: 31005184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.119 12 DEBUG ceilometer.compute.pollsters [-] 661e118d-4849-4ccd-a03f-a0edb70948ea/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.119 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.read.bytes volume: 29964800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.119 12 DEBUG ceilometer.compute.pollsters [-] 365f39ba-80db-4de0-ad55-45b007ea1c04/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.119 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.read.bytes volume: 28371968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 DEBUG ceilometer.compute.pollsters [-] 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.device.read.bytes volume: 221502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7cec1f4-00ab-4dcc-a83e-464a1b931c76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31005184, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-vda', 'timestamp': '2025-10-02T12:07:17.119006', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56ec764a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.602900179, 'message_signature': '147a2d015351af9f3174246c0c528bfbccae0a830fc52d8fdda5cf321e5dc120'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 
'resource_id': '661e118d-4849-4ccd-a03f-a0edb70948ea-sda', 'timestamp': '2025-10-02T12:07:17.119006', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-64695182', 'name': 'instance-00000024', 'instance_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56ec7e2e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.602900179, 'message_signature': '968d53d8c48a6ee773eaba6b9e928076638df9e29641632bed08103be756f7bb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29964800, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-vda', 'timestamp': '2025-10-02T12:07:17.119006', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56ec8586-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.620585702, 'message_signature': '23cd6d2d871d05ffab639267a0e82cee9d5dea3eab9ad142c61ec67dbc83256d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'c6a7a530a085472d8ace0b41fc888e26', 'user_name': None, 'project_id': '8993ff2640584165964db6af518beb94', 'project_name': None, 'resource_id': '365f39ba-80db-4de0-ad55-45b007ea1c04-sda', 'timestamp': '2025-10-02T12:07:17.119006', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1187507786', 'name': 'instance-00000028', 'instance_id': '365f39ba-80db-4de0-ad55-45b007ea1c04', 'instance_type': 'm1.nano', 'host': '3778044e2c59b6373f0c67e7aaa4c60697c1e3738f691c7732224ff2', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56ec8d60-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.620585702, 'message_signature': '11bc8192f1f7dd29de1daeaf6844e1e1447ba5341ccc2144be4ef264c192bb3c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28371968, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': 
None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-vda', 'timestamp': '2025-10-02T12:07:17.119006', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '56ec9634-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.642253502, 'message_signature': '9dee54576e94f89b118fc029b1261187f602d5d5e4a229721d36f0b3079f62f5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 221502, 'user_id': '9258efa4511c4bb3813eca27b75b1008', 'user_name': None, 'project_id': 'db3f04a20fd740c1af3139196dc928d2', 'project_name': None, 'resource_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-sda', 'timestamp': '2025-10-02T12:07:17.119006', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1814602736', 'name': 'instance-00000021', 'instance_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'instance_type': 'm1.nano', 'host': '93e674503b756404d144790ea01c8f5ac14d71b2cc4e5027ee9033da', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': Non
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: meral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '56ec9da0-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4830.642253502, 'message_signature': '15ca34b34184cc56c3c44fcc3eb868bbe6b35df4611773bd44b188af43cd8cdb'}]}, 'timestamp': '2025-10-02 12:07:17.120234', '_unique_id': '6fd1f025a95945149db39ab8d5a2846f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:07:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:07:17 np0005466012 rsyslogd[1011]: message too long (8192) with configured size 8096, begin of message is: 2025-10-02 12:07:16.988 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  2 08:07:17 np0005466012 rsyslogd[1011]: message too long (8192) with configured size 8096, begin of message is: 2025-10-02 12:07:17.034 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  2 08:07:17 np0005466012 rsyslogd[1011]: message too long (8192) with configured size 8096, begin of message is: 2025-10-02 12:07:17.039 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  2 08:07:17 np0005466012 rsyslogd[1011]: message too long (8192) with configured size 8096, begin of message is: 2025-10-02 12:07:17.103 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  2 08:07:17 np0005466012 rsyslogd[1011]: message too long (8192) with configured size 8096, begin of message is: 2025-10-02 12:07:17.106 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  2 08:07:17 np0005466012 rsyslogd[1011]: message too long (8192) with configured size 8096, begin of message is: 2025-10-02 12:07:17.109 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  2 08:07:17 np0005466012 rsyslogd[1011]: message too long (8192) with configured size 8096, begin of message is: 2025-10-02 12:07:17.113 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  2 08:07:17 np0005466012 rsyslogd[1011]: message too long (8192) with configured size 8096, begin of message is: 2025-10-02 12:07:17.116 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  2 08:07:17 np0005466012 rsyslogd[1011]: message too long (8192) with configured size 8096, begin of message is: 2025-10-02 12:07:17.120 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  2 08:07:17 np0005466012 nova_compute[192063]: 2025-10-02 12:07:17.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:18 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:18Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:da:05:83 10.100.0.7
Oct  2 08:07:18 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:18Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:da:05:83 10.100.0.7
Oct  2 08:07:19 np0005466012 nova_compute[192063]: 2025-10-02 12:07:19.132 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:07:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:19.453 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:19 np0005466012 nova_compute[192063]: 2025-10-02 12:07:19.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:07:19 np0005466012 nova_compute[192063]: 2025-10-02 12:07:19.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:07:19 np0005466012 nova_compute[192063]: 2025-10-02 12:07:19.824 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:07:19 np0005466012 nova_compute[192063]: 2025-10-02 12:07:19.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:20 np0005466012 nova_compute[192063]: 2025-10-02 12:07:20.456 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-661e118d-4849-4ccd-a03f-a0edb70948ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:07:20 np0005466012 nova_compute[192063]: 2025-10-02 12:07:20.457 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-661e118d-4849-4ccd-a03f-a0edb70948ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:07:20 np0005466012 nova_compute[192063]: 2025-10-02 12:07:20.457 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:07:20 np0005466012 nova_compute[192063]: 2025-10-02 12:07:20.458 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 661e118d-4849-4ccd-a03f-a0edb70948ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:21 np0005466012 kernel: tap68972cf3-17 (unregistering): left promiscuous mode
Oct  2 08:07:21 np0005466012 NetworkManager[51207]: <info>  [1759406841.3003] device (tap68972cf3-17): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:07:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:21Z|00116|binding|INFO|Releasing lport 68972cf3-172b-44a5-b096-f87fe9193518 from this chassis (sb_readonly=0)
Oct  2 08:07:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:21Z|00117|binding|INFO|Setting lport 68972cf3-172b-44a5-b096-f87fe9193518 down in Southbound
Oct  2 08:07:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:21Z|00118|binding|INFO|Removing iface tap68972cf3-17 ovn-installed in OVS
Oct  2 08:07:21 np0005466012 nova_compute[192063]: 2025-10-02 12:07:21.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.331 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:05:83 10.100.0.7'], port_security=['fa:16:3e:da:05:83 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db3f04a20fd740c1af3139196dc928d2', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c69e6497-c2d4-4cc0-a1d9-2c5055cc5d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dc739b2-072d-4dd4-b9d2-9724145d12f5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=68972cf3-172b-44a5-b096-f87fe9193518) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.332 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 68972cf3-172b-44a5-b096-f87fe9193518 in datapath 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e unbound from our chassis#033[00m
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.334 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e#033[00m
Oct  2 08:07:21 np0005466012 nova_compute[192063]: 2025-10-02 12:07:21.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.351 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fd589588-57d7-48c0-81f5-272f11edf0b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:21 np0005466012 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000021.scope: Deactivated successfully.
Oct  2 08:07:21 np0005466012 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000021.scope: Consumed 13.496s CPU time.
Oct  2 08:07:21 np0005466012 systemd-machined[152114]: Machine qemu-19-instance-00000021 terminated.
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.378 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[f6f63105-8dd9-4d0a-9a45-c561d831402c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.380 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[b6ccf3ec-a92f-4067-a8d6-4629d8dc363e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.406 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[673c67b2-8155-46af-931d-82f42d113856]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.426 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[286ff630-c0d5-4c37-983f-e06a737a2ad9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66b5a7c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:7b:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 1000, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 1000, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477047, 'reachable_time': 40640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224739, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.443 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[21b09a5c-9aad-484c-a063-baef3746d9de]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477059, 'tstamp': 477059}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224740, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477063, 'tstamp': 477063}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224740, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.444 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66b5a7c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:21 np0005466012 nova_compute[192063]: 2025-10-02 12:07:21.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:21 np0005466012 nova_compute[192063]: 2025-10-02 12:07:21.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.452 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66b5a7c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.452 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.452 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66b5a7c3-f0, col_values=(('external_ids', {'iface-id': 'a0163170-212d-4aba-9028-3d5fb4d45c5b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:21.453 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:21 np0005466012 nova_compute[192063]: 2025-10-02 12:07:21.758 2 DEBUG nova.compute.manager [req-0c8aa724-ed50-4129-b34f-5bf8dad84396 req-09e1cad0-e7ff-4c64-82de-cac81cba9636 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-unplugged-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:21 np0005466012 nova_compute[192063]: 2025-10-02 12:07:21.759 2 DEBUG oslo_concurrency.lockutils [req-0c8aa724-ed50-4129-b34f-5bf8dad84396 req-09e1cad0-e7ff-4c64-82de-cac81cba9636 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:21 np0005466012 nova_compute[192063]: 2025-10-02 12:07:21.759 2 DEBUG oslo_concurrency.lockutils [req-0c8aa724-ed50-4129-b34f-5bf8dad84396 req-09e1cad0-e7ff-4c64-82de-cac81cba9636 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:21 np0005466012 nova_compute[192063]: 2025-10-02 12:07:21.759 2 DEBUG oslo_concurrency.lockutils [req-0c8aa724-ed50-4129-b34f-5bf8dad84396 req-09e1cad0-e7ff-4c64-82de-cac81cba9636 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:21 np0005466012 nova_compute[192063]: 2025-10-02 12:07:21.759 2 DEBUG nova.compute.manager [req-0c8aa724-ed50-4129-b34f-5bf8dad84396 req-09e1cad0-e7ff-4c64-82de-cac81cba9636 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] No waiting events found dispatching network-vif-unplugged-68972cf3-172b-44a5-b096-f87fe9193518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:21 np0005466012 nova_compute[192063]: 2025-10-02 12:07:21.760 2 WARNING nova.compute.manager [req-0c8aa724-ed50-4129-b34f-5bf8dad84396 req-09e1cad0-e7ff-4c64-82de-cac81cba9636 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received unexpected event network-vif-unplugged-68972cf3-172b-44a5-b096-f87fe9193518 for instance with vm_state active and task_state rebuilding.#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.144 2 INFO nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.149 2 INFO nova.virt.libvirt.driver [-] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance destroyed successfully.#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.154 2 INFO nova.virt.libvirt.driver [-] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance destroyed successfully.#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.154 2 DEBUG nova.virt.libvirt.vif [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:06:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1814602736',display_name='tempest-ServersAdminTestJSON-server-1814602736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1814602736',id=33,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:07:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-q2z1ng2k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJSON-1782354187-project-membe
r'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:07:08Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.155 2 DEBUG nova.network.os_vif_util [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.155 2 DEBUG nova.network.os_vif_util [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.156 2 DEBUG os_vif [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.158 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68972cf3-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.164 2 INFO os_vif [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17')#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.164 2 INFO nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Deleting instance files /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80_del#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.165 2 INFO nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Deletion of /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80_del complete#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.455 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.457 2 INFO nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Creating image(s)#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.458 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.459 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.460 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.485 2 DEBUG oslo_concurrency.processutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.543 2 DEBUG oslo_concurrency.processutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.544 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.545 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.557 2 DEBUG oslo_concurrency.processutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.617 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Updating instance_info_cache with network_info: [{"id": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "address": "fa:16:3e:e1:11:35", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfbbaed4-8d", "ovs_interfaceid": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.620 2 DEBUG oslo_concurrency.processutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.621 2 DEBUG oslo_concurrency.processutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.643 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-661e118d-4849-4ccd-a03f-a0edb70948ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.644 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.661 2 DEBUG oslo_concurrency.processutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.662 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.663 2 DEBUG oslo_concurrency.processutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.717 2 DEBUG oslo_concurrency.processutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.718 2 DEBUG nova.virt.disk.api [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Checking if we can resize image /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.718 2 DEBUG oslo_concurrency.processutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.774 2 DEBUG oslo_concurrency.processutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.775 2 DEBUG nova.virt.disk.api [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Cannot resize image /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.776 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.776 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Ensure instance console log exists: /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.776 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.777 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.777 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.779 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Start _get_guest_xml network_info=[{"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.784 2 WARNING nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.790 2 DEBUG nova.virt.libvirt.host [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.790 2 DEBUG nova.virt.libvirt.host [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.793 2 DEBUG nova.virt.libvirt.host [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.794 2 DEBUG nova.virt.libvirt.host [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.795 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.795 2 DEBUG nova.virt.hardware [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.795 2 DEBUG nova.virt.hardware [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.796 2 DEBUG nova.virt.hardware [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.796 2 DEBUG nova.virt.hardware [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.796 2 DEBUG nova.virt.hardware [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.796 2 DEBUG nova.virt.hardware [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.797 2 DEBUG nova.virt.hardware [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.797 2 DEBUG nova.virt.hardware [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.797 2 DEBUG nova.virt.hardware [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.798 2 DEBUG nova.virt.hardware [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.798 2 DEBUG nova.virt.hardware [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.798 2 DEBUG nova.objects.instance [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.816 2 DEBUG nova.virt.libvirt.vif [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:06:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1814602736',display_name='tempest-ServersAdminTestJSON-server-1814602736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1814602736',id=33,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:07:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-q2z1ng2k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJSON-1782354187-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:07:22Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.816 2 DEBUG nova.network.os_vif_util [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.817 2 DEBUG nova.network.os_vif_util [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.819 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  <uuid>4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80</uuid>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  <name>instance-00000021</name>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServersAdminTestJSON-server-1814602736</nova:name>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:07:22</nova:creationTime>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:07:22 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:        <nova:user uuid="9258efa4511c4bb3813eca27b75b1008">tempest-ServersAdminTestJSON-1782354187-project-member</nova:user>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:        <nova:project uuid="db3f04a20fd740c1af3139196dc928d2">tempest-ServersAdminTestJSON-1782354187</nova:project>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:        <nova:port uuid="68972cf3-172b-44a5-b096-f87fe9193518">
Oct  2 08:07:22 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <entry name="serial">4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80</entry>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <entry name="uuid">4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80</entry>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.config"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:da:05:83"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <target dev="tap68972cf3-17"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/console.log" append="off"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:07:22 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:07:22 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:07:22 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:07:22 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.821 2 DEBUG nova.compute.manager [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Preparing to wait for external event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.821 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.822 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.822 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.823 2 DEBUG nova.virt.libvirt.vif [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:06:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1814602736',display_name='tempest-ServersAdminTestJSON-server-1814602736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1814602736',id=33,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:07:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-q2z1ng2k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJSON-1782354187-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:07:22Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.823 2 DEBUG nova.network.os_vif_util [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.824 2 DEBUG nova.network.os_vif_util [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.824 2 DEBUG os_vif [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.825 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.826 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap68972cf3-17, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap68972cf3-17, col_values=(('external_ids', {'iface-id': '68972cf3-172b-44a5-b096-f87fe9193518', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:05:83', 'vm-uuid': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:22 np0005466012 NetworkManager[51207]: <info>  [1759406842.8319] manager: (tap68972cf3-17): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.837 2 INFO os_vif [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17')#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.900 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.901 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.901 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] No VIF found with MAC fa:16:3e:da:05:83, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.901 2 INFO nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Using config drive#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.916 2 DEBUG nova.objects.instance [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:22 np0005466012 nova_compute[192063]: 2025-10-02 12:07:22.947 2 DEBUG nova.objects.instance [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'keypairs' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:23 np0005466012 nova_compute[192063]: 2025-10-02 12:07:23.631 2 INFO nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Creating config drive at /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.config#033[00m
Oct  2 08:07:23 np0005466012 nova_compute[192063]: 2025-10-02 12:07:23.642 2 DEBUG oslo_concurrency.processutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp84hro1x9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:07:23 np0005466012 nova_compute[192063]: 2025-10-02 12:07:23.783 2 DEBUG oslo_concurrency.processutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp84hro1x9" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:07:23 np0005466012 kernel: tap68972cf3-17: entered promiscuous mode
Oct  2 08:07:23 np0005466012 systemd-udevd[224731]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:07:23 np0005466012 NetworkManager[51207]: <info>  [1759406843.8511] manager: (tap68972cf3-17): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Oct  2 08:07:23 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:23Z|00119|binding|INFO|Claiming lport 68972cf3-172b-44a5-b096-f87fe9193518 for this chassis.
Oct  2 08:07:23 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:23Z|00120|binding|INFO|68972cf3-172b-44a5-b096-f87fe9193518: Claiming fa:16:3e:da:05:83 10.100.0.7
Oct  2 08:07:23 np0005466012 nova_compute[192063]: 2025-10-02 12:07:23.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:23 np0005466012 NetworkManager[51207]: <info>  [1759406843.8616] device (tap68972cf3-17): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:07:23 np0005466012 NetworkManager[51207]: <info>  [1759406843.8623] device (tap68972cf3-17): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.868 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:05:83 10.100.0.7'], port_security=['fa:16:3e:da:05:83 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db3f04a20fd740c1af3139196dc928d2', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'c69e6497-c2d4-4cc0-a1d9-2c5055cc5d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dc739b2-072d-4dd4-b9d2-9724145d12f5, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=68972cf3-172b-44a5-b096-f87fe9193518) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.870 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 68972cf3-172b-44a5-b096-f87fe9193518 in datapath 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e bound to our chassis#033[00m
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.871 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e#033[00m
Oct  2 08:07:23 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:23Z|00121|binding|INFO|Setting lport 68972cf3-172b-44a5-b096-f87fe9193518 ovn-installed in OVS
Oct  2 08:07:23 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:23Z|00122|binding|INFO|Setting lport 68972cf3-172b-44a5-b096-f87fe9193518 up in Southbound
Oct  2 08:07:23 np0005466012 nova_compute[192063]: 2025-10-02 12:07:23.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.888 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[46680721-da8a-4c28-add7-8c54d4f0d8a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:23 np0005466012 nova_compute[192063]: 2025-10-02 12:07:23.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:23 np0005466012 systemd-machined[152114]: New machine qemu-20-instance-00000021.
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.917 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[8a0c0e1c-ba65-4011-af54-0f9f5368e76d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.921 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ef36ef06-01fb-4bea-a1e2-052bf877fb76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:23 np0005466012 systemd[1]: Started Virtual Machine qemu-20-instance-00000021.
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.951 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[1544ff88-f0db-4cc1-968a-4ad137ab9440]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.971 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bccb6c5a-6864-4bcd-b473-bc7217ffc726]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66b5a7c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:7b:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 1000, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 1000, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477047, 'reachable_time': 40640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224803, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:23 np0005466012 nova_compute[192063]: 2025-10-02 12:07:23.989 2 DEBUG nova.compute.manager [req-c8562d8f-4cd2-4c4d-a206-01a467872c4e req-34f75de4-abb3-4a8e-a5f2-b8ae2d8b2d4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:23 np0005466012 nova_compute[192063]: 2025-10-02 12:07:23.989 2 DEBUG oslo_concurrency.lockutils [req-c8562d8f-4cd2-4c4d-a206-01a467872c4e req-34f75de4-abb3-4a8e-a5f2-b8ae2d8b2d4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:23 np0005466012 nova_compute[192063]: 2025-10-02 12:07:23.989 2 DEBUG oslo_concurrency.lockutils [req-c8562d8f-4cd2-4c4d-a206-01a467872c4e req-34f75de4-abb3-4a8e-a5f2-b8ae2d8b2d4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:23 np0005466012 nova_compute[192063]: 2025-10-02 12:07:23.990 2 DEBUG oslo_concurrency.lockutils [req-c8562d8f-4cd2-4c4d-a206-01a467872c4e req-34f75de4-abb3-4a8e-a5f2-b8ae2d8b2d4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:23 np0005466012 nova_compute[192063]: 2025-10-02 12:07:23.990 2 DEBUG nova.compute.manager [req-c8562d8f-4cd2-4c4d-a206-01a467872c4e req-34f75de4-abb3-4a8e-a5f2-b8ae2d8b2d4f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Processing event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.993 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[736ebbe4-d0ac-492d-951f-2beb2e648ff4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477059, 'tstamp': 477059}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224807, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477063, 'tstamp': 477063}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224807, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.994 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66b5a7c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:23 np0005466012 nova_compute[192063]: 2025-10-02 12:07:23.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:23 np0005466012 nova_compute[192063]: 2025-10-02 12:07:23.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.998 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66b5a7c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.998 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.999 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66b5a7c3-f0, col_values=(('external_ids', {'iface-id': 'a0163170-212d-4aba-9028-3d5fb4d45c5b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:23.999 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.804 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.805 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406844.8041952, 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.805 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] VM Started (Lifecycle Event)#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.807 2 DEBUG nova.compute.manager [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.811 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.813 2 INFO nova.virt.libvirt.driver [-] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance spawned successfully.#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.813 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.835 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.843 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.847 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.847 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.848 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.848 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.848 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.849 2 DEBUG nova.virt.libvirt.driver [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.881 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.881 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406844.805531, 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.881 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.911 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.913 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406844.8117816, 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.913 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.935 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.938 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.963 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:07:24 np0005466012 nova_compute[192063]: 2025-10-02 12:07:24.968 2 DEBUG nova.compute.manager [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:07:25 np0005466012 nova_compute[192063]: 2025-10-02 12:07:25.059 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:25 np0005466012 nova_compute[192063]: 2025-10-02 12:07:25.060 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:25 np0005466012 nova_compute[192063]: 2025-10-02 12:07:25.060 2 DEBUG nova.objects.instance [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:07:25 np0005466012 nova_compute[192063]: 2025-10-02 12:07:25.143 2 DEBUG oslo_concurrency.lockutils [None req-75b767fc-bee1-42da-b9e5-bff36bf2e59a 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:26 np0005466012 nova_compute[192063]: 2025-10-02 12:07:26.084 2 DEBUG nova.compute.manager [req-65056487-131b-4b2a-8321-2675931f4dfd req-5b1d1c80-65c1-4807-8688-626d95bfaf0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:26 np0005466012 nova_compute[192063]: 2025-10-02 12:07:26.084 2 DEBUG oslo_concurrency.lockutils [req-65056487-131b-4b2a-8321-2675931f4dfd req-5b1d1c80-65c1-4807-8688-626d95bfaf0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:26 np0005466012 nova_compute[192063]: 2025-10-02 12:07:26.084 2 DEBUG oslo_concurrency.lockutils [req-65056487-131b-4b2a-8321-2675931f4dfd req-5b1d1c80-65c1-4807-8688-626d95bfaf0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:26 np0005466012 nova_compute[192063]: 2025-10-02 12:07:26.084 2 DEBUG oslo_concurrency.lockutils [req-65056487-131b-4b2a-8321-2675931f4dfd req-5b1d1c80-65c1-4807-8688-626d95bfaf0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:26 np0005466012 nova_compute[192063]: 2025-10-02 12:07:26.085 2 DEBUG nova.compute.manager [req-65056487-131b-4b2a-8321-2675931f4dfd req-5b1d1c80-65c1-4807-8688-626d95bfaf0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] No waiting events found dispatching network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:26 np0005466012 nova_compute[192063]: 2025-10-02 12:07:26.085 2 WARNING nova.compute.manager [req-65056487-131b-4b2a-8321-2675931f4dfd req-5b1d1c80-65c1-4807-8688-626d95bfaf0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received unexpected event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:07:26 np0005466012 nova_compute[192063]: 2025-10-02 12:07:26.085 2 DEBUG nova.compute.manager [req-65056487-131b-4b2a-8321-2675931f4dfd req-5b1d1c80-65c1-4807-8688-626d95bfaf0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:26 np0005466012 nova_compute[192063]: 2025-10-02 12:07:26.085 2 DEBUG oslo_concurrency.lockutils [req-65056487-131b-4b2a-8321-2675931f4dfd req-5b1d1c80-65c1-4807-8688-626d95bfaf0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:26 np0005466012 nova_compute[192063]: 2025-10-02 12:07:26.086 2 DEBUG oslo_concurrency.lockutils [req-65056487-131b-4b2a-8321-2675931f4dfd req-5b1d1c80-65c1-4807-8688-626d95bfaf0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:26 np0005466012 nova_compute[192063]: 2025-10-02 12:07:26.086 2 DEBUG oslo_concurrency.lockutils [req-65056487-131b-4b2a-8321-2675931f4dfd req-5b1d1c80-65c1-4807-8688-626d95bfaf0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:26 np0005466012 nova_compute[192063]: 2025-10-02 12:07:26.086 2 DEBUG nova.compute.manager [req-65056487-131b-4b2a-8321-2675931f4dfd req-5b1d1c80-65c1-4807-8688-626d95bfaf0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] No waiting events found dispatching network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:26 np0005466012 nova_compute[192063]: 2025-10-02 12:07:26.086 2 WARNING nova.compute.manager [req-65056487-131b-4b2a-8321-2675931f4dfd req-5b1d1c80-65c1-4807-8688-626d95bfaf0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received unexpected event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:07:27 np0005466012 nova_compute[192063]: 2025-10-02 12:07:27.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:28 np0005466012 podman[224817]: 2025-10-02 12:07:28.162109672 +0000 UTC m=+0.067855271 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:07:28 np0005466012 podman[224818]: 2025-10-02 12:07:28.296763185 +0000 UTC m=+0.202162135 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:07:29 np0005466012 nova_compute[192063]: 2025-10-02 12:07:29.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 podman[224867]: 2025-10-02 12:07:31.135414012 +0000 UTC m=+0.047709888 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.309 2 DEBUG oslo_concurrency.lockutils [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "661e118d-4849-4ccd-a03f-a0edb70948ea" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.310 2 DEBUG oslo_concurrency.lockutils [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.310 2 DEBUG oslo_concurrency.lockutils [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.310 2 DEBUG oslo_concurrency.lockutils [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.311 2 DEBUG oslo_concurrency.lockutils [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.326 2 INFO nova.compute.manager [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Terminating instance#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.339 2 DEBUG nova.compute.manager [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:07:31 np0005466012 kernel: tapbfbbaed4-8d (unregistering): left promiscuous mode
Oct  2 08:07:31 np0005466012 NetworkManager[51207]: <info>  [1759406851.3721] device (tapbfbbaed4-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:07:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:31Z|00123|binding|INFO|Releasing lport bfbbaed4-8dd4-4ee3-bc31-049983ebccab from this chassis (sb_readonly=0)
Oct  2 08:07:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:31Z|00124|binding|INFO|Setting lport bfbbaed4-8dd4-4ee3-bc31-049983ebccab down in Southbound
Oct  2 08:07:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:31Z|00125|binding|INFO|Removing iface tapbfbbaed4-8d ovn-installed in OVS
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.389 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:11:35 10.100.0.4'], port_security=['fa:16:3e:e1:11:35 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db3f04a20fd740c1af3139196dc928d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c69e6497-c2d4-4cc0-a1d9-2c5055cc5d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dc739b2-072d-4dd4-b9d2-9724145d12f5, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=bfbbaed4-8dd4-4ee3-bc31-049983ebccab) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.390 103246 INFO neutron.agent.ovn.metadata.agent [-] Port bfbbaed4-8dd4-4ee3-bc31-049983ebccab in datapath 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e unbound from our chassis#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.391 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.409 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aac75210-245c-40a9-9d14-35d5dfc7664d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000024.scope: Deactivated successfully.
Oct  2 08:07:31 np0005466012 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000024.scope: Consumed 14.710s CPU time.
Oct  2 08:07:31 np0005466012 systemd-machined[152114]: Machine qemu-17-instance-00000024 terminated.
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.441 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[1a9171f5-cb57-4668-ad60-bc03bdfa1855]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.444 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[df263602-66cc-49ea-b915-f039c4d64a48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.474 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[51b056e6-9d5c-4bcb-9d3b-5bb719812213]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.491 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9600c04a-105c-4cb9-b7d3-937960fd0e94]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66b5a7c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:7b:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 1000, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 1000, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477047, 'reachable_time': 40640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224898, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.503 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f9309f62-f1a9-42f0-80d2-bb6710ede95e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477059, 'tstamp': 477059}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224899, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477063, 'tstamp': 477063}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224899, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.505 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66b5a7c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.512 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66b5a7c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.513 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.513 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66b5a7c3-f0, col_values=(('external_ids', {'iface-id': 'a0163170-212d-4aba-9028-3d5fb4d45c5b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.513 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:31 np0005466012 kernel: tapbfbbaed4-8d: entered promiscuous mode
Oct  2 08:07:31 np0005466012 systemd-udevd[224892]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:07:31 np0005466012 NetworkManager[51207]: <info>  [1759406851.5629] manager: (tapbfbbaed4-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Oct  2 08:07:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:31Z|00126|binding|INFO|Claiming lport bfbbaed4-8dd4-4ee3-bc31-049983ebccab for this chassis.
Oct  2 08:07:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:31Z|00127|binding|INFO|bfbbaed4-8dd4-4ee3-bc31-049983ebccab: Claiming fa:16:3e:e1:11:35 10.100.0.4
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 kernel: tapbfbbaed4-8d (unregistering): left promiscuous mode
Oct  2 08:07:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:31Z|00128|binding|INFO|Setting lport bfbbaed4-8dd4-4ee3-bc31-049983ebccab ovn-installed in OVS
Oct  2 08:07:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:31Z|00129|binding|INFO|Setting lport bfbbaed4-8dd4-4ee3-bc31-049983ebccab up in Southbound
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.578 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:11:35 10.100.0.4'], port_security=['fa:16:3e:e1:11:35 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db3f04a20fd740c1af3139196dc928d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c69e6497-c2d4-4cc0-a1d9-2c5055cc5d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dc739b2-072d-4dd4-b9d2-9724145d12f5, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=bfbbaed4-8dd4-4ee3-bc31-049983ebccab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.580 103246 INFO neutron.agent.ovn.metadata.agent [-] Port bfbbaed4-8dd4-4ee3-bc31-049983ebccab in datapath 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e bound to our chassis#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.581 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e#033[00m
Oct  2 08:07:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:31Z|00130|binding|INFO|Releasing lport bfbbaed4-8dd4-4ee3-bc31-049983ebccab from this chassis (sb_readonly=1)
Oct  2 08:07:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:31Z|00131|binding|INFO|Removing iface tapbfbbaed4-8d ovn-installed in OVS
Oct  2 08:07:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:31Z|00132|if_status|INFO|Dropped 2 log messages in last 182 seconds (most recently, 182 seconds ago) due to excessive rate
Oct  2 08:07:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:31Z|00133|if_status|INFO|Not setting lport bfbbaed4-8dd4-4ee3-bc31-049983ebccab down as sb is readonly
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:31Z|00134|binding|INFO|Releasing lport bfbbaed4-8dd4-4ee3-bc31-049983ebccab from this chassis (sb_readonly=0)
Oct  2 08:07:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:31Z|00135|binding|INFO|Setting lport bfbbaed4-8dd4-4ee3-bc31-049983ebccab down in Southbound
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.596 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:11:35 10.100.0.4'], port_security=['fa:16:3e:e1:11:35 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '661e118d-4849-4ccd-a03f-a0edb70948ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db3f04a20fd740c1af3139196dc928d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c69e6497-c2d4-4cc0-a1d9-2c5055cc5d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dc739b2-072d-4dd4-b9d2-9724145d12f5, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=bfbbaed4-8dd4-4ee3-bc31-049983ebccab) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.595 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d60c25f5-1bde-4721-aeba-827c5aed2ed7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.634 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[e426ea91-5afd-4f2b-8ddb-c1cd9ba74e8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.636 2 INFO nova.virt.libvirt.driver [-] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Instance destroyed successfully.#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.636 2 DEBUG nova.objects.instance [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'resources' on Instance uuid 661e118d-4849-4ccd-a03f-a0edb70948ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.637 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[2c11dcf3-7c28-448f-9ad3-46ea3f85969c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.660 2 DEBUG nova.virt.libvirt.vif [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:06:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-64695182',display_name='tempest-ServersAdminTestJSON-server-64695182',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-64695182',id=36,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:06:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-kv2dupf6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',i
mage_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJSON-1782354187-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:06:36Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=661e118d-4849-4ccd-a03f-a0edb70948ea,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "address": "fa:16:3e:e1:11:35", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfbbaed4-8d", "ovs_interfaceid": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.661 2 DEBUG nova.network.os_vif_util [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "address": "fa:16:3e:e1:11:35", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfbbaed4-8d", "ovs_interfaceid": "bfbbaed4-8dd4-4ee3-bc31-049983ebccab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.661 2 DEBUG nova.network.os_vif_util [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e1:11:35,bridge_name='br-int',has_traffic_filtering=True,id=bfbbaed4-8dd4-4ee3-bc31-049983ebccab,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfbbaed4-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.662 2 DEBUG os_vif [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e1:11:35,bridge_name='br-int',has_traffic_filtering=True,id=bfbbaed4-8dd4-4ee3-bc31-049983ebccab,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfbbaed4-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.664 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbfbbaed4-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.669 2 INFO os_vif [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e1:11:35,bridge_name='br-int',has_traffic_filtering=True,id=bfbbaed4-8dd4-4ee3-bc31-049983ebccab,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfbbaed4-8d')#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.670 2 INFO nova.virt.libvirt.driver [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Deleting instance files /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea_del#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.670 2 INFO nova.virt.libvirt.driver [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Deletion of /var/lib/nova/instances/661e118d-4849-4ccd-a03f-a0edb70948ea_del complete#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.671 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[3ebbfb08-2576-463f-b287-3519d98e29b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.699 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[50343865-7384-4b92-84c4-5d6633ad7a07]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66b5a7c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:7b:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 17, 'rx_bytes': 1000, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 17, 'rx_bytes': 1000, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477047, 'reachable_time': 40640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224916, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.719 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[20aa81ec-6b81-47e2-b31b-1fae88d308e4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477059, 'tstamp': 477059}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224917, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477063, 'tstamp': 477063}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224917, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.722 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66b5a7c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.727 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66b5a7c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.727 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.728 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66b5a7c3-f0, col_values=(('external_ids', {'iface-id': 'a0163170-212d-4aba-9028-3d5fb4d45c5b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.729 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.731 103246 INFO neutron.agent.ovn.metadata.agent [-] Port bfbbaed4-8dd4-4ee3-bc31-049983ebccab in datapath 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e unbound from our chassis#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.733 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.748 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[47a8fd53-e1b7-4bcf-81ef-e1cce553d8d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.768 2 INFO nova.compute.manager [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.769 2 DEBUG oslo.service.loopingcall [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.769 2 DEBUG nova.compute.manager [-] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.769 2 DEBUG nova.network.neutron [-] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.774 2 DEBUG oslo_concurrency.lockutils [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Acquiring lock "365f39ba-80db-4de0-ad55-45b007ea1c04" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.775 2 DEBUG oslo_concurrency.lockutils [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "365f39ba-80db-4de0-ad55-45b007ea1c04" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.775 2 DEBUG oslo_concurrency.lockutils [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Acquiring lock "365f39ba-80db-4de0-ad55-45b007ea1c04-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.775 2 DEBUG oslo_concurrency.lockutils [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "365f39ba-80db-4de0-ad55-45b007ea1c04-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.775 2 DEBUG oslo_concurrency.lockutils [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "365f39ba-80db-4de0-ad55-45b007ea1c04-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.786 2 INFO nova.compute.manager [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Terminating instance#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.794 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[16519a20-00b2-414e-bee2-5b02bda787a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.795 2 DEBUG oslo_concurrency.lockutils [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Acquiring lock "refresh_cache-365f39ba-80db-4de0-ad55-45b007ea1c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.795 2 DEBUG oslo_concurrency.lockutils [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Acquired lock "refresh_cache-365f39ba-80db-4de0-ad55-45b007ea1c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.796 2 DEBUG nova.network.neutron [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.798 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[0476c2e9-124d-4ee7-8845-28b24d27fee1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.827 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[a1e297eb-42f0-4508-a7d5-40c014147b26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.843 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[782e6442-5a40-4b7f-aaed-d1b5ba9cf8b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66b5a7c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:7b:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 19, 'rx_bytes': 1000, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 19, 'rx_bytes': 1000, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477047, 'reachable_time': 40640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224923, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.856 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[18f7f76b-db0b-405f-9f01-3fbf5376327d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477059, 'tstamp': 477059}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224924, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66b5a7c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477063, 'tstamp': 477063}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224924, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.858 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66b5a7c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.885 2 DEBUG nova.compute.manager [req-e784a991-e7fb-4ea7-a8fe-ffbd0635ed98 req-38df611a-e5a9-4a0b-bfc5-f226068a80ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Received event network-vif-unplugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.885 2 DEBUG oslo_concurrency.lockutils [req-e784a991-e7fb-4ea7-a8fe-ffbd0635ed98 req-38df611a-e5a9-4a0b-bfc5-f226068a80ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.885 2 DEBUG oslo_concurrency.lockutils [req-e784a991-e7fb-4ea7-a8fe-ffbd0635ed98 req-38df611a-e5a9-4a0b-bfc5-f226068a80ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.886 2 DEBUG oslo_concurrency.lockutils [req-e784a991-e7fb-4ea7-a8fe-ffbd0635ed98 req-38df611a-e5a9-4a0b-bfc5-f226068a80ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.886 2 DEBUG nova.compute.manager [req-e784a991-e7fb-4ea7-a8fe-ffbd0635ed98 req-38df611a-e5a9-4a0b-bfc5-f226068a80ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] No waiting events found dispatching network-vif-unplugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.886 2 DEBUG nova.compute.manager [req-e784a991-e7fb-4ea7-a8fe-ffbd0635ed98 req-38df611a-e5a9-4a0b-bfc5-f226068a80ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Received event network-vif-unplugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 nova_compute[192063]: 2025-10-02 12:07:31.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.896 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66b5a7c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.896 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.897 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66b5a7c3-f0, col_values=(('external_ids', {'iface-id': 'a0163170-212d-4aba-9028-3d5fb4d45c5b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:31.897 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.171 2 DEBUG nova.network.neutron [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:07:32 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.495 2 DEBUG nova.network.neutron [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.525 2 DEBUG oslo_concurrency.lockutils [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Releasing lock "refresh_cache-365f39ba-80db-4de0-ad55-45b007ea1c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.526 2 DEBUG nova.compute.manager [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.566 2 DEBUG nova.network.neutron [-] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:07:32 np0005466012 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000028.scope: Deactivated successfully.
Oct  2 08:07:32 np0005466012 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000028.scope: Consumed 13.324s CPU time.
Oct  2 08:07:32 np0005466012 systemd-machined[152114]: Machine qemu-18-instance-00000028 terminated.
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.582 2 INFO nova.compute.manager [-] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Took 0.81 seconds to deallocate network for instance.#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.652 2 DEBUG oslo_concurrency.lockutils [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.653 2 DEBUG oslo_concurrency.lockutils [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.776 2 DEBUG nova.compute.provider_tree [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.782 2 DEBUG nova.compute.manager [req-88481c4e-10db-4bf3-8cb0-0319a0cceb03 req-13265fe2-9b4b-4371-8c48-0ffe8708adce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Received event network-vif-deleted-bfbbaed4-8dd4-4ee3-bc31-049983ebccab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.784 2 INFO nova.virt.libvirt.driver [-] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Instance destroyed successfully.#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.784 2 DEBUG nova.objects.instance [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lazy-loading 'resources' on Instance uuid 365f39ba-80db-4de0-ad55-45b007ea1c04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.800 2 INFO nova.virt.libvirt.driver [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Deleting instance files /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04_del#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.801 2 INFO nova.virt.libvirt.driver [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Deletion of /var/lib/nova/instances/365f39ba-80db-4de0-ad55-45b007ea1c04_del complete#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.807 2 DEBUG nova.scheduler.client.report [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.845 2 DEBUG oslo_concurrency.lockutils [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.884 2 INFO nova.scheduler.client.report [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Deleted allocations for instance 661e118d-4849-4ccd-a03f-a0edb70948ea#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.902 2 INFO nova.compute.manager [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.902 2 DEBUG oslo.service.loopingcall [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.903 2 DEBUG nova.compute.manager [-] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.903 2 DEBUG nova.network.neutron [-] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:07:32 np0005466012 nova_compute[192063]: 2025-10-02 12:07:32.967 2 DEBUG oslo_concurrency.lockutils [None req-be808bfc-9b0b-430f-a7d1-c72c753e1eec 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:33 np0005466012 nova_compute[192063]: 2025-10-02 12:07:33.118 2 DEBUG nova.network.neutron [-] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:07:33 np0005466012 nova_compute[192063]: 2025-10-02 12:07:33.133 2 DEBUG nova.network.neutron [-] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:07:33 np0005466012 nova_compute[192063]: 2025-10-02 12:07:33.145 2 INFO nova.compute.manager [-] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Took 0.24 seconds to deallocate network for instance.#033[00m
Oct  2 08:07:33 np0005466012 nova_compute[192063]: 2025-10-02 12:07:33.266 2 DEBUG oslo_concurrency.lockutils [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:33 np0005466012 nova_compute[192063]: 2025-10-02 12:07:33.266 2 DEBUG oslo_concurrency.lockutils [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:33 np0005466012 nova_compute[192063]: 2025-10-02 12:07:33.343 2 DEBUG nova.compute.provider_tree [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:07:33 np0005466012 nova_compute[192063]: 2025-10-02 12:07:33.362 2 DEBUG nova.scheduler.client.report [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:07:33 np0005466012 nova_compute[192063]: 2025-10-02 12:07:33.395 2 DEBUG oslo_concurrency.lockutils [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:33 np0005466012 nova_compute[192063]: 2025-10-02 12:07:33.428 2 INFO nova.scheduler.client.report [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Deleted allocations for instance 365f39ba-80db-4de0-ad55-45b007ea1c04#033[00m
Oct  2 08:07:33 np0005466012 nova_compute[192063]: 2025-10-02 12:07:33.519 2 DEBUG oslo_concurrency.lockutils [None req-bea320bf-d219-4451-b847-d947415ed10c c6a7a530a085472d8ace0b41fc888e26 8993ff2640584165964db6af518beb94 - - default default] Lock "365f39ba-80db-4de0-ad55-45b007ea1c04" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.744s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.159 2 DEBUG nova.compute.manager [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Received event network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.162 2 DEBUG oslo_concurrency.lockutils [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.163 2 DEBUG oslo_concurrency.lockutils [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.164 2 DEBUG oslo_concurrency.lockutils [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.164 2 DEBUG nova.compute.manager [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] No waiting events found dispatching network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.165 2 WARNING nova.compute.manager [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Received unexpected event network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.165 2 DEBUG nova.compute.manager [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Received event network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.166 2 DEBUG oslo_concurrency.lockutils [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.166 2 DEBUG oslo_concurrency.lockutils [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.166 2 DEBUG oslo_concurrency.lockutils [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.167 2 DEBUG nova.compute.manager [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] No waiting events found dispatching network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.167 2 WARNING nova.compute.manager [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Received unexpected event network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.168 2 DEBUG nova.compute.manager [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Received event network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.168 2 DEBUG oslo_concurrency.lockutils [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.169 2 DEBUG oslo_concurrency.lockutils [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.169 2 DEBUG oslo_concurrency.lockutils [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "661e118d-4849-4ccd-a03f-a0edb70948ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.169 2 DEBUG nova.compute.manager [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] No waiting events found dispatching network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.169 2 WARNING nova.compute.manager [req-dffa0b5d-b65c-43b0-8d96-0ac088f9f6cc req-4a64e6dd-f399-4252-b2cb-4397c0322cbd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Received unexpected event network-vif-plugged-bfbbaed4-8dd4-4ee3-bc31-049983ebccab for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:07:34 np0005466012 podman[224934]: 2025-10-02 12:07:34.172504361 +0000 UTC m=+0.085953849 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:07:34 np0005466012 nova_compute[192063]: 2025-10-02 12:07:34.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.285 2 DEBUG oslo_concurrency.lockutils [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.286 2 DEBUG oslo_concurrency.lockutils [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.287 2 DEBUG oslo_concurrency.lockutils [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.287 2 DEBUG oslo_concurrency.lockutils [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.287 2 DEBUG oslo_concurrency.lockutils [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.299 2 INFO nova.compute.manager [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Terminating instance#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.312 2 DEBUG nova.compute.manager [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:07:35 np0005466012 kernel: tap68972cf3-17 (unregistering): left promiscuous mode
Oct  2 08:07:35 np0005466012 NetworkManager[51207]: <info>  [1759406855.3331] device (tap68972cf3-17): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:07:35 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:35Z|00136|binding|INFO|Releasing lport 68972cf3-172b-44a5-b096-f87fe9193518 from this chassis (sb_readonly=0)
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:35 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:35Z|00137|binding|INFO|Setting lport 68972cf3-172b-44a5-b096-f87fe9193518 down in Southbound
Oct  2 08:07:35 np0005466012 ovn_controller[94284]: 2025-10-02T12:07:35Z|00138|binding|INFO|Removing iface tap68972cf3-17 ovn-installed in OVS
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.398 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:05:83 10.100.0.7'], port_security=['fa:16:3e:da:05:83 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db3f04a20fd740c1af3139196dc928d2', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c69e6497-c2d4-4cc0-a1d9-2c5055cc5d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dc739b2-072d-4dd4-b9d2-9724145d12f5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=68972cf3-172b-44a5-b096-f87fe9193518) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.399 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 68972cf3-172b-44a5-b096-f87fe9193518 in datapath 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e unbound from our chassis#033[00m
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.400 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.401 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[90d47895-2fd4-45be-a086-8e89c5487d11]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.401 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e namespace which is not needed anymore#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:35 np0005466012 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000021.scope: Deactivated successfully.
Oct  2 08:07:35 np0005466012 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000021.scope: Consumed 11.482s CPU time.
Oct  2 08:07:35 np0005466012 systemd-machined[152114]: Machine qemu-20-instance-00000021 terminated.
Oct  2 08:07:35 np0005466012 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224072]: [NOTICE]   (224076) : haproxy version is 2.8.14-c23fe91
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:35 np0005466012 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224072]: [NOTICE]   (224076) : path to executable is /usr/sbin/haproxy
Oct  2 08:07:35 np0005466012 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224072]: [WARNING]  (224076) : Exiting Master process...
Oct  2 08:07:35 np0005466012 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224072]: [ALERT]    (224076) : Current worker (224078) exited with code 143 (Terminated)
Oct  2 08:07:35 np0005466012 neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e[224072]: [WARNING]  (224076) : All workers exited. Exiting... (0)
Oct  2 08:07:35 np0005466012 systemd[1]: libpod-a6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0.scope: Deactivated successfully.
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:35 np0005466012 podman[224979]: 2025-10-02 12:07:35.544391165 +0000 UTC m=+0.055967780 container died a6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.570 2 INFO nova.virt.libvirt.driver [-] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Instance destroyed successfully.#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.571 2 DEBUG nova.objects.instance [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lazy-loading 'resources' on Instance uuid 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:07:35 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0-userdata-shm.mount: Deactivated successfully.
Oct  2 08:07:35 np0005466012 systemd[1]: var-lib-containers-storage-overlay-2f47385ad72212b6b2eab994461dcb053e7b499e47fd6c36349c32c4eaca8e7a-merged.mount: Deactivated successfully.
Oct  2 08:07:35 np0005466012 podman[224979]: 2025-10-02 12:07:35.585598556 +0000 UTC m=+0.097175161 container cleanup a6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:07:35 np0005466012 systemd[1]: libpod-conmon-a6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0.scope: Deactivated successfully.
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.609 2 DEBUG nova.virt.libvirt.vif [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:06:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1814602736',display_name='tempest-ServersAdminTestJSON-server-1814602736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1814602736',id=33,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:07:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='db3f04a20fd740c1af3139196dc928d2',ramdisk_id='',reservation_id='r-q2z1ng2k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virt
io',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1782354187',owner_user_name='tempest-ServersAdminTestJSON-1782354187-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:07:28Z,user_data=None,user_id='9258efa4511c4bb3813eca27b75b1008',uuid=4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.610 2 DEBUG nova.network.os_vif_util [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converting VIF {"id": "68972cf3-172b-44a5-b096-f87fe9193518", "address": "fa:16:3e:da:05:83", "network": {"id": "66b5a7c3-fe3e-42b0-aea6-19534bca6e0e", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1726703238-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "db3f04a20fd740c1af3139196dc928d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68972cf3-17", "ovs_interfaceid": "68972cf3-172b-44a5-b096-f87fe9193518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.611 2 DEBUG nova.network.os_vif_util [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.611 2 DEBUG os_vif [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.613 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68972cf3-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.620 2 INFO os_vif [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:05:83,bridge_name='br-int',has_traffic_filtering=True,id=68972cf3-172b-44a5-b096-f87fe9193518,network=Network(66b5a7c3-fe3e-42b0-aea6-19534bca6e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68972cf3-17')#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.621 2 INFO nova.virt.libvirt.driver [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Deleting instance files /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80_del#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.622 2 INFO nova.virt.libvirt.driver [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Deletion of /var/lib/nova/instances/4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80_del complete#033[00m
Oct  2 08:07:35 np0005466012 podman[225023]: 2025-10-02 12:07:35.652019908 +0000 UTC m=+0.042008874 container remove a6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.658 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb8cad6-5cd7-4ad6-860e-3d2285fd4675]: (4, ('Thu Oct  2 12:07:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e (a6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0)\na6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0\nThu Oct  2 12:07:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e (a6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0)\na6749da2a448df58203d50b455e175d2d468956b65ec84f6d27e9e11fe37d2d0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.660 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[99b41e2f-3c66-48d5-9aac-7173a2fb71c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.660 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66b5a7c3-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:07:35 np0005466012 kernel: tap66b5a7c3-f0: left promiscuous mode
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.678 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ce08cebe-9a87-4cb9-80e0-9b55ff0103f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.707 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[31a32c7e-f135-486a-b472-c4a3691fb67a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.708 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d81f86d9-ba81-4779-8f52-6f83c4b38e6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.721 2 INFO nova.compute.manager [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.721 2 DEBUG oslo.service.loopingcall [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.722 2 DEBUG nova.compute.manager [-] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:07:35 np0005466012 nova_compute[192063]: 2025-10-02 12:07:35.722 2 DEBUG nova.network.neutron [-] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.727 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e593729b-9fcb-4cf1-90da-0b8987c8739e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477039, 'reachable_time': 25134, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225038, 'error': None, 'target': 'ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.730 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-66b5a7c3-fe3e-42b0-aea6-19534bca6e0e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:07:35 np0005466012 systemd[1]: run-netns-ovnmeta\x2d66b5a7c3\x2dfe3e\x2d42b0\x2daea6\x2d19534bca6e0e.mount: Deactivated successfully.
Oct  2 08:07:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:07:35.730 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[21105e88-9ed1-4396-b63e-f31559cd274d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.024 2 DEBUG nova.compute.manager [req-70a02f2b-326f-49b3-a07e-81a7cb7b4ecc req-a5edd0c0-41d0-4c59-bae4-e63ebb1d4852 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-unplugged-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.025 2 DEBUG oslo_concurrency.lockutils [req-70a02f2b-326f-49b3-a07e-81a7cb7b4ecc req-a5edd0c0-41d0-4c59-bae4-e63ebb1d4852 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.026 2 DEBUG oslo_concurrency.lockutils [req-70a02f2b-326f-49b3-a07e-81a7cb7b4ecc req-a5edd0c0-41d0-4c59-bae4-e63ebb1d4852 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.026 2 DEBUG oslo_concurrency.lockutils [req-70a02f2b-326f-49b3-a07e-81a7cb7b4ecc req-a5edd0c0-41d0-4c59-bae4-e63ebb1d4852 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.026 2 DEBUG nova.compute.manager [req-70a02f2b-326f-49b3-a07e-81a7cb7b4ecc req-a5edd0c0-41d0-4c59-bae4-e63ebb1d4852 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] No waiting events found dispatching network-vif-unplugged-68972cf3-172b-44a5-b096-f87fe9193518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.027 2 DEBUG nova.compute.manager [req-70a02f2b-326f-49b3-a07e-81a7cb7b4ecc req-a5edd0c0-41d0-4c59-bae4-e63ebb1d4852 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-unplugged-68972cf3-172b-44a5-b096-f87fe9193518 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.325 2 DEBUG nova.network.neutron [-] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.339 2 INFO nova.compute.manager [-] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Took 0.62 seconds to deallocate network for instance.
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.444 2 DEBUG oslo_concurrency.lockutils [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.445 2 DEBUG oslo_concurrency.lockutils [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.507 2 DEBUG nova.compute.provider_tree [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.523 2 DEBUG nova.scheduler.client.report [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.558 2 DEBUG oslo_concurrency.lockutils [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.583 2 INFO nova.scheduler.client.report [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Deleted allocations for instance 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80
Oct  2 08:07:36 np0005466012 nova_compute[192063]: 2025-10-02 12:07:36.648 2 DEBUG oslo_concurrency.lockutils [None req-6d6b0bda-6561-4ead-914a-fbb5820912cd 9258efa4511c4bb3813eca27b75b1008 db3f04a20fd740c1af3139196dc928d2 - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:07:38 np0005466012 nova_compute[192063]: 2025-10-02 12:07:38.121 2 DEBUG nova.compute.manager [req-c36a67f7-e794-4788-8de9-83d3e66509f7 req-eebe7dbe-3873-4edd-97fa-da68d8293ac1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:07:38 np0005466012 nova_compute[192063]: 2025-10-02 12:07:38.121 2 DEBUG oslo_concurrency.lockutils [req-c36a67f7-e794-4788-8de9-83d3e66509f7 req-eebe7dbe-3873-4edd-97fa-da68d8293ac1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:07:38 np0005466012 nova_compute[192063]: 2025-10-02 12:07:38.122 2 DEBUG oslo_concurrency.lockutils [req-c36a67f7-e794-4788-8de9-83d3e66509f7 req-eebe7dbe-3873-4edd-97fa-da68d8293ac1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:07:38 np0005466012 nova_compute[192063]: 2025-10-02 12:07:38.122 2 DEBUG oslo_concurrency.lockutils [req-c36a67f7-e794-4788-8de9-83d3e66509f7 req-eebe7dbe-3873-4edd-97fa-da68d8293ac1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:07:38 np0005466012 nova_compute[192063]: 2025-10-02 12:07:38.122 2 DEBUG nova.compute.manager [req-c36a67f7-e794-4788-8de9-83d3e66509f7 req-eebe7dbe-3873-4edd-97fa-da68d8293ac1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] No waiting events found dispatching network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:07:38 np0005466012 nova_compute[192063]: 2025-10-02 12:07:38.122 2 WARNING nova.compute.manager [req-c36a67f7-e794-4788-8de9-83d3e66509f7 req-eebe7dbe-3873-4edd-97fa-da68d8293ac1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received unexpected event network-vif-plugged-68972cf3-172b-44a5-b096-f87fe9193518 for instance with vm_state deleted and task_state None.
Oct  2 08:07:38 np0005466012 nova_compute[192063]: 2025-10-02 12:07:38.122 2 DEBUG nova.compute.manager [req-c36a67f7-e794-4788-8de9-83d3e66509f7 req-eebe7dbe-3873-4edd-97fa-da68d8293ac1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Received event network-vif-deleted-68972cf3-172b-44a5-b096-f87fe9193518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:07:39 np0005466012 nova_compute[192063]: 2025-10-02 12:07:39.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:39 np0005466012 nova_compute[192063]: 2025-10-02 12:07:39.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:40 np0005466012 nova_compute[192063]: 2025-10-02 12:07:40.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:41 np0005466012 podman[225040]: 2025-10-02 12:07:41.149776629 +0000 UTC m=+0.062755024 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc.)
Oct  2 08:07:41 np0005466012 podman[225039]: 2025-10-02 12:07:41.155472752 +0000 UTC m=+0.069343521 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:07:44 np0005466012 nova_compute[192063]: 2025-10-02 12:07:44.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:45 np0005466012 podman[225076]: 2025-10-02 12:07:45.161778113 +0000 UTC m=+0.079050893 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:07:45 np0005466012 podman[225077]: 2025-10-02 12:07:45.161934277 +0000 UTC m=+0.076564655 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:07:45 np0005466012 nova_compute[192063]: 2025-10-02 12:07:45.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:46 np0005466012 nova_compute[192063]: 2025-10-02 12:07:46.636 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406851.634263, 661e118d-4849-4ccd-a03f-a0edb70948ea => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:07:46 np0005466012 nova_compute[192063]: 2025-10-02 12:07:46.636 2 INFO nova.compute.manager [-] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] VM Stopped (Lifecycle Event)
Oct  2 08:07:46 np0005466012 nova_compute[192063]: 2025-10-02 12:07:46.657 2 DEBUG nova.compute.manager [None req-833654bd-d853-4761-be3f-391e7a76813d - - - - - -] [instance: 661e118d-4849-4ccd-a03f-a0edb70948ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:07:47 np0005466012 nova_compute[192063]: 2025-10-02 12:07:47.782 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406852.78043, 365f39ba-80db-4de0-ad55-45b007ea1c04 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:07:47 np0005466012 nova_compute[192063]: 2025-10-02 12:07:47.783 2 INFO nova.compute.manager [-] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] VM Stopped (Lifecycle Event)
Oct  2 08:07:47 np0005466012 nova_compute[192063]: 2025-10-02 12:07:47.803 2 DEBUG nova.compute.manager [None req-01bbf2d5-7dbd-43f8-b6f6-33e1343c23b0 - - - - - -] [instance: 365f39ba-80db-4de0-ad55-45b007ea1c04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:07:49 np0005466012 nova_compute[192063]: 2025-10-02 12:07:49.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:50 np0005466012 nova_compute[192063]: 2025-10-02 12:07:50.568 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406855.5679805, 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:07:50 np0005466012 nova_compute[192063]: 2025-10-02 12:07:50.569 2 INFO nova.compute.manager [-] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] VM Stopped (Lifecycle Event)
Oct  2 08:07:50 np0005466012 nova_compute[192063]: 2025-10-02 12:07:50.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:50 np0005466012 nova_compute[192063]: 2025-10-02 12:07:50.675 2 DEBUG nova.compute.manager [None req-f7e40302-5f25-46b7-bc3d-1a2dea06f23f - - - - - -] [instance: 4e9e4bd0-b4e8-4cd1-94e4-d44c04c3ad80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:07:54 np0005466012 nova_compute[192063]: 2025-10-02 12:07:54.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:55 np0005466012 nova_compute[192063]: 2025-10-02 12:07:55.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:07:59 np0005466012 podman[225120]: 2025-10-02 12:07:59.199703244 +0000 UTC m=+0.105027784 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:07:59 np0005466012 podman[225121]: 2025-10-02 12:07:59.247646057 +0000 UTC m=+0.147576861 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller)
Oct  2 08:07:59 np0005466012 nova_compute[192063]: 2025-10-02 12:07:59.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:08:00 np0005466012 nova_compute[192063]: 2025-10-02 12:08:00.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:08:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:02.118 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:08:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:02.119 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:08:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:02.119 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:08:02 np0005466012 podman[225169]: 2025-10-02 12:08:02.132448868 +0000 UTC m=+0.050557454 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:08:04 np0005466012 nova_compute[192063]: 2025-10-02 12:08:04.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:08:05 np0005466012 podman[225189]: 2025-10-02 12:08:05.141603964 +0000 UTC m=+0.062152768 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:08:05 np0005466012 nova_compute[192063]: 2025-10-02 12:08:05.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:08:09 np0005466012 nova_compute[192063]: 2025-10-02 12:08:09.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:08:10 np0005466012 nova_compute[192063]: 2025-10-02 12:08:10.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:08:12 np0005466012 podman[225210]: 2025-10-02 12:08:12.172418294 +0000 UTC m=+0.079918737 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 08:08:12 np0005466012 podman[225211]: 2025-10-02 12:08:12.182496285 +0000 UTC m=+0.089975868 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  2 08:08:12 np0005466012 nova_compute[192063]: 2025-10-02 12:08:12.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:12 np0005466012 nova_compute[192063]: 2025-10-02 12:08:12.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:12 np0005466012 nova_compute[192063]: 2025-10-02 12:08:12.843 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:12 np0005466012 nova_compute[192063]: 2025-10-02 12:08:12.844 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:12 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:12Z|00139|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  2 08:08:13 np0005466012 nova_compute[192063]: 2025-10-02 12:08:13.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:14.803 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:08:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:14.804 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:08:14 np0005466012 nova_compute[192063]: 2025-10-02 12:08:14.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:14 np0005466012 nova_compute[192063]: 2025-10-02 12:08:14.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:14 np0005466012 nova_compute[192063]: 2025-10-02 12:08:14.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:15 np0005466012 podman[225248]: 2025-10-02 12:08:15.713109646 +0000 UTC m=+0.051940702 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:08:15 np0005466012 podman[225247]: 2025-10-02 12:08:15.714453232 +0000 UTC m=+0.057261605 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:08:15 np0005466012 nova_compute[192063]: 2025-10-02 12:08:15.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:15 np0005466012 nova_compute[192063]: 2025-10-02 12:08:15.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:15 np0005466012 nova_compute[192063]: 2025-10-02 12:08:15.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:15 np0005466012 nova_compute[192063]: 2025-10-02 12:08:15.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:08:15 np0005466012 nova_compute[192063]: 2025-10-02 12:08:15.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:15 np0005466012 nova_compute[192063]: 2025-10-02 12:08:15.878 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:15 np0005466012 nova_compute[192063]: 2025-10-02 12:08:15.878 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:15 np0005466012 nova_compute[192063]: 2025-10-02 12:08:15.879 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:15 np0005466012 nova_compute[192063]: 2025-10-02 12:08:15.879 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:08:16 np0005466012 nova_compute[192063]: 2025-10-02 12:08:16.039 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:08:16 np0005466012 nova_compute[192063]: 2025-10-02 12:08:16.040 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5750MB free_disk=73.4281120300293GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:08:16 np0005466012 nova_compute[192063]: 2025-10-02 12:08:16.040 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:16 np0005466012 nova_compute[192063]: 2025-10-02 12:08:16.041 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:16 np0005466012 nova_compute[192063]: 2025-10-02 12:08:16.507 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:08:16 np0005466012 nova_compute[192063]: 2025-10-02 12:08:16.508 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:08:16 np0005466012 nova_compute[192063]: 2025-10-02 12:08:16.616 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:08:16 np0005466012 nova_compute[192063]: 2025-10-02 12:08:16.697 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:08:16 np0005466012 nova_compute[192063]: 2025-10-02 12:08:16.776 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:08:16 np0005466012 nova_compute[192063]: 2025-10-02 12:08:16.776 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:19 np0005466012 nova_compute[192063]: 2025-10-02 12:08:19.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:20 np0005466012 nova_compute[192063]: 2025-10-02 12:08:20.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:20.806 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:22 np0005466012 nova_compute[192063]: 2025-10-02 12:08:22.777 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:08:22 np0005466012 nova_compute[192063]: 2025-10-02 12:08:22.777 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:08:22 np0005466012 nova_compute[192063]: 2025-10-02 12:08:22.891 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:08:24 np0005466012 nova_compute[192063]: 2025-10-02 12:08:24.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:25 np0005466012 nova_compute[192063]: 2025-10-02 12:08:25.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.228 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquiring lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.228 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.249 2 DEBUG nova.compute.manager [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.372 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.372 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.378 2 DEBUG nova.virt.hardware [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.379 2 INFO nova.compute.claims [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.536 2 DEBUG nova.compute.provider_tree [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.550 2 DEBUG nova.scheduler.client.report [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.575 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.576 2 DEBUG nova.compute.manager [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.629 2 DEBUG nova.compute.manager [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.630 2 DEBUG nova.network.neutron [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.647 2 INFO nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.669 2 DEBUG nova.compute.manager [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.792 2 DEBUG nova.compute.manager [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.793 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.794 2 INFO nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Creating image(s)#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.794 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquiring lock "/var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.795 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "/var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.795 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "/var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.808 2 DEBUG oslo_concurrency.processutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.855 2 DEBUG nova.policy [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '247c672a9b2842c0800bda6dc0c82fcd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '81fba2a656c54f78822b8aee58f6dd5b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.863 2 DEBUG oslo_concurrency.processutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.864 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.865 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.880 2 DEBUG oslo_concurrency.processutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.937 2 DEBUG oslo_concurrency.processutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.938 2 DEBUG oslo_concurrency.processutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.990 2 DEBUG oslo_concurrency.processutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.992 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:26 np0005466012 nova_compute[192063]: 2025-10-02 12:08:26.992 2 DEBUG oslo_concurrency.processutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:27 np0005466012 nova_compute[192063]: 2025-10-02 12:08:27.083 2 DEBUG oslo_concurrency.processutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:27 np0005466012 nova_compute[192063]: 2025-10-02 12:08:27.084 2 DEBUG nova.virt.disk.api [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Checking if we can resize image /var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:08:27 np0005466012 nova_compute[192063]: 2025-10-02 12:08:27.085 2 DEBUG oslo_concurrency.processutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:27 np0005466012 nova_compute[192063]: 2025-10-02 12:08:27.169 2 DEBUG oslo_concurrency.processutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:27 np0005466012 nova_compute[192063]: 2025-10-02 12:08:27.171 2 DEBUG nova.virt.disk.api [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Cannot resize image /var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:08:27 np0005466012 nova_compute[192063]: 2025-10-02 12:08:27.171 2 DEBUG nova.objects.instance [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lazy-loading 'migration_context' on Instance uuid df519aaf-a62d-4f62-a5ab-3bd7f485b3b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:27 np0005466012 nova_compute[192063]: 2025-10-02 12:08:27.205 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:08:27 np0005466012 nova_compute[192063]: 2025-10-02 12:08:27.205 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Ensure instance console log exists: /var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:08:27 np0005466012 nova_compute[192063]: 2025-10-02 12:08:27.206 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:27 np0005466012 nova_compute[192063]: 2025-10-02 12:08:27.206 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:27 np0005466012 nova_compute[192063]: 2025-10-02 12:08:27.207 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:28 np0005466012 nova_compute[192063]: 2025-10-02 12:08:28.385 2 DEBUG nova.network.neutron [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Successfully created port: 02499d0d-7019-4582-bbf9-13838dde8102 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:08:29 np0005466012 nova_compute[192063]: 2025-10-02 12:08:29.220 2 DEBUG nova.network.neutron [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Successfully updated port: 02499d0d-7019-4582-bbf9-13838dde8102 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:08:29 np0005466012 nova_compute[192063]: 2025-10-02 12:08:29.241 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquiring lock "refresh_cache-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:08:29 np0005466012 nova_compute[192063]: 2025-10-02 12:08:29.242 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquired lock "refresh_cache-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:08:29 np0005466012 nova_compute[192063]: 2025-10-02 12:08:29.243 2 DEBUG nova.network.neutron [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:08:29 np0005466012 nova_compute[192063]: 2025-10-02 12:08:29.283 2 DEBUG nova.compute.manager [req-2379e414-3bb9-4280-afb9-bc166dc28d6a req-ce445916-4734-4a59-a525-d97e742d74b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received event network-changed-02499d0d-7019-4582-bbf9-13838dde8102 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:29 np0005466012 nova_compute[192063]: 2025-10-02 12:08:29.284 2 DEBUG nova.compute.manager [req-2379e414-3bb9-4280-afb9-bc166dc28d6a req-ce445916-4734-4a59-a525-d97e742d74b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Refreshing instance network info cache due to event network-changed-02499d0d-7019-4582-bbf9-13838dde8102. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:08:29 np0005466012 nova_compute[192063]: 2025-10-02 12:08:29.285 2 DEBUG oslo_concurrency.lockutils [req-2379e414-3bb9-4280-afb9-bc166dc28d6a req-ce445916-4734-4a59-a525-d97e742d74b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:08:29 np0005466012 nova_compute[192063]: 2025-10-02 12:08:29.428 2 DEBUG nova.network.neutron [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:08:29 np0005466012 nova_compute[192063]: 2025-10-02 12:08:29.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:30 np0005466012 podman[225306]: 2025-10-02 12:08:30.190512245 +0000 UTC m=+0.094089719 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:08:30 np0005466012 podman[225307]: 2025-10-02 12:08:30.213812694 +0000 UTC m=+0.110020109 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.348 2 DEBUG nova.network.neutron [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Updating instance_info_cache with network_info: [{"id": "02499d0d-7019-4582-bbf9-13838dde8102", "address": "fa:16:3e:0a:22:1f", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02499d0d-70", "ovs_interfaceid": "02499d0d-7019-4582-bbf9-13838dde8102", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.372 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Releasing lock "refresh_cache-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.373 2 DEBUG nova.compute.manager [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Instance network_info: |[{"id": "02499d0d-7019-4582-bbf9-13838dde8102", "address": "fa:16:3e:0a:22:1f", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02499d0d-70", "ovs_interfaceid": "02499d0d-7019-4582-bbf9-13838dde8102", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.374 2 DEBUG oslo_concurrency.lockutils [req-2379e414-3bb9-4280-afb9-bc166dc28d6a req-ce445916-4734-4a59-a525-d97e742d74b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.374 2 DEBUG nova.network.neutron [req-2379e414-3bb9-4280-afb9-bc166dc28d6a req-ce445916-4734-4a59-a525-d97e742d74b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Refreshing network info cache for port 02499d0d-7019-4582-bbf9-13838dde8102 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.380 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Start _get_guest_xml network_info=[{"id": "02499d0d-7019-4582-bbf9-13838dde8102", "address": "fa:16:3e:0a:22:1f", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02499d0d-70", "ovs_interfaceid": "02499d0d-7019-4582-bbf9-13838dde8102", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.387 2 WARNING nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.396 2 DEBUG nova.virt.libvirt.host [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.396 2 DEBUG nova.virt.libvirt.host [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.400 2 DEBUG nova.virt.libvirt.host [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.401 2 DEBUG nova.virt.libvirt.host [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.402 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.403 2 DEBUG nova.virt.hardware [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.403 2 DEBUG nova.virt.hardware [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.403 2 DEBUG nova.virt.hardware [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.404 2 DEBUG nova.virt.hardware [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.404 2 DEBUG nova.virt.hardware [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.404 2 DEBUG nova.virt.hardware [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.405 2 DEBUG nova.virt.hardware [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.405 2 DEBUG nova.virt.hardware [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.405 2 DEBUG nova.virt.hardware [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.405 2 DEBUG nova.virt.hardware [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.406 2 DEBUG nova.virt.hardware [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.410 2 DEBUG nova.virt.libvirt.vif [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:08:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1609757405',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1609757405',id=45,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='81fba2a656c54f78822b8aee58f6dd5b',ramdisk_id='',reservation_id='r-pc96sxvm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-906275440',owner_user_name='tempest-AttachInterfacesV270Test-906275440-project-member'},tags=TagL
ist,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:08:26Z,user_data=None,user_id='247c672a9b2842c0800bda6dc0c82fcd',uuid=df519aaf-a62d-4f62-a5ab-3bd7f485b3b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02499d0d-7019-4582-bbf9-13838dde8102", "address": "fa:16:3e:0a:22:1f", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02499d0d-70", "ovs_interfaceid": "02499d0d-7019-4582-bbf9-13838dde8102", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.410 2 DEBUG nova.network.os_vif_util [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Converting VIF {"id": "02499d0d-7019-4582-bbf9-13838dde8102", "address": "fa:16:3e:0a:22:1f", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02499d0d-70", "ovs_interfaceid": "02499d0d-7019-4582-bbf9-13838dde8102", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.411 2 DEBUG nova.network.os_vif_util [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:22:1f,bridge_name='br-int',has_traffic_filtering=True,id=02499d0d-7019-4582-bbf9-13838dde8102,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02499d0d-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.412 2 DEBUG nova.objects.instance [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lazy-loading 'pci_devices' on Instance uuid df519aaf-a62d-4f62-a5ab-3bd7f485b3b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.426 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  <uuid>df519aaf-a62d-4f62-a5ab-3bd7f485b3b6</uuid>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  <name>instance-0000002d</name>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <nova:name>tempest-AttachInterfacesV270Test-server-1609757405</nova:name>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:08:30</nova:creationTime>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:08:30 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:        <nova:user uuid="247c672a9b2842c0800bda6dc0c82fcd">tempest-AttachInterfacesV270Test-906275440-project-member</nova:user>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:        <nova:project uuid="81fba2a656c54f78822b8aee58f6dd5b">tempest-AttachInterfacesV270Test-906275440</nova:project>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:        <nova:port uuid="02499d0d-7019-4582-bbf9-13838dde8102">
Oct  2 08:08:30 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <entry name="serial">df519aaf-a62d-4f62-a5ab-3bd7f485b3b6</entry>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <entry name="uuid">df519aaf-a62d-4f62-a5ab-3bd7f485b3b6</entry>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk.config"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:0a:22:1f"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <target dev="tap02499d0d-70"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/console.log" append="off"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:08:30 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:08:30 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:08:30 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:08:30 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.428 2 DEBUG nova.compute.manager [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Preparing to wait for external event network-vif-plugged-02499d0d-7019-4582-bbf9-13838dde8102 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.428 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquiring lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.428 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.429 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.430 2 DEBUG nova.virt.libvirt.vif [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:08:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1609757405',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1609757405',id=45,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='81fba2a656c54f78822b8aee58f6dd5b',ramdisk_id='',reservation_id='r-pc96sxvm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-906275440',owner_user_name='tempest-AttachInterfacesV270Test-906275440-project-member'}
,tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:08:26Z,user_data=None,user_id='247c672a9b2842c0800bda6dc0c82fcd',uuid=df519aaf-a62d-4f62-a5ab-3bd7f485b3b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02499d0d-7019-4582-bbf9-13838dde8102", "address": "fa:16:3e:0a:22:1f", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02499d0d-70", "ovs_interfaceid": "02499d0d-7019-4582-bbf9-13838dde8102", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.430 2 DEBUG nova.network.os_vif_util [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Converting VIF {"id": "02499d0d-7019-4582-bbf9-13838dde8102", "address": "fa:16:3e:0a:22:1f", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02499d0d-70", "ovs_interfaceid": "02499d0d-7019-4582-bbf9-13838dde8102", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.431 2 DEBUG nova.network.os_vif_util [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:22:1f,bridge_name='br-int',has_traffic_filtering=True,id=02499d0d-7019-4582-bbf9-13838dde8102,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02499d0d-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.432 2 DEBUG os_vif [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:22:1f,bridge_name='br-int',has_traffic_filtering=True,id=02499d0d-7019-4582-bbf9-13838dde8102,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02499d0d-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.433 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.434 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.440 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02499d0d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.440 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap02499d0d-70, col_values=(('external_ids', {'iface-id': '02499d0d-7019-4582-bbf9-13838dde8102', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:22:1f', 'vm-uuid': 'df519aaf-a62d-4f62-a5ab-3bd7f485b3b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:30 np0005466012 NetworkManager[51207]: <info>  [1759406910.4436] manager: (tap02499d0d-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.453 2 INFO os_vif [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:22:1f,bridge_name='br-int',has_traffic_filtering=True,id=02499d0d-7019-4582-bbf9-13838dde8102,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02499d0d-70')#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.502 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.502 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.502 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] No VIF found with MAC fa:16:3e:0a:22:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:08:30 np0005466012 nova_compute[192063]: 2025-10-02 12:08:30.503 2 INFO nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Using config drive#033[00m
Oct  2 08:08:31 np0005466012 nova_compute[192063]: 2025-10-02 12:08:31.910 2 INFO nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Creating config drive at /var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk.config#033[00m
Oct  2 08:08:31 np0005466012 nova_compute[192063]: 2025-10-02 12:08:31.919 2 DEBUG oslo_concurrency.processutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcbda9hyg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.054 2 DEBUG oslo_concurrency.processutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcbda9hyg" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:32 np0005466012 kernel: tap02499d0d-70: entered promiscuous mode
Oct  2 08:08:32 np0005466012 NetworkManager[51207]: <info>  [1759406912.1652] manager: (tap02499d0d-70): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Oct  2 08:08:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:32Z|00140|binding|INFO|Claiming lport 02499d0d-7019-4582-bbf9-13838dde8102 for this chassis.
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:32Z|00141|binding|INFO|02499d0d-7019-4582-bbf9-13838dde8102: Claiming fa:16:3e:0a:22:1f 10.100.0.3
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.202 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:22:1f 10.100.0.3'], port_security=['fa:16:3e:0a:22:1f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'df519aaf-a62d-4f62-a5ab-3bd7f485b3b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '81fba2a656c54f78822b8aee58f6dd5b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '67584b86-2dca-471f-a9c3-61260b3868de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6125bc9b-c369-4c54-a9b0-13fafb19055f, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=02499d0d-7019-4582-bbf9-13838dde8102) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.203 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 02499d0d-7019-4582-bbf9-13838dde8102 in datapath 200c8157-1b5e-41ff-b9b2-0e661c71b846 bound to our chassis#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.205 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 200c8157-1b5e-41ff-b9b2-0e661c71b846#033[00m
Oct  2 08:08:32 np0005466012 systemd-udevd[225389]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:08:32 np0005466012 systemd-machined[152114]: New machine qemu-21-instance-0000002d.
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.218 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b29f6066-a9d9-4a43-a717-73c393d30e6d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.219 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap200c8157-11 in ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:08:32 np0005466012 NetworkManager[51207]: <info>  [1759406912.2234] device (tap02499d0d-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:08:32 np0005466012 NetworkManager[51207]: <info>  [1759406912.2244] device (tap02499d0d-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.226 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap200c8157-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.226 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a3bed556-c7a1-4d86-a4c5-e93c626a0f30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.228 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5db36c0c-422b-4b21-b072-f77e6e59e441]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 systemd[1]: Started Virtual Machine qemu-21-instance-0000002d.
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.241 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[3b2463ca-4b50-4778-ab98-699211158002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:32Z|00142|binding|INFO|Setting lport 02499d0d-7019-4582-bbf9-13838dde8102 ovn-installed in OVS
Oct  2 08:08:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:32Z|00143|binding|INFO|Setting lport 02499d0d-7019-4582-bbf9-13838dde8102 up in Southbound
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.254 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5b740fa6-edd5-491c-8b00-cfbddfba2a98]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 podman[225368]: 2025-10-02 12:08:32.269766109 +0000 UTC m=+0.097704117 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.283 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[6d13ef07-3a22-48cf-afa1-54b2223b580d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.288 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f6125ea2-c42e-4737-ac35-468a8faa1db3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 NetworkManager[51207]: <info>  [1759406912.2899] manager: (tap200c8157-10): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Oct  2 08:08:32 np0005466012 systemd-udevd[225396]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.320 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b9a1e5-bcca-486d-abd4-70fbfd6349c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.325 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[4740e692-8ac7-458e-bc84-f1034218ed02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 NetworkManager[51207]: <info>  [1759406912.3560] device (tap200c8157-10): carrier: link connected
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.362 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[1a1384ff-175d-4e50-abb1-3eccd532e818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.382 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9c38ce-232c-49e6-ab5e-fc43f82ece78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap200c8157-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:62:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490596, 'reachable_time': 21689, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225428, 'error': None, 'target': 'ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.405 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ad64c8-acf3-46e6-819c-f7ddc63652f4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe73:6254'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490596, 'tstamp': 490596}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225429, 'error': None, 'target': 'ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.429 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ffd892b8-1fc4-4187-a7ff-079a71a3a339]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap200c8157-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:62:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490596, 'reachable_time': 21689, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225430, 'error': None, 'target': 'ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.470 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ebd685-964b-4744-bc62-0f9050345996]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.551 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3e487fd4-bc3e-4ad2-9e88-e5205e1f6e9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.552 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap200c8157-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.552 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.553 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap200c8157-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.601 2 DEBUG nova.compute.manager [req-144b3493-dfc4-40e8-80d6-cb46ce7eccfe req-770a2ed8-368c-4e90-b1e1-8f947eafc115 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received event network-vif-plugged-02499d0d-7019-4582-bbf9-13838dde8102 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.601 2 DEBUG oslo_concurrency.lockutils [req-144b3493-dfc4-40e8-80d6-cb46ce7eccfe req-770a2ed8-368c-4e90-b1e1-8f947eafc115 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.602 2 DEBUG oslo_concurrency.lockutils [req-144b3493-dfc4-40e8-80d6-cb46ce7eccfe req-770a2ed8-368c-4e90-b1e1-8f947eafc115 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.602 2 DEBUG oslo_concurrency.lockutils [req-144b3493-dfc4-40e8-80d6-cb46ce7eccfe req-770a2ed8-368c-4e90-b1e1-8f947eafc115 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.602 2 DEBUG nova.compute.manager [req-144b3493-dfc4-40e8-80d6-cb46ce7eccfe req-770a2ed8-368c-4e90-b1e1-8f947eafc115 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Processing event network-vif-plugged-02499d0d-7019-4582-bbf9-13838dde8102 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:32 np0005466012 kernel: tap200c8157-10: entered promiscuous mode
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:32 np0005466012 NetworkManager[51207]: <info>  [1759406912.6070] manager: (tap200c8157-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.612 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap200c8157-10, col_values=(('external_ids', {'iface-id': '4ca70a23-7dcc-4689-aa7a-3a84be460635'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:32Z|00144|binding|INFO|Releasing lport 4ca70a23-7dcc-4689-aa7a-3a84be460635 from this chassis (sb_readonly=0)
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.621 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/200c8157-1b5e-41ff-b9b2-0e661c71b846.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/200c8157-1b5e-41ff-b9b2-0e661c71b846.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.622 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c509364c-3049-467f-8087-228e87a1f551]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.623 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-200c8157-1b5e-41ff-b9b2-0e661c71b846
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/200c8157-1b5e-41ff-b9b2-0e661c71b846.pid.haproxy
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 200c8157-1b5e-41ff-b9b2-0e661c71b846
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:08:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:32.624 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'env', 'PROCESS_TAG=haproxy-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/200c8157-1b5e-41ff-b9b2-0e661c71b846.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:08:32 np0005466012 nova_compute[192063]: 2025-10-02 12:08:32.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:33 np0005466012 podman[225468]: 2025-10-02 12:08:32.956334037 +0000 UTC m=+0.023034152 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:08:33 np0005466012 podman[225468]: 2025-10-02 12:08:33.278763824 +0000 UTC m=+0.345463909 container create a34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.340 2 DEBUG nova.compute.manager [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.341 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406913.3400111, df519aaf-a62d-4f62-a5ab-3bd7f485b3b6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.342 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] VM Started (Lifecycle Event)#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.345 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.349 2 INFO nova.virt.libvirt.driver [-] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Instance spawned successfully.#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.349 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.368 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.373 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.376 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.377 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.377 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.377 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.378 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.378 2 DEBUG nova.virt.libvirt.driver [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.400 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.400 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406913.340201, df519aaf-a62d-4f62-a5ab-3bd7f485b3b6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.400 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.429 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.436 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406913.344508, df519aaf-a62d-4f62-a5ab-3bd7f485b3b6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.436 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:08:33 np0005466012 systemd[1]: Started libpod-conmon-a34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4.scope.
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.468 2 INFO nova.compute.manager [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Took 6.68 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.469 2 DEBUG nova.compute.manager [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.473 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.480 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:08:33 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:08:33 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c177b64f1fbbba8d352527d0ca8b0f2cf7ec1662ebffa698015548b4b246a70/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.516 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.595 2 DEBUG nova.network.neutron [req-2379e414-3bb9-4280-afb9-bc166dc28d6a req-ce445916-4734-4a59-a525-d97e742d74b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Updated VIF entry in instance network info cache for port 02499d0d-7019-4582-bbf9-13838dde8102. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.596 2 DEBUG nova.network.neutron [req-2379e414-3bb9-4280-afb9-bc166dc28d6a req-ce445916-4734-4a59-a525-d97e742d74b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Updating instance_info_cache with network_info: [{"id": "02499d0d-7019-4582-bbf9-13838dde8102", "address": "fa:16:3e:0a:22:1f", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02499d0d-70", "ovs_interfaceid": "02499d0d-7019-4582-bbf9-13838dde8102", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:33 np0005466012 podman[225468]: 2025-10-02 12:08:33.629607117 +0000 UTC m=+0.696307242 container init a34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:08:33 np0005466012 podman[225468]: 2025-10-02 12:08:33.635739442 +0000 UTC m=+0.702439557 container start a34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:08:33 np0005466012 neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846[225483]: [NOTICE]   (225487) : New worker (225489) forked
Oct  2 08:08:33 np0005466012 neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846[225483]: [NOTICE]   (225487) : Loading success.
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.706 2 DEBUG oslo_concurrency.lockutils [req-2379e414-3bb9-4280-afb9-bc166dc28d6a req-ce445916-4734-4a59-a525-d97e742d74b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.727 2 INFO nova.compute.manager [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Took 7.39 seconds to build instance.#033[00m
Oct  2 08:08:33 np0005466012 nova_compute[192063]: 2025-10-02 12:08:33.751 2 DEBUG oslo_concurrency.lockutils [None req-f50182c2-811e-4b19-b3fe-db48efd08f8d 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:34 np0005466012 nova_compute[192063]: 2025-10-02 12:08:34.697 2 DEBUG nova.compute.manager [req-78a9c2c2-4d05-42f4-a120-d09934898161 req-bcb90a82-6619-44bc-b4cf-77f88e9d33f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received event network-vif-plugged-02499d0d-7019-4582-bbf9-13838dde8102 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:34 np0005466012 nova_compute[192063]: 2025-10-02 12:08:34.697 2 DEBUG oslo_concurrency.lockutils [req-78a9c2c2-4d05-42f4-a120-d09934898161 req-bcb90a82-6619-44bc-b4cf-77f88e9d33f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:34 np0005466012 nova_compute[192063]: 2025-10-02 12:08:34.697 2 DEBUG oslo_concurrency.lockutils [req-78a9c2c2-4d05-42f4-a120-d09934898161 req-bcb90a82-6619-44bc-b4cf-77f88e9d33f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:34 np0005466012 nova_compute[192063]: 2025-10-02 12:08:34.698 2 DEBUG oslo_concurrency.lockutils [req-78a9c2c2-4d05-42f4-a120-d09934898161 req-bcb90a82-6619-44bc-b4cf-77f88e9d33f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:34 np0005466012 nova_compute[192063]: 2025-10-02 12:08:34.698 2 DEBUG nova.compute.manager [req-78a9c2c2-4d05-42f4-a120-d09934898161 req-bcb90a82-6619-44bc-b4cf-77f88e9d33f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] No waiting events found dispatching network-vif-plugged-02499d0d-7019-4582-bbf9-13838dde8102 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:08:34 np0005466012 nova_compute[192063]: 2025-10-02 12:08:34.698 2 WARNING nova.compute.manager [req-78a9c2c2-4d05-42f4-a120-d09934898161 req-bcb90a82-6619-44bc-b4cf-77f88e9d33f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received unexpected event network-vif-plugged-02499d0d-7019-4582-bbf9-13838dde8102 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:08:35 np0005466012 nova_compute[192063]: 2025-10-02 12:08:35.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:35 np0005466012 nova_compute[192063]: 2025-10-02 12:08:35.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:36 np0005466012 nova_compute[192063]: 2025-10-02 12:08:36.034 2 DEBUG oslo_concurrency.lockutils [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquiring lock "interface-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:36 np0005466012 nova_compute[192063]: 2025-10-02 12:08:36.034 2 DEBUG oslo_concurrency.lockutils [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "interface-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:36 np0005466012 nova_compute[192063]: 2025-10-02 12:08:36.034 2 DEBUG nova.objects.instance [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lazy-loading 'flavor' on Instance uuid df519aaf-a62d-4f62-a5ab-3bd7f485b3b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:36 np0005466012 nova_compute[192063]: 2025-10-02 12:08:36.079 2 DEBUG nova.objects.instance [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lazy-loading 'pci_requests' on Instance uuid df519aaf-a62d-4f62-a5ab-3bd7f485b3b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:36 np0005466012 nova_compute[192063]: 2025-10-02 12:08:36.090 2 DEBUG nova.network.neutron [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:08:36 np0005466012 podman[225498]: 2025-10-02 12:08:36.154884182 +0000 UTC m=+0.065986502 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 08:08:36 np0005466012 nova_compute[192063]: 2025-10-02 12:08:36.472 2 DEBUG nova.policy [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '247c672a9b2842c0800bda6dc0c82fcd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '81fba2a656c54f78822b8aee58f6dd5b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:08:38 np0005466012 nova_compute[192063]: 2025-10-02 12:08:38.644 2 DEBUG nova.network.neutron [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Successfully created port: 49930c33-a017-4f19-9070-b61a21d3819b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:08:39 np0005466012 nova_compute[192063]: 2025-10-02 12:08:39.675 2 DEBUG nova.network.neutron [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Successfully updated port: 49930c33-a017-4f19-9070-b61a21d3819b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:08:39 np0005466012 nova_compute[192063]: 2025-10-02 12:08:39.708 2 DEBUG oslo_concurrency.lockutils [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquiring lock "refresh_cache-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:08:39 np0005466012 nova_compute[192063]: 2025-10-02 12:08:39.708 2 DEBUG oslo_concurrency.lockutils [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquired lock "refresh_cache-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:08:39 np0005466012 nova_compute[192063]: 2025-10-02 12:08:39.708 2 DEBUG nova.network.neutron [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:08:40 np0005466012 nova_compute[192063]: 2025-10-02 12:08:40.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:40 np0005466012 nova_compute[192063]: 2025-10-02 12:08:40.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:40 np0005466012 nova_compute[192063]: 2025-10-02 12:08:40.505 2 WARNING nova.network.neutron [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] 200c8157-1b5e-41ff-b9b2-0e661c71b846 already exists in list: networks containing: ['200c8157-1b5e-41ff-b9b2-0e661c71b846']. ignoring it#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.897 2 DEBUG nova.network.neutron [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Updating instance_info_cache with network_info: [{"id": "02499d0d-7019-4582-bbf9-13838dde8102", "address": "fa:16:3e:0a:22:1f", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02499d0d-70", "ovs_interfaceid": "02499d0d-7019-4582-bbf9-13838dde8102", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "49930c33-a017-4f19-9070-b61a21d3819b", "address": "fa:16:3e:1d:30:8e", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49930c33-a0", "ovs_interfaceid": "49930c33-a017-4f19-9070-b61a21d3819b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.925 2 DEBUG oslo_concurrency.lockutils [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Releasing lock "refresh_cache-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.929 2 DEBUG nova.virt.libvirt.vif [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1609757405',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1609757405',id=45,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:08:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='81fba2a656c54f78822b8aee58f6dd5b',ramdisk_id='',reservation_id='r-pc96sxvm',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-906275440',owner_user_name='tempest-AttachInterfacesV270Test-906275440-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:08:33Z,user_data=None,user_id='247c672a9b2842c0800bda6dc0c82fcd',uuid=df519aaf-a62d-4f62-a5ab-3bd7f485b3b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "49930c33-a017-4f19-9070-b61a21d3819b", "address": "fa:16:3e:1d:30:8e", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49930c33-a0", "ovs_interfaceid": "49930c33-a017-4f19-9070-b61a21d3819b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.930 2 DEBUG nova.network.os_vif_util [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Converting VIF {"id": "49930c33-a017-4f19-9070-b61a21d3819b", "address": "fa:16:3e:1d:30:8e", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49930c33-a0", "ovs_interfaceid": "49930c33-a017-4f19-9070-b61a21d3819b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.931 2 DEBUG nova.network.os_vif_util [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:30:8e,bridge_name='br-int',has_traffic_filtering=True,id=49930c33-a017-4f19-9070-b61a21d3819b,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49930c33-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.931 2 DEBUG os_vif [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:30:8e,bridge_name='br-int',has_traffic_filtering=True,id=49930c33-a017-4f19-9070-b61a21d3819b,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49930c33-a0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.932 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.932 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.935 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap49930c33-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.935 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap49930c33-a0, col_values=(('external_ids', {'iface-id': '49930c33-a017-4f19-9070-b61a21d3819b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1d:30:8e', 'vm-uuid': 'df519aaf-a62d-4f62-a5ab-3bd7f485b3b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:42 np0005466012 NetworkManager[51207]: <info>  [1759406922.9392] manager: (tap49930c33-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.945 2 INFO os_vif [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:30:8e,bridge_name='br-int',has_traffic_filtering=True,id=49930c33-a017-4f19-9070-b61a21d3819b,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49930c33-a0')#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.946 2 DEBUG nova.virt.libvirt.vif [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1609757405',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1609757405',id=45,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:08:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='81fba2a656c54f78822b8aee58f6dd5b',ramdisk_id='',reservation_id='r-pc96sxvm',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-AttachInterfacesV270Test-906275440',owner_user_name='tempest-AttachInterfacesV270Test-906275440-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:08:33Z,user_data=None,user_id='247c672a9b2842c0800bda6dc0c82fcd',uuid=df519aaf-a62d-4f62-a5ab-3bd7f485b3b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "49930c33-a017-4f19-9070-b61a21d3819b", "address": "fa:16:3e:1d:30:8e", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49930c33-a0", "ovs_interfaceid": "49930c33-a017-4f19-9070-b61a21d3819b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.946 2 DEBUG nova.network.os_vif_util [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Converting VIF {"id": "49930c33-a017-4f19-9070-b61a21d3819b", "address": "fa:16:3e:1d:30:8e", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49930c33-a0", "ovs_interfaceid": "49930c33-a017-4f19-9070-b61a21d3819b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.946 2 DEBUG nova.network.os_vif_util [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:30:8e,bridge_name='br-int',has_traffic_filtering=True,id=49930c33-a017-4f19-9070-b61a21d3819b,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49930c33-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.948 2 DEBUG nova.virt.libvirt.guest [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] attach device xml: <interface type="ethernet">
Oct  2 08:08:42 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:1d:30:8e"/>
Oct  2 08:08:42 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:08:42 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:08:42 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:08:42 np0005466012 nova_compute[192063]:  <target dev="tap49930c33-a0"/>
Oct  2 08:08:42 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:08:42 np0005466012 nova_compute[192063]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:08:42 np0005466012 kernel: tap49930c33-a0: entered promiscuous mode
Oct  2 08:08:42 np0005466012 NetworkManager[51207]: <info>  [1759406922.9639] manager: (tap49930c33-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/66)
Oct  2 08:08:42 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:42Z|00145|binding|INFO|Claiming lport 49930c33-a017-4f19-9070-b61a21d3819b for this chassis.
Oct  2 08:08:42 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:42Z|00146|binding|INFO|49930c33-a017-4f19-9070-b61a21d3819b: Claiming fa:16:3e:1d:30:8e 10.100.0.12
Oct  2 08:08:42 np0005466012 nova_compute[192063]: 2025-10-02 12:08:42.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:42 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:42.988 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:30:8e 10.100.0.12'], port_security=['fa:16:3e:1d:30:8e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'df519aaf-a62d-4f62-a5ab-3bd7f485b3b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '81fba2a656c54f78822b8aee58f6dd5b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '67584b86-2dca-471f-a9c3-61260b3868de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6125bc9b-c369-4c54-a9b0-13fafb19055f, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=49930c33-a017-4f19-9070-b61a21d3819b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:08:42 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:42.990 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 49930c33-a017-4f19-9070-b61a21d3819b in datapath 200c8157-1b5e-41ff-b9b2-0e661c71b846 bound to our chassis#033[00m
Oct  2 08:08:42 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:42.992 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 200c8157-1b5e-41ff-b9b2-0e661c71b846#033[00m
Oct  2 08:08:42 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:42Z|00147|binding|INFO|Setting lport 49930c33-a017-4f19-9070-b61a21d3819b ovn-installed in OVS
Oct  2 08:08:42 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:42Z|00148|binding|INFO|Setting lport 49930c33-a017-4f19-9070-b61a21d3819b up in Southbound
Oct  2 08:08:43 np0005466012 nova_compute[192063]: 2025-10-02 12:08:43.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:43.014 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9fc32777-483c-4936-9cc4-e94755173d22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:43 np0005466012 systemd-udevd[225548]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:08:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:43.051 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[eb580bca-b88a-412f-8e1d-b36dc3e32b53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:43 np0005466012 nova_compute[192063]: 2025-10-02 12:08:43.047 2 DEBUG nova.virt.libvirt.driver [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:08:43 np0005466012 nova_compute[192063]: 2025-10-02 12:08:43.048 2 DEBUG nova.virt.libvirt.driver [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:08:43 np0005466012 nova_compute[192063]: 2025-10-02 12:08:43.048 2 DEBUG nova.virt.libvirt.driver [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] No VIF found with MAC fa:16:3e:0a:22:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:08:43 np0005466012 nova_compute[192063]: 2025-10-02 12:08:43.048 2 DEBUG nova.virt.libvirt.driver [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] No VIF found with MAC fa:16:3e:1d:30:8e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:08:43 np0005466012 NetworkManager[51207]: <info>  [1759406923.0537] device (tap49930c33-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:08:43 np0005466012 NetworkManager[51207]: <info>  [1759406923.0550] device (tap49930c33-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:08:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:43.057 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[f29931c6-afa8-4cad-90a1-b43fed01eb19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:43 np0005466012 nova_compute[192063]: 2025-10-02 12:08:43.071 2 DEBUG nova.virt.libvirt.guest [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:08:43 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:  <nova:name>tempest-AttachInterfacesV270Test-server-1609757405</nova:name>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:08:43</nova:creationTime>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:08:43 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:    <nova:user uuid="247c672a9b2842c0800bda6dc0c82fcd">tempest-AttachInterfacesV270Test-906275440-project-member</nova:user>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:    <nova:project uuid="81fba2a656c54f78822b8aee58f6dd5b">tempest-AttachInterfacesV270Test-906275440</nova:project>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:    <nova:port uuid="02499d0d-7019-4582-bbf9-13838dde8102">
Oct  2 08:08:43 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:    <nova:port uuid="49930c33-a017-4f19-9070-b61a21d3819b">
Oct  2 08:08:43 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:08:43 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:08:43 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:08:43 np0005466012 nova_compute[192063]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:08:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:43.089 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[399f59ba-f07d-4120-ac5d-bb8069e0128b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:43 np0005466012 podman[225521]: 2025-10-02 12:08:43.091967804 +0000 UTC m=+0.093103442 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd)
Oct  2 08:08:43 np0005466012 podman[225523]: 2025-10-02 12:08:43.099516528 +0000 UTC m=+0.091991372 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm)
Oct  2 08:08:43 np0005466012 nova_compute[192063]: 2025-10-02 12:08:43.101 2 DEBUG oslo_concurrency.lockutils [None req-8098a131-e719-4fe2-b932-89df1bafeb30 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "interface-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:43.124 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[570b3b78-ea8b-4d59-8ae1-62443a1369ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap200c8157-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:62:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 832, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 832, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490596, 'reachable_time': 21689, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225568, 'error': None, 'target': 'ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:43.142 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a16fa77f-efd6-4492-ba09-c770ee15c7a2]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap200c8157-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490611, 'tstamp': 490611}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225569, 'error': None, 'target': 'ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap200c8157-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490615, 'tstamp': 490615}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225569, 'error': None, 'target': 'ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:43.144 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap200c8157-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:43 np0005466012 nova_compute[192063]: 2025-10-02 12:08:43.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:43.151 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap200c8157-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:43.151 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:08:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:43.152 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap200c8157-10, col_values=(('external_ids', {'iface-id': '4ca70a23-7dcc-4689-aa7a-3a84be460635'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:43.152 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:08:43 np0005466012 nova_compute[192063]: 2025-10-02 12:08:43.683 2 DEBUG nova.compute.manager [req-b237d08b-1ad5-468c-82ec-719429cb21f8 req-d4a4ec7e-5aa8-4169-acc9-e86cacca9729 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received event network-changed-49930c33-a017-4f19-9070-b61a21d3819b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:43 np0005466012 nova_compute[192063]: 2025-10-02 12:08:43.683 2 DEBUG nova.compute.manager [req-b237d08b-1ad5-468c-82ec-719429cb21f8 req-d4a4ec7e-5aa8-4169-acc9-e86cacca9729 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Refreshing instance network info cache due to event network-changed-49930c33-a017-4f19-9070-b61a21d3819b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:08:43 np0005466012 nova_compute[192063]: 2025-10-02 12:08:43.684 2 DEBUG oslo_concurrency.lockutils [req-b237d08b-1ad5-468c-82ec-719429cb21f8 req-d4a4ec7e-5aa8-4169-acc9-e86cacca9729 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:08:43 np0005466012 nova_compute[192063]: 2025-10-02 12:08:43.684 2 DEBUG oslo_concurrency.lockutils [req-b237d08b-1ad5-468c-82ec-719429cb21f8 req-d4a4ec7e-5aa8-4169-acc9-e86cacca9729 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:08:43 np0005466012 nova_compute[192063]: 2025-10-02 12:08:43.685 2 DEBUG nova.network.neutron [req-b237d08b-1ad5-468c-82ec-719429cb21f8 req-d4a4ec7e-5aa8-4169-acc9-e86cacca9729 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Refreshing network info cache for port 49930c33-a017-4f19-9070-b61a21d3819b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.014 2 DEBUG oslo_concurrency.lockutils [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquiring lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.014 2 DEBUG oslo_concurrency.lockutils [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.015 2 DEBUG oslo_concurrency.lockutils [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquiring lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.015 2 DEBUG oslo_concurrency.lockutils [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.015 2 DEBUG oslo_concurrency.lockutils [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.026 2 INFO nova.compute.manager [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Terminating instance#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.041 2 DEBUG nova.compute.manager [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:08:45 np0005466012 kernel: tap02499d0d-70 (unregistering): left promiscuous mode
Oct  2 08:08:45 np0005466012 NetworkManager[51207]: <info>  [1759406925.0791] device (tap02499d0d-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:45Z|00149|binding|INFO|Releasing lport 02499d0d-7019-4582-bbf9-13838dde8102 from this chassis (sb_readonly=0)
Oct  2 08:08:45 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:45Z|00150|binding|INFO|Setting lport 02499d0d-7019-4582-bbf9-13838dde8102 down in Southbound
Oct  2 08:08:45 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:45Z|00151|binding|INFO|Removing iface tap02499d0d-70 ovn-installed in OVS
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.100 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:22:1f 10.100.0.3'], port_security=['fa:16:3e:0a:22:1f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'df519aaf-a62d-4f62-a5ab-3bd7f485b3b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '81fba2a656c54f78822b8aee58f6dd5b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '67584b86-2dca-471f-a9c3-61260b3868de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6125bc9b-c369-4c54-a9b0-13fafb19055f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=02499d0d-7019-4582-bbf9-13838dde8102) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.102 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 02499d0d-7019-4582-bbf9-13838dde8102 in datapath 200c8157-1b5e-41ff-b9b2-0e661c71b846 unbound from our chassis#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.104 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 200c8157-1b5e-41ff-b9b2-0e661c71b846#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 kernel: tap49930c33-a0 (unregistering): left promiscuous mode
Oct  2 08:08:45 np0005466012 NetworkManager[51207]: <info>  [1759406925.1203] device (tap49930c33-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.130 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a7f44d9c-9220-48d1-99da-617635028177]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:45 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:45Z|00152|binding|INFO|Releasing lport 49930c33-a017-4f19-9070-b61a21d3819b from this chassis (sb_readonly=0)
Oct  2 08:08:45 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:45Z|00153|binding|INFO|Setting lport 49930c33-a017-4f19-9070-b61a21d3819b down in Southbound
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 ovn_controller[94284]: 2025-10-02T12:08:45Z|00154|binding|INFO|Removing iface tap49930c33-a0 ovn-installed in OVS
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.141 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:30:8e 10.100.0.12'], port_security=['fa:16:3e:1d:30:8e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'df519aaf-a62d-4f62-a5ab-3bd7f485b3b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '81fba2a656c54f78822b8aee58f6dd5b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '67584b86-2dca-471f-a9c3-61260b3868de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6125bc9b-c369-4c54-a9b0-13fafb19055f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=49930c33-a017-4f19-9070-b61a21d3819b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.172 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[fac573b0-71b2-42a3-bbc9-55fcd5f88124]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.175 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[31042a2b-2f4e-4ded-99a5-21408b1c9aa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:45 np0005466012 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Oct  2 08:08:45 np0005466012 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000002d.scope: Consumed 12.237s CPU time.
Oct  2 08:08:45 np0005466012 systemd-machined[152114]: Machine qemu-21-instance-0000002d terminated.
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.211 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[aa788eab-b443-45cb-a07a-55f559750bdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.238 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[04fa0ecc-970b-46a8-8e3c-43d7aa31030f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap200c8157-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:62:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490596, 'reachable_time': 21689, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225604, 'error': None, 'target': 'ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.260 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[22db655a-831f-4543-8d49-601c680932c8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap200c8157-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490611, 'tstamp': 490611}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225605, 'error': None, 'target': 'ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap200c8157-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490615, 'tstamp': 490615}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225605, 'error': None, 'target': 'ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.262 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap200c8157-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 NetworkManager[51207]: <info>  [1759406925.2712] manager: (tap02499d0d-70): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.271 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap200c8157-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.272 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.272 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap200c8157-10, col_values=(('external_ids', {'iface-id': '4ca70a23-7dcc-4689-aa7a-3a84be460635'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.272 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.274 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 49930c33-a017-4f19-9070-b61a21d3819b in datapath 200c8157-1b5e-41ff-b9b2-0e661c71b846 unbound from our chassis#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.275 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 200c8157-1b5e-41ff-b9b2-0e661c71b846, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.276 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[324759ce-df46-4a02-a313-bb3921a71ec1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.276 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846 namespace which is not needed anymore#033[00m
Oct  2 08:08:45 np0005466012 NetworkManager[51207]: <info>  [1759406925.2842] manager: (tap49930c33-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.326 2 INFO nova.virt.libvirt.driver [-] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Instance destroyed successfully.#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.326 2 DEBUG nova.objects.instance [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lazy-loading 'resources' on Instance uuid df519aaf-a62d-4f62-a5ab-3bd7f485b3b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.340 2 DEBUG nova.virt.libvirt.vif [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1609757405',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1609757405',id=45,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:08:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='81fba2a656c54f78822b8aee58f6dd5b',ramdisk_id='',reservation_id='r-pc96sxvm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project
_name='tempest-AttachInterfacesV270Test-906275440',owner_user_name='tempest-AttachInterfacesV270Test-906275440-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:08:33Z,user_data=None,user_id='247c672a9b2842c0800bda6dc0c82fcd',uuid=df519aaf-a62d-4f62-a5ab-3bd7f485b3b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02499d0d-7019-4582-bbf9-13838dde8102", "address": "fa:16:3e:0a:22:1f", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02499d0d-70", "ovs_interfaceid": "02499d0d-7019-4582-bbf9-13838dde8102", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.340 2 DEBUG nova.network.os_vif_util [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Converting VIF {"id": "02499d0d-7019-4582-bbf9-13838dde8102", "address": "fa:16:3e:0a:22:1f", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02499d0d-70", "ovs_interfaceid": "02499d0d-7019-4582-bbf9-13838dde8102", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.340 2 DEBUG nova.network.os_vif_util [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:22:1f,bridge_name='br-int',has_traffic_filtering=True,id=02499d0d-7019-4582-bbf9-13838dde8102,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02499d0d-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.341 2 DEBUG os_vif [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:22:1f,bridge_name='br-int',has_traffic_filtering=True,id=02499d0d-7019-4582-bbf9-13838dde8102,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02499d0d-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.342 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02499d0d-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.350 2 INFO os_vif [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:22:1f,bridge_name='br-int',has_traffic_filtering=True,id=02499d0d-7019-4582-bbf9-13838dde8102,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02499d0d-70')#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.351 2 DEBUG nova.virt.libvirt.vif [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1609757405',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1609757405',id=45,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:08:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='81fba2a656c54f78822b8aee58f6dd5b',ramdisk_id='',reservation_id='r-pc96sxvm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project
_name='tempest-AttachInterfacesV270Test-906275440',owner_user_name='tempest-AttachInterfacesV270Test-906275440-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:08:33Z,user_data=None,user_id='247c672a9b2842c0800bda6dc0c82fcd',uuid=df519aaf-a62d-4f62-a5ab-3bd7f485b3b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "49930c33-a017-4f19-9070-b61a21d3819b", "address": "fa:16:3e:1d:30:8e", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49930c33-a0", "ovs_interfaceid": "49930c33-a017-4f19-9070-b61a21d3819b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.351 2 DEBUG nova.network.os_vif_util [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Converting VIF {"id": "49930c33-a017-4f19-9070-b61a21d3819b", "address": "fa:16:3e:1d:30:8e", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49930c33-a0", "ovs_interfaceid": "49930c33-a017-4f19-9070-b61a21d3819b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.352 2 DEBUG nova.network.os_vif_util [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:30:8e,bridge_name='br-int',has_traffic_filtering=True,id=49930c33-a017-4f19-9070-b61a21d3819b,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49930c33-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.352 2 DEBUG os_vif [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:30:8e,bridge_name='br-int',has_traffic_filtering=True,id=49930c33-a017-4f19-9070-b61a21d3819b,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49930c33-a0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.353 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap49930c33-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.357 2 INFO os_vif [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:30:8e,bridge_name='br-int',has_traffic_filtering=True,id=49930c33-a017-4f19-9070-b61a21d3819b,network=Network(200c8157-1b5e-41ff-b9b2-0e661c71b846),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49930c33-a0')#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.357 2 INFO nova.virt.libvirt.driver [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Deleting instance files /var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6_del#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.358 2 INFO nova.virt.libvirt.driver [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Deletion of /var/lib/nova/instances/df519aaf-a62d-4f62-a5ab-3bd7f485b3b6_del complete#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.433 2 INFO nova.compute.manager [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:08:45 np0005466012 neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846[225483]: [NOTICE]   (225487) : haproxy version is 2.8.14-c23fe91
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.434 2 DEBUG oslo.service.loopingcall [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.435 2 DEBUG nova.compute.manager [-] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:08:45 np0005466012 neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846[225483]: [NOTICE]   (225487) : path to executable is /usr/sbin/haproxy
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.435 2 DEBUG nova.network.neutron [-] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:08:45 np0005466012 neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846[225483]: [WARNING]  (225487) : Exiting Master process...
Oct  2 08:08:45 np0005466012 neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846[225483]: [ALERT]    (225487) : Current worker (225489) exited with code 143 (Terminated)
Oct  2 08:08:45 np0005466012 neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846[225483]: [WARNING]  (225487) : All workers exited. Exiting... (0)
Oct  2 08:08:45 np0005466012 systemd[1]: libpod-a34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4.scope: Deactivated successfully.
Oct  2 08:08:45 np0005466012 conmon[225483]: conmon a34663e4f664b6218652 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4.scope/container/memory.events
Oct  2 08:08:45 np0005466012 podman[225654]: 2025-10-02 12:08:45.445461715 +0000 UTC m=+0.049790815 container died a34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:08:45 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4-userdata-shm.mount: Deactivated successfully.
Oct  2 08:08:45 np0005466012 systemd[1]: var-lib-containers-storage-overlay-1c177b64f1fbbba8d352527d0ca8b0f2cf7ec1662ebffa698015548b4b246a70-merged.mount: Deactivated successfully.
Oct  2 08:08:45 np0005466012 podman[225654]: 2025-10-02 12:08:45.478879356 +0000 UTC m=+0.083208466 container cleanup a34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:08:45 np0005466012 systemd[1]: libpod-conmon-a34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4.scope: Deactivated successfully.
Oct  2 08:08:45 np0005466012 podman[225684]: 2025-10-02 12:08:45.544951179 +0000 UTC m=+0.045251883 container remove a34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.550 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef4190a-a457-4b60-bee2-5c3d4a31d015]: (4, ('Thu Oct  2 12:08:45 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846 (a34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4)\na34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4\nThu Oct  2 12:08:45 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846 (a34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4)\na34663e4f664b6218652814cb9bb1fa2911ef4fd5a1e5a2dd43ec9c3392bf8f4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.552 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a3cd6ceb-f140-4a6a-8315-3f603c20b4cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.553 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap200c8157-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 kernel: tap200c8157-10: left promiscuous mode
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 nova_compute[192063]: 2025-10-02 12:08:45.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.568 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec2000c-1710-4c63-91ca-c299d8ee8882]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.603 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[37a94769-fba5-497f-aaaf-8af228ce04b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.604 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9f76bfa9-7503-4191-858e-3f76d033c0db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.621 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6646bcd2-1683-4277-95ab-28a818fc289e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490589, 'reachable_time': 27740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225697, 'error': None, 'target': 'ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:45 np0005466012 systemd[1]: run-netns-ovnmeta\x2d200c8157\x2d1b5e\x2d41ff\x2db9b2\x2d0e661c71b846.mount: Deactivated successfully.
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.625 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-200c8157-1b5e-41ff-b9b2-0e661c71b846 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:08:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:08:45.625 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[becefaa7-02ae-4da7-a6a8-206ea3557054]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.020 2 DEBUG nova.compute.manager [req-42f5bbbf-2304-40b9-804d-a2f5f281cef3 req-c383d173-7d59-4ea7-97ea-0bc89a3fd0be 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received event network-vif-unplugged-02499d0d-7019-4582-bbf9-13838dde8102 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.021 2 DEBUG oslo_concurrency.lockutils [req-42f5bbbf-2304-40b9-804d-a2f5f281cef3 req-c383d173-7d59-4ea7-97ea-0bc89a3fd0be 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.022 2 DEBUG oslo_concurrency.lockutils [req-42f5bbbf-2304-40b9-804d-a2f5f281cef3 req-c383d173-7d59-4ea7-97ea-0bc89a3fd0be 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.022 2 DEBUG oslo_concurrency.lockutils [req-42f5bbbf-2304-40b9-804d-a2f5f281cef3 req-c383d173-7d59-4ea7-97ea-0bc89a3fd0be 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.023 2 DEBUG nova.compute.manager [req-42f5bbbf-2304-40b9-804d-a2f5f281cef3 req-c383d173-7d59-4ea7-97ea-0bc89a3fd0be 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] No waiting events found dispatching network-vif-unplugged-02499d0d-7019-4582-bbf9-13838dde8102 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.023 2 DEBUG nova.compute.manager [req-42f5bbbf-2304-40b9-804d-a2f5f281cef3 req-c383d173-7d59-4ea7-97ea-0bc89a3fd0be 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received event network-vif-unplugged-02499d0d-7019-4582-bbf9-13838dde8102 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.094 2 DEBUG nova.compute.manager [req-89bef2b6-f157-4679-a061-aeab248830f8 req-e4aa2a68-d13e-4b87-99ff-6d6d87bf8332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received event network-vif-plugged-49930c33-a017-4f19-9070-b61a21d3819b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.094 2 DEBUG oslo_concurrency.lockutils [req-89bef2b6-f157-4679-a061-aeab248830f8 req-e4aa2a68-d13e-4b87-99ff-6d6d87bf8332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.095 2 DEBUG oslo_concurrency.lockutils [req-89bef2b6-f157-4679-a061-aeab248830f8 req-e4aa2a68-d13e-4b87-99ff-6d6d87bf8332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.095 2 DEBUG oslo_concurrency.lockutils [req-89bef2b6-f157-4679-a061-aeab248830f8 req-e4aa2a68-d13e-4b87-99ff-6d6d87bf8332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.095 2 DEBUG nova.compute.manager [req-89bef2b6-f157-4679-a061-aeab248830f8 req-e4aa2a68-d13e-4b87-99ff-6d6d87bf8332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] No waiting events found dispatching network-vif-plugged-49930c33-a017-4f19-9070-b61a21d3819b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.096 2 WARNING nova.compute.manager [req-89bef2b6-f157-4679-a061-aeab248830f8 req-e4aa2a68-d13e-4b87-99ff-6d6d87bf8332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received unexpected event network-vif-plugged-49930c33-a017-4f19-9070-b61a21d3819b for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.096 2 DEBUG nova.compute.manager [req-89bef2b6-f157-4679-a061-aeab248830f8 req-e4aa2a68-d13e-4b87-99ff-6d6d87bf8332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received event network-vif-plugged-49930c33-a017-4f19-9070-b61a21d3819b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.096 2 DEBUG oslo_concurrency.lockutils [req-89bef2b6-f157-4679-a061-aeab248830f8 req-e4aa2a68-d13e-4b87-99ff-6d6d87bf8332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.097 2 DEBUG oslo_concurrency.lockutils [req-89bef2b6-f157-4679-a061-aeab248830f8 req-e4aa2a68-d13e-4b87-99ff-6d6d87bf8332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.097 2 DEBUG oslo_concurrency.lockutils [req-89bef2b6-f157-4679-a061-aeab248830f8 req-e4aa2a68-d13e-4b87-99ff-6d6d87bf8332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.097 2 DEBUG nova.compute.manager [req-89bef2b6-f157-4679-a061-aeab248830f8 req-e4aa2a68-d13e-4b87-99ff-6d6d87bf8332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] No waiting events found dispatching network-vif-plugged-49930c33-a017-4f19-9070-b61a21d3819b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.098 2 WARNING nova.compute.manager [req-89bef2b6-f157-4679-a061-aeab248830f8 req-e4aa2a68-d13e-4b87-99ff-6d6d87bf8332 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received unexpected event network-vif-plugged-49930c33-a017-4f19-9070-b61a21d3819b for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:08:46 np0005466012 podman[225698]: 2025-10-02 12:08:46.146082432 +0000 UTC m=+0.058850378 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2)
Oct  2 08:08:46 np0005466012 podman[225699]: 2025-10-02 12:08:46.16046315 +0000 UTC m=+0.067757299 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.242 2 DEBUG nova.network.neutron [req-b237d08b-1ad5-468c-82ec-719429cb21f8 req-d4a4ec7e-5aa8-4169-acc9-e86cacca9729 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Updated VIF entry in instance network info cache for port 49930c33-a017-4f19-9070-b61a21d3819b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.243 2 DEBUG nova.network.neutron [req-b237d08b-1ad5-468c-82ec-719429cb21f8 req-d4a4ec7e-5aa8-4169-acc9-e86cacca9729 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Updating instance_info_cache with network_info: [{"id": "02499d0d-7019-4582-bbf9-13838dde8102", "address": "fa:16:3e:0a:22:1f", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02499d0d-70", "ovs_interfaceid": "02499d0d-7019-4582-bbf9-13838dde8102", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "49930c33-a017-4f19-9070-b61a21d3819b", "address": "fa:16:3e:1d:30:8e", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49930c33-a0", "ovs_interfaceid": "49930c33-a017-4f19-9070-b61a21d3819b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.283 2 DEBUG oslo_concurrency.lockutils [req-b237d08b-1ad5-468c-82ec-719429cb21f8 req-d4a4ec7e-5aa8-4169-acc9-e86cacca9729 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.692 2 DEBUG nova.compute.manager [req-f5fcfdec-45b0-475f-a9a3-42b3aaa63e39 req-6a35ad39-8bc4-4417-b92b-85652332260d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received event network-vif-deleted-02499d0d-7019-4582-bbf9-13838dde8102 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.693 2 INFO nova.compute.manager [req-f5fcfdec-45b0-475f-a9a3-42b3aaa63e39 req-6a35ad39-8bc4-4417-b92b-85652332260d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Neutron deleted interface 02499d0d-7019-4582-bbf9-13838dde8102; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.693 2 DEBUG nova.network.neutron [req-f5fcfdec-45b0-475f-a9a3-42b3aaa63e39 req-6a35ad39-8bc4-4417-b92b-85652332260d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Updating instance_info_cache with network_info: [{"id": "49930c33-a017-4f19-9070-b61a21d3819b", "address": "fa:16:3e:1d:30:8e", "network": {"id": "200c8157-1b5e-41ff-b9b2-0e661c71b846", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1993434616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81fba2a656c54f78822b8aee58f6dd5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49930c33-a0", "ovs_interfaceid": "49930c33-a017-4f19-9070-b61a21d3819b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:46 np0005466012 nova_compute[192063]: 2025-10-02 12:08:46.726 2 DEBUG nova.compute.manager [req-f5fcfdec-45b0-475f-a9a3-42b3aaa63e39 req-6a35ad39-8bc4-4417-b92b-85652332260d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Detach interface failed, port_id=02499d0d-7019-4582-bbf9-13838dde8102, reason: Instance df519aaf-a62d-4f62-a5ab-3bd7f485b3b6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:08:47 np0005466012 nova_compute[192063]: 2025-10-02 12:08:47.041 2 DEBUG nova.network.neutron [-] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:08:47 np0005466012 nova_compute[192063]: 2025-10-02 12:08:47.059 2 INFO nova.compute.manager [-] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Took 1.62 seconds to deallocate network for instance.#033[00m
Oct  2 08:08:47 np0005466012 nova_compute[192063]: 2025-10-02 12:08:47.161 2 DEBUG oslo_concurrency.lockutils [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:47 np0005466012 nova_compute[192063]: 2025-10-02 12:08:47.161 2 DEBUG oslo_concurrency.lockutils [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:47 np0005466012 nova_compute[192063]: 2025-10-02 12:08:47.236 2 DEBUG nova.compute.provider_tree [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:08:47 np0005466012 nova_compute[192063]: 2025-10-02 12:08:47.253 2 DEBUG nova.scheduler.client.report [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:08:47 np0005466012 nova_compute[192063]: 2025-10-02 12:08:47.276 2 DEBUG oslo_concurrency.lockutils [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:47 np0005466012 nova_compute[192063]: 2025-10-02 12:08:47.327 2 INFO nova.scheduler.client.report [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Deleted allocations for instance df519aaf-a62d-4f62-a5ab-3bd7f485b3b6#033[00m
Oct  2 08:08:47 np0005466012 nova_compute[192063]: 2025-10-02 12:08:47.400 2 DEBUG oslo_concurrency.lockutils [None req-e9efd228-9726-4840-a517-c0b1f9c61cc8 247c672a9b2842c0800bda6dc0c82fcd 81fba2a656c54f78822b8aee58f6dd5b - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.385s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.126 2 DEBUG nova.compute.manager [req-9ce6f799-0d50-4903-a2c5-84d2b360c00d req-8030996d-9078-4de5-b735-e3ec2a028e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received event network-vif-plugged-02499d0d-7019-4582-bbf9-13838dde8102 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.126 2 DEBUG oslo_concurrency.lockutils [req-9ce6f799-0d50-4903-a2c5-84d2b360c00d req-8030996d-9078-4de5-b735-e3ec2a028e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.127 2 DEBUG oslo_concurrency.lockutils [req-9ce6f799-0d50-4903-a2c5-84d2b360c00d req-8030996d-9078-4de5-b735-e3ec2a028e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.127 2 DEBUG oslo_concurrency.lockutils [req-9ce6f799-0d50-4903-a2c5-84d2b360c00d req-8030996d-9078-4de5-b735-e3ec2a028e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.127 2 DEBUG nova.compute.manager [req-9ce6f799-0d50-4903-a2c5-84d2b360c00d req-8030996d-9078-4de5-b735-e3ec2a028e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] No waiting events found dispatching network-vif-plugged-02499d0d-7019-4582-bbf9-13838dde8102 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.128 2 WARNING nova.compute.manager [req-9ce6f799-0d50-4903-a2c5-84d2b360c00d req-8030996d-9078-4de5-b735-e3ec2a028e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received unexpected event network-vif-plugged-02499d0d-7019-4582-bbf9-13838dde8102 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.276 2 DEBUG nova.compute.manager [req-e5dec24a-f8e0-424d-b6c9-f3b9f94b40cb req-769e4191-b9f9-4de7-9357-1c925671ce45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received event network-vif-unplugged-49930c33-a017-4f19-9070-b61a21d3819b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.277 2 DEBUG oslo_concurrency.lockutils [req-e5dec24a-f8e0-424d-b6c9-f3b9f94b40cb req-769e4191-b9f9-4de7-9357-1c925671ce45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.278 2 DEBUG oslo_concurrency.lockutils [req-e5dec24a-f8e0-424d-b6c9-f3b9f94b40cb req-769e4191-b9f9-4de7-9357-1c925671ce45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.278 2 DEBUG oslo_concurrency.lockutils [req-e5dec24a-f8e0-424d-b6c9-f3b9f94b40cb req-769e4191-b9f9-4de7-9357-1c925671ce45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.279 2 DEBUG nova.compute.manager [req-e5dec24a-f8e0-424d-b6c9-f3b9f94b40cb req-769e4191-b9f9-4de7-9357-1c925671ce45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] No waiting events found dispatching network-vif-unplugged-49930c33-a017-4f19-9070-b61a21d3819b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.279 2 WARNING nova.compute.manager [req-e5dec24a-f8e0-424d-b6c9-f3b9f94b40cb req-769e4191-b9f9-4de7-9357-1c925671ce45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received unexpected event network-vif-unplugged-49930c33-a017-4f19-9070-b61a21d3819b for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.279 2 DEBUG nova.compute.manager [req-e5dec24a-f8e0-424d-b6c9-f3b9f94b40cb req-769e4191-b9f9-4de7-9357-1c925671ce45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received event network-vif-plugged-49930c33-a017-4f19-9070-b61a21d3819b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.280 2 DEBUG oslo_concurrency.lockutils [req-e5dec24a-f8e0-424d-b6c9-f3b9f94b40cb req-769e4191-b9f9-4de7-9357-1c925671ce45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.281 2 DEBUG oslo_concurrency.lockutils [req-e5dec24a-f8e0-424d-b6c9-f3b9f94b40cb req-769e4191-b9f9-4de7-9357-1c925671ce45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.281 2 DEBUG oslo_concurrency.lockutils [req-e5dec24a-f8e0-424d-b6c9-f3b9f94b40cb req-769e4191-b9f9-4de7-9357-1c925671ce45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "df519aaf-a62d-4f62-a5ab-3bd7f485b3b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.282 2 DEBUG nova.compute.manager [req-e5dec24a-f8e0-424d-b6c9-f3b9f94b40cb req-769e4191-b9f9-4de7-9357-1c925671ce45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] No waiting events found dispatching network-vif-plugged-49930c33-a017-4f19-9070-b61a21d3819b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.282 2 WARNING nova.compute.manager [req-e5dec24a-f8e0-424d-b6c9-f3b9f94b40cb req-769e4191-b9f9-4de7-9357-1c925671ce45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received unexpected event network-vif-plugged-49930c33-a017-4f19-9070-b61a21d3819b for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:08:48 np0005466012 nova_compute[192063]: 2025-10-02 12:08:48.815 2 DEBUG nova.compute.manager [req-c2af8b69-e510-4873-b819-cfb8a7009737 req-9aa58e83-ff80-4b36-ba9d-6df7bb24dbfe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Received event network-vif-deleted-49930c33-a017-4f19-9070-b61a21d3819b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:08:49 np0005466012 nova_compute[192063]: 2025-10-02 12:08:49.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:50 np0005466012 nova_compute[192063]: 2025-10-02 12:08:50.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:50 np0005466012 nova_compute[192063]: 2025-10-02 12:08:50.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:55 np0005466012 nova_compute[192063]: 2025-10-02 12:08:55.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:55 np0005466012 nova_compute[192063]: 2025-10-02 12:08:55.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.434 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.435 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.454 2 DEBUG nova.compute.manager [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.584 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.585 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.592 2 DEBUG nova.virt.hardware [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.592 2 INFO nova.compute.claims [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.714 2 DEBUG nova.compute.provider_tree [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.729 2 DEBUG nova.scheduler.client.report [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.749 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.750 2 DEBUG nova.compute.manager [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.816 2 DEBUG nova.compute.manager [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.817 2 DEBUG nova.network.neutron [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.839 2 INFO nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.863 2 DEBUG nova.compute.manager [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:08:58 np0005466012 nova_compute[192063]: 2025-10-02 12:08:58.998 2 DEBUG nova.compute.manager [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.000 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.000 2 INFO nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Creating image(s)#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.001 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.003 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.004 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.017 2 DEBUG oslo_concurrency.processutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.093 2 DEBUG nova.policy [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.118 2 DEBUG oslo_concurrency.processutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.119 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.119 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.129 2 DEBUG oslo_concurrency.processutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.209 2 DEBUG oslo_concurrency.processutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.210 2 DEBUG oslo_concurrency.processutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.256 2 DEBUG oslo_concurrency.processutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.258 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.258 2 DEBUG oslo_concurrency.processutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.323 2 DEBUG oslo_concurrency.processutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.325 2 DEBUG nova.virt.disk.api [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Checking if we can resize image /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.326 2 DEBUG oslo_concurrency.processutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.396 2 DEBUG oslo_concurrency.processutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.399 2 DEBUG nova.virt.disk.api [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Cannot resize image /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.400 2 DEBUG nova.objects.instance [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'migration_context' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.417 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.418 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Ensure instance console log exists: /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.419 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.419 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:08:59 np0005466012 nova_compute[192063]: 2025-10-02 12:08:59.420 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:00 np0005466012 nova_compute[192063]: 2025-10-02 12:09:00.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:00 np0005466012 nova_compute[192063]: 2025-10-02 12:09:00.305 2 DEBUG nova.network.neutron [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Successfully created port: 88b42333-8838-4199-ab64-5b879b907aa5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:09:00 np0005466012 nova_compute[192063]: 2025-10-02 12:09:00.325 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406925.3240623, df519aaf-a62d-4f62-a5ab-3bd7f485b3b6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:00 np0005466012 nova_compute[192063]: 2025-10-02 12:09:00.326 2 INFO nova.compute.manager [-] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:09:00 np0005466012 nova_compute[192063]: 2025-10-02 12:09:00.351 2 DEBUG nova.compute.manager [None req-23d9a7d7-dac9-44ff-a31b-3eba0460391e - - - - - -] [instance: df519aaf-a62d-4f62-a5ab-3bd7f485b3b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:00 np0005466012 nova_compute[192063]: 2025-10-02 12:09:00.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:01 np0005466012 podman[225756]: 2025-10-02 12:09:01.144551391 +0000 UTC m=+0.049158836 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:09:01 np0005466012 podman[225757]: 2025-10-02 12:09:01.227644453 +0000 UTC m=+0.125631770 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  2 08:09:01 np0005466012 nova_compute[192063]: 2025-10-02 12:09:01.923 2 DEBUG nova.network.neutron [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Successfully updated port: 88b42333-8838-4199-ab64-5b879b907aa5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:09:01 np0005466012 nova_compute[192063]: 2025-10-02 12:09:01.949 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:01 np0005466012 nova_compute[192063]: 2025-10-02 12:09:01.949 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquired lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:01 np0005466012 nova_compute[192063]: 2025-10-02 12:09:01.949 2 DEBUG nova.network.neutron [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:09:02 np0005466012 nova_compute[192063]: 2025-10-02 12:09:02.027 2 DEBUG nova.compute.manager [req-583a2719-4c23-4e2f-b3c6-0cd7ed2fd0b9 req-c44559fc-5d03-41b5-92f1-09de0a0c10b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-changed-88b42333-8838-4199-ab64-5b879b907aa5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:02 np0005466012 nova_compute[192063]: 2025-10-02 12:09:02.027 2 DEBUG nova.compute.manager [req-583a2719-4c23-4e2f-b3c6-0cd7ed2fd0b9 req-c44559fc-5d03-41b5-92f1-09de0a0c10b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Refreshing instance network info cache due to event network-changed-88b42333-8838-4199-ab64-5b879b907aa5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:02 np0005466012 nova_compute[192063]: 2025-10-02 12:09:02.027 2 DEBUG oslo_concurrency.lockutils [req-583a2719-4c23-4e2f-b3c6-0cd7ed2fd0b9 req-c44559fc-5d03-41b5-92f1-09de0a0c10b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:02 np0005466012 nova_compute[192063]: 2025-10-02 12:09:02.112 2 DEBUG nova.network.neutron [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:09:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:02.119 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:02.120 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:02.120 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:03 np0005466012 podman[225807]: 2025-10-02 12:09:03.143655853 +0000 UTC m=+0.060749770 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.525 2 DEBUG nova.network.neutron [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.554 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Releasing lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.555 2 DEBUG nova.compute.manager [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Instance network_info: |[{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.555 2 DEBUG oslo_concurrency.lockutils [req-583a2719-4c23-4e2f-b3c6-0cd7ed2fd0b9 req-c44559fc-5d03-41b5-92f1-09de0a0c10b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.555 2 DEBUG nova.network.neutron [req-583a2719-4c23-4e2f-b3c6-0cd7ed2fd0b9 req-c44559fc-5d03-41b5-92f1-09de0a0c10b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Refreshing network info cache for port 88b42333-8838-4199-ab64-5b879b907aa5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.559 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Start _get_guest_xml network_info=[{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.564 2 WARNING nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.569 2 DEBUG nova.virt.libvirt.host [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.570 2 DEBUG nova.virt.libvirt.host [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.579 2 DEBUG nova.virt.libvirt.host [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.580 2 DEBUG nova.virt.libvirt.host [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.581 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.581 2 DEBUG nova.virt.hardware [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.582 2 DEBUG nova.virt.hardware [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.582 2 DEBUG nova.virt.hardware [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.582 2 DEBUG nova.virt.hardware [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.583 2 DEBUG nova.virt.hardware [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.583 2 DEBUG nova.virt.hardware [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.583 2 DEBUG nova.virt.hardware [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.584 2 DEBUG nova.virt.hardware [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.584 2 DEBUG nova.virt.hardware [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.584 2 DEBUG nova.virt.hardware [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.584 2 DEBUG nova.virt.hardware [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.589 2 DEBUG nova.virt.libvirt.vif [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:08:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.589 2 DEBUG nova.network.os_vif_util [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.590 2 DEBUG nova.network.os_vif_util [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:e2:ea,bridge_name='br-int',has_traffic_filtering=True,id=88b42333-8838-4199-ab64-5b879b907aa5,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88b42333-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.591 2 DEBUG nova.objects.instance [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.611 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  <uuid>0adb6d19-d425-4600-9dd0-ca11095b3c59</uuid>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  <name>instance-0000002e</name>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <nova:name>tempest-AttachInterfacesTestJSON-server-1675134395</nova:name>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:09:03</nova:creationTime>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:09:03 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:        <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:        <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:        <nova:port uuid="88b42333-8838-4199-ab64-5b879b907aa5">
Oct  2 08:09:03 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <entry name="serial">0adb6d19-d425-4600-9dd0-ca11095b3c59</entry>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <entry name="uuid">0adb6d19-d425-4600-9dd0-ca11095b3c59</entry>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.config"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:b5:e2:ea"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <target dev="tap88b42333-88"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/console.log" append="off"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:09:03 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:09:03 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:09:03 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:09:03 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.612 2 DEBUG nova.compute.manager [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Preparing to wait for external event network-vif-plugged-88b42333-8838-4199-ab64-5b879b907aa5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.612 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.613 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.613 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.614 2 DEBUG nova.virt.libvirt.vif [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:08:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.614 2 DEBUG nova.network.os_vif_util [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.615 2 DEBUG nova.network.os_vif_util [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:e2:ea,bridge_name='br-int',has_traffic_filtering=True,id=88b42333-8838-4199-ab64-5b879b907aa5,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88b42333-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.615 2 DEBUG os_vif [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:e2:ea,bridge_name='br-int',has_traffic_filtering=True,id=88b42333-8838-4199-ab64-5b879b907aa5,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88b42333-88') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.616 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.617 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.621 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88b42333-88, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.621 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap88b42333-88, col_values=(('external_ids', {'iface-id': '88b42333-8838-4199-ab64-5b879b907aa5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b5:e2:ea', 'vm-uuid': '0adb6d19-d425-4600-9dd0-ca11095b3c59'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:03 np0005466012 NetworkManager[51207]: <info>  [1759406943.6250] manager: (tap88b42333-88): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.630 2 INFO os_vif [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:e2:ea,bridge_name='br-int',has_traffic_filtering=True,id=88b42333-8838-4199-ab64-5b879b907aa5,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88b42333-88')#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.682 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.682 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.683 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:b5:e2:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:03 np0005466012 nova_compute[192063]: 2025-10-02 12:09:03.683 2 INFO nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Using config drive#033[00m
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.192 2 INFO nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Creating config drive at /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.config#033[00m
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.198 2 DEBUG oslo_concurrency.processutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp05zhvrrm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.336 2 DEBUG oslo_concurrency.processutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp05zhvrrm" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:04 np0005466012 kernel: tap88b42333-88: entered promiscuous mode
Oct  2 08:09:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:04Z|00155|binding|INFO|Claiming lport 88b42333-8838-4199-ab64-5b879b907aa5 for this chassis.
Oct  2 08:09:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:04Z|00156|binding|INFO|88b42333-8838-4199-ab64-5b879b907aa5: Claiming fa:16:3e:b5:e2:ea 10.100.0.10
Oct  2 08:09:04 np0005466012 NetworkManager[51207]: <info>  [1759406944.4143] manager: (tap88b42333-88): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.427 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:e2:ea 10.100.0.10'], port_security=['fa:16:3e:b5:e2:ea 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b489bfac-287c-4bcc-881b-f0347b4a08b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=88b42333-8838-4199-ab64-5b879b907aa5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.429 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 88b42333-8838-4199-ab64-5b879b907aa5 in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 bound to our chassis#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.430 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d845a33-56e0-4850-9f27-8a54095796f2#033[00m
Oct  2 08:09:04 np0005466012 systemd-udevd[225846]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:09:04 np0005466012 systemd-machined[152114]: New machine qemu-22-instance-0000002e.
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.447 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6e7ca00f-f93c-4b48-8dd5-8060057cd0f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.448 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7d845a33-51 in ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.450 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7d845a33-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.450 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[51e5154d-6800-4fc7-8b7a-c28d66403452]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.451 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9d04d59b-3233-4db4-a50d-f4a335385787]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 NetworkManager[51207]: <info>  [1759406944.4602] device (tap88b42333-88): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:09:04 np0005466012 NetworkManager[51207]: <info>  [1759406944.4617] device (tap88b42333-88): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.463 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[4a34cc43-0374-4034-9624-1cc5c75535f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:04Z|00157|binding|INFO|Setting lport 88b42333-8838-4199-ab64-5b879b907aa5 ovn-installed in OVS
Oct  2 08:09:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:04Z|00158|binding|INFO|Setting lport 88b42333-8838-4199-ab64-5b879b907aa5 up in Southbound
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:04 np0005466012 systemd[1]: Started Virtual Machine qemu-22-instance-0000002e.
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.487 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2a0ace3f-d1d9-4a17-a0f8-b8f23cb49f11]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.511 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[85fd06f9-ce5b-41a9-af38-a1e29ef9c835]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.516 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8230014a-52ae-4381-83d4-1b911e7a486e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 systemd-udevd[225849]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:09:04 np0005466012 NetworkManager[51207]: <info>  [1759406944.5216] manager: (tap7d845a33-50): new Veth device (/org/freedesktop/NetworkManager/Devices/71)
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.573 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[9d900fad-ac70-49b7-a1d2-58a4abaef359]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.575 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[90ce6f1b-0b4e-43ff-8f01-5aa548e32a15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 NetworkManager[51207]: <info>  [1759406944.6000] device (tap7d845a33-50): carrier: link connected
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.607 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ad2481e5-7c59-4da8-824e-54d2adb9b337]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.628 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a1353650-9224-4d08-850f-da6f1bcb6d8c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493821, 'reachable_time': 17654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225878, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.644 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[08d9bdc5-41fd-4935-8d48-fd9fc25d1317]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:9016'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493821, 'tstamp': 493821}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225879, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.662 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a5ab1444-09dc-406b-87b4-2708def04c20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 196, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 196, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493821, 'reachable_time': 17654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225881, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.709 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[11c92797-d778-4da1-8ef1-0b6d9c776404]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.788 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6a57455f-4a4c-455a-af20-afb57d687336]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.789 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.789 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.790 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d845a33-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:04 np0005466012 NetworkManager[51207]: <info>  [1759406944.7919] manager: (tap7d845a33-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Oct  2 08:09:04 np0005466012 kernel: tap7d845a33-50: entered promiscuous mode
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.797 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d845a33-50, col_values=(('external_ids', {'iface-id': '1c321c19-d630-4a6f-8ba8-7bac90af9bae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:04Z|00159|binding|INFO|Releasing lport 1c321c19-d630-4a6f-8ba8-7bac90af9bae from this chassis (sb_readonly=0)
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.799 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7d845a33-56e0-4850-9f27-8a54095796f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7d845a33-56e0-4850-9f27-8a54095796f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.800 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[eb0dd032-89bb-4c64-bc6e-5bd0767c660c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.801 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-7d845a33-56e0-4850-9f27-8a54095796f2
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/7d845a33-56e0-4850-9f27-8a54095796f2.pid.haproxy
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 7d845a33-56e0-4850-9f27-8a54095796f2
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:09:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:04.803 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'env', 'PROCESS_TAG=haproxy-7d845a33-56e0-4850-9f27-8a54095796f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7d845a33-56e0-4850-9f27-8a54095796f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.927 2 DEBUG nova.compute.manager [req-c8a8efed-f4e8-47bc-8de5-52edfb67d6c6 req-9ba79813-4177-4fe7-9f9e-dfc9422d5786 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-plugged-88b42333-8838-4199-ab64-5b879b907aa5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.928 2 DEBUG oslo_concurrency.lockutils [req-c8a8efed-f4e8-47bc-8de5-52edfb67d6c6 req-9ba79813-4177-4fe7-9f9e-dfc9422d5786 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.928 2 DEBUG oslo_concurrency.lockutils [req-c8a8efed-f4e8-47bc-8de5-52edfb67d6c6 req-9ba79813-4177-4fe7-9f9e-dfc9422d5786 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.929 2 DEBUG oslo_concurrency.lockutils [req-c8a8efed-f4e8-47bc-8de5-52edfb67d6c6 req-9ba79813-4177-4fe7-9f9e-dfc9422d5786 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:04 np0005466012 nova_compute[192063]: 2025-10-02 12:09:04.929 2 DEBUG nova.compute.manager [req-c8a8efed-f4e8-47bc-8de5-52edfb67d6c6 req-9ba79813-4177-4fe7-9f9e-dfc9422d5786 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Processing event network-vif-plugged-88b42333-8838-4199-ab64-5b879b907aa5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.023 2 DEBUG nova.network.neutron [req-583a2719-4c23-4e2f-b3c6-0cd7ed2fd0b9 req-c44559fc-5d03-41b5-92f1-09de0a0c10b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updated VIF entry in instance network info cache for port 88b42333-8838-4199-ab64-5b879b907aa5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.024 2 DEBUG nova.network.neutron [req-583a2719-4c23-4e2f-b3c6-0cd7ed2fd0b9 req-c44559fc-5d03-41b5-92f1-09de0a0c10b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.048 2 DEBUG oslo_concurrency.lockutils [req-583a2719-4c23-4e2f-b3c6-0cd7ed2fd0b9 req-c44559fc-5d03-41b5-92f1-09de0a0c10b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:05 np0005466012 podman[225919]: 2025-10-02 12:09:05.186494884 +0000 UTC m=+0.060856082 container create efe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.205 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406945.20474, 0adb6d19-d425-4600-9dd0-ca11095b3c59 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.205 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] VM Started (Lifecycle Event)#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.208 2 DEBUG nova.compute.manager [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.214 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.220 2 INFO nova.virt.libvirt.driver [-] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Instance spawned successfully.#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.220 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.228 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.233 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:09:05 np0005466012 systemd[1]: Started libpod-conmon-efe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757.scope.
Oct  2 08:09:05 np0005466012 podman[225919]: 2025-10-02 12:09:05.15111138 +0000 UTC m=+0.025472608 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.244 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.245 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.245 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.246 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.246 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.247 2 DEBUG nova.virt.libvirt.driver [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.264 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.264 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406945.207747, 0adb6d19-d425-4600-9dd0-ca11095b3c59 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.264 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:09:05 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:09:05 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7dcb56641f67c59054348ab1508cb4642b6521b7bb43d114c4d6c83e0761d7ea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.294 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.298 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406945.2128427, 0adb6d19-d425-4600-9dd0-ca11095b3c59 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.299 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:09:05 np0005466012 podman[225919]: 2025-10-02 12:09:05.306475211 +0000 UTC m=+0.180836409 container init efe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:09:05 np0005466012 podman[225919]: 2025-10-02 12:09:05.31162068 +0000 UTC m=+0.185981878 container start efe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.323 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:05 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[225935]: [NOTICE]   (225939) : New worker (225941) forked
Oct  2 08:09:05 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[225935]: [NOTICE]   (225939) : Loading success.
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.341 2 INFO nova.compute.manager [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Took 6.34 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.341 2 DEBUG nova.compute.manager [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.342 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.372 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.422 2 INFO nova.compute.manager [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Took 6.88 seconds to build instance.#033[00m
Oct  2 08:09:05 np0005466012 nova_compute[192063]: 2025-10-02 12:09:05.441 2 DEBUG oslo_concurrency.lockutils [None req-0de3c38c-5cb3-4f38-9e32-7e59742b9481 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:07 np0005466012 nova_compute[192063]: 2025-10-02 12:09:07.039 2 DEBUG nova.compute.manager [req-45352249-61cb-4f5c-a942-e93d0a0a0914 req-605d970b-b367-44d3-86bd-cf96ab89d528 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-plugged-88b42333-8838-4199-ab64-5b879b907aa5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:07 np0005466012 nova_compute[192063]: 2025-10-02 12:09:07.039 2 DEBUG oslo_concurrency.lockutils [req-45352249-61cb-4f5c-a942-e93d0a0a0914 req-605d970b-b367-44d3-86bd-cf96ab89d528 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:07 np0005466012 nova_compute[192063]: 2025-10-02 12:09:07.039 2 DEBUG oslo_concurrency.lockutils [req-45352249-61cb-4f5c-a942-e93d0a0a0914 req-605d970b-b367-44d3-86bd-cf96ab89d528 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:07 np0005466012 nova_compute[192063]: 2025-10-02 12:09:07.040 2 DEBUG oslo_concurrency.lockutils [req-45352249-61cb-4f5c-a942-e93d0a0a0914 req-605d970b-b367-44d3-86bd-cf96ab89d528 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:07 np0005466012 nova_compute[192063]: 2025-10-02 12:09:07.040 2 DEBUG nova.compute.manager [req-45352249-61cb-4f5c-a942-e93d0a0a0914 req-605d970b-b367-44d3-86bd-cf96ab89d528 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] No waiting events found dispatching network-vif-plugged-88b42333-8838-4199-ab64-5b879b907aa5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:07 np0005466012 nova_compute[192063]: 2025-10-02 12:09:07.040 2 WARNING nova.compute.manager [req-45352249-61cb-4f5c-a942-e93d0a0a0914 req-605d970b-b367-44d3-86bd-cf96ab89d528 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received unexpected event network-vif-plugged-88b42333-8838-4199-ab64-5b879b907aa5 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:09:07 np0005466012 podman[225950]: 2025-10-02 12:09:07.137225581 +0000 UTC m=+0.055787856 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:09:08 np0005466012 nova_compute[192063]: 2025-10-02 12:09:08.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:08 np0005466012 NetworkManager[51207]: <info>  [1759406948.2261] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/73)
Oct  2 08:09:08 np0005466012 NetworkManager[51207]: <info>  [1759406948.2267] device (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 08:09:08 np0005466012 NetworkManager[51207]: <info>  [1759406948.2276] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/74)
Oct  2 08:09:08 np0005466012 NetworkManager[51207]: <info>  [1759406948.2278] device (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  2 08:09:08 np0005466012 NetworkManager[51207]: <info>  [1759406948.2283] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Oct  2 08:09:08 np0005466012 NetworkManager[51207]: <info>  [1759406948.2287] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Oct  2 08:09:08 np0005466012 NetworkManager[51207]: <info>  [1759406948.2290] device (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  2 08:09:08 np0005466012 NetworkManager[51207]: <info>  [1759406948.2291] device (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  2 08:09:08 np0005466012 nova_compute[192063]: 2025-10-02 12:09:08.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:08Z|00160|binding|INFO|Releasing lport 1c321c19-d630-4a6f-8ba8-7bac90af9bae from this chassis (sb_readonly=0)
Oct  2 08:09:08 np0005466012 nova_compute[192063]: 2025-10-02 12:09:08.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:08 np0005466012 nova_compute[192063]: 2025-10-02 12:09:08.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:08 np0005466012 nova_compute[192063]: 2025-10-02 12:09:08.774 2 DEBUG nova.compute.manager [req-eee20d4a-7d57-4922-9664-d3d86e46a144 req-68fa8529-db79-4a97-8d54-b0c8f7a37aa9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-changed-88b42333-8838-4199-ab64-5b879b907aa5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:08 np0005466012 nova_compute[192063]: 2025-10-02 12:09:08.775 2 DEBUG nova.compute.manager [req-eee20d4a-7d57-4922-9664-d3d86e46a144 req-68fa8529-db79-4a97-8d54-b0c8f7a37aa9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Refreshing instance network info cache due to event network-changed-88b42333-8838-4199-ab64-5b879b907aa5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:08 np0005466012 nova_compute[192063]: 2025-10-02 12:09:08.775 2 DEBUG oslo_concurrency.lockutils [req-eee20d4a-7d57-4922-9664-d3d86e46a144 req-68fa8529-db79-4a97-8d54-b0c8f7a37aa9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:08 np0005466012 nova_compute[192063]: 2025-10-02 12:09:08.775 2 DEBUG oslo_concurrency.lockutils [req-eee20d4a-7d57-4922-9664-d3d86e46a144 req-68fa8529-db79-4a97-8d54-b0c8f7a37aa9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:08 np0005466012 nova_compute[192063]: 2025-10-02 12:09:08.775 2 DEBUG nova.network.neutron [req-eee20d4a-7d57-4922-9664-d3d86e46a144 req-68fa8529-db79-4a97-8d54-b0c8f7a37aa9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Refreshing network info cache for port 88b42333-8838-4199-ab64-5b879b907aa5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:10 np0005466012 nova_compute[192063]: 2025-10-02 12:09:10.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:11 np0005466012 nova_compute[192063]: 2025-10-02 12:09:11.079 2 DEBUG nova.network.neutron [req-eee20d4a-7d57-4922-9664-d3d86e46a144 req-68fa8529-db79-4a97-8d54-b0c8f7a37aa9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updated VIF entry in instance network info cache for port 88b42333-8838-4199-ab64-5b879b907aa5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:11 np0005466012 nova_compute[192063]: 2025-10-02 12:09:11.080 2 DEBUG nova.network.neutron [req-eee20d4a-7d57-4922-9664-d3d86e46a144 req-68fa8529-db79-4a97-8d54-b0c8f7a37aa9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:11 np0005466012 nova_compute[192063]: 2025-10-02 12:09:11.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:11 np0005466012 nova_compute[192063]: 2025-10-02 12:09:11.217 2 DEBUG oslo_concurrency.lockutils [req-eee20d4a-7d57-4922-9664-d3d86e46a144 req-68fa8529-db79-4a97-8d54-b0c8f7a37aa9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:13 np0005466012 nova_compute[192063]: 2025-10-02 12:09:13.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:13 np0005466012 nova_compute[192063]: 2025-10-02 12:09:13.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:14 np0005466012 podman[225970]: 2025-10-02 12:09:14.155504614 +0000 UTC m=+0.065646732 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:09:14 np0005466012 podman[225971]: 2025-10-02 12:09:14.172459182 +0000 UTC m=+0.083231627 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350)
Oct  2 08:09:14 np0005466012 nova_compute[192063]: 2025-10-02 12:09:14.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:14 np0005466012 nova_compute[192063]: 2025-10-02 12:09:14.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:14 np0005466012 nova_compute[192063]: 2025-10-02 12:09:14.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:15 np0005466012 nova_compute[192063]: 2025-10-02 12:09:15.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:15 np0005466012 nova_compute[192063]: 2025-10-02 12:09:15.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:15 np0005466012 nova_compute[192063]: 2025-10-02 12:09:15.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:15 np0005466012 nova_compute[192063]: 2025-10-02 12:09:15.858 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:15 np0005466012 nova_compute[192063]: 2025-10-02 12:09:15.859 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:15 np0005466012 nova_compute[192063]: 2025-10-02 12:09:15.859 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:15 np0005466012 nova_compute[192063]: 2025-10-02 12:09:15.859 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:09:15 np0005466012 nova_compute[192063]: 2025-10-02 12:09:15.924 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:15 np0005466012 nova_compute[192063]: 2025-10-02 12:09:15.984 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:15 np0005466012 nova_compute[192063]: 2025-10-02 12:09:15.985 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:16 np0005466012 nova_compute[192063]: 2025-10-02 12:09:16.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:16.031 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:16.032 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:09:16 np0005466012 nova_compute[192063]: 2025-10-02 12:09:16.041 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:16 np0005466012 nova_compute[192063]: 2025-10-02 12:09:16.175 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:09:16 np0005466012 nova_compute[192063]: 2025-10-02 12:09:16.177 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5563MB free_disk=73.42718887329102GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:09:16 np0005466012 nova_compute[192063]: 2025-10-02 12:09:16.177 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:16 np0005466012 nova_compute[192063]: 2025-10-02 12:09:16.178 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:16 np0005466012 nova_compute[192063]: 2025-10-02 12:09:16.251 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 0adb6d19-d425-4600-9dd0-ca11095b3c59 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:09:16 np0005466012 nova_compute[192063]: 2025-10-02 12:09:16.251 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:09:16 np0005466012 nova_compute[192063]: 2025-10-02 12:09:16.252 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:09:16 np0005466012 nova_compute[192063]: 2025-10-02 12:09:16.296 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:09:16 np0005466012 nova_compute[192063]: 2025-10-02 12:09:16.313 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:09:16 np0005466012 nova_compute[192063]: 2025-10-02 12:09:16.337 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:09:16 np0005466012 nova_compute[192063]: 2025-10-02 12:09:16.338 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.920 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002e', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ef4e3be787374d90a6a236c7f76bd940', 'user_id': 'fbc7616089cb4f78832692487019c83d', 'hostId': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.936 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.read.bytes volume: 23816192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.936 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.read.bytes volume: 34994 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b93e674-753c-4218-97e8-1a8e692ab471', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23816192, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-vda', 'timestamp': '2025-10-02T12:09:16.920551', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9e572516-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.596904457, 'message_signature': '99410f8e9228e522129c1356839658bc2b1b5f4fad4051c0b764e5fd791c0cb4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 34994, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-sda', 'timestamp': '2025-10-02T12:09:16.920551', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9e572fac-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.596904457, 'message_signature': '70c33c11ae00fea31fde36cbb3dfdf2c65e45aa99e9caacb74553938b8e7a0a1'}]}, 'timestamp': '2025-10-02 12:09:16.936905', '_unique_id': '6981a8a1cdaf4ec39d4829b073891d71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.937 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.940 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 0adb6d19-d425-4600-9dd0-ca11095b3c59 / tap88b42333-88 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.941 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f602e17-f192-469e-9d86-12a828bab345', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-0000002e-0adb6d19-d425-4600-9dd0-ca11095b3c59-tap88b42333-88', 'timestamp': '2025-10-02T12:09:16.938523', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'tap88b42333-88', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:e2:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88b42333-88'}, 'message_id': '9e57e4ba-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.614874371, 'message_signature': 'ebb2568327c7a75202a619bbb29c83be1b685e1e2482ca0f0208e2c2c52f78d6'}]}, 'timestamp': '2025-10-02 12:09:16.941650', '_unique_id': 'd95877518aa94a6f8c7d95ab65666445'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.942 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.943 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.953 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.953 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d383fae-866b-4748-8093-891807512b04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-vda', 'timestamp': '2025-10-02T12:09:16.943462', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9e59ba6a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.619863136, 'message_signature': '3f08ea094f6ffdfac89095aa4ed6481131e7650d61636326619a8bbeccffac06'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 
'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-sda', 'timestamp': '2025-10-02T12:09:16.943462', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9e59c69a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.619863136, 'message_signature': '4b70888adc8af267363d68a024f605cd2a2c29c0a494f29ac3c6bd250eccc75e'}]}, 'timestamp': '2025-10-02 12:09:16.953889', '_unique_id': 'fbf32aa230b749bbaa5a03eff6b0f0dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.954 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.955 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.955 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.956 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8554c2c-f59c-4310-ad6e-d0e5e4c97954', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-vda', 'timestamp': '2025-10-02T12:09:16.955794', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9e5a1be0-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.619863136, 'message_signature': 'fd77353f92c7a61a11bdb508128d02e56760e89ded174b997085cde4115940e8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 
'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-sda', 'timestamp': '2025-10-02T12:09:16.955794', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9e5a266c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.619863136, 'message_signature': 'df7fb78fe6f72d09053800d877ebf236738793ef16f24bb38d5c2e8bb32c3193'}]}, 'timestamp': '2025-10-02 12:09:16.956352', '_unique_id': '5039471d542444549c7d4de197a2ff38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.957 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba6565d7-fd32-4fac-8b53-4d262e4ec60f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-0000002e-0adb6d19-d425-4600-9dd0-ca11095b3c59-tap88b42333-88', 'timestamp': '2025-10-02T12:09:16.957911', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'tap88b42333-88', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:e2:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88b42333-88'}, 'message_id': '9e5a6dca-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.614874371, 'message_signature': 'd6ebe9fd43e7499fbc7bdd1b126d433e40fb8b4434354958b8e1aa1e9d10db33'}]}, 'timestamp': '2025-10-02 12:09:16.958163', '_unique_id': 'f1c80f2d758448b2ad99881803279e84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.958 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.959 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.959 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51715329-4815-46a3-947c-9a3ff2ae063d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-0000002e-0adb6d19-d425-4600-9dd0-ca11095b3c59-tap88b42333-88', 'timestamp': '2025-10-02T12:09:16.959348', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'tap88b42333-88', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:e2:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88b42333-88'}, 'message_id': '9e5aa65a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.614874371, 'message_signature': 'c056c877e6d38e1dd6150fc9ef955a2bfd5d31296d52e5f5224f8872bfecb3be'}]}, 'timestamp': '2025-10-02 12:09:16.959609', '_unique_id': '1a17dcce52144d3e924779d78d5df2a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.960 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'edbc4a26-9448-407e-a48f-0fd0adecc917', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-vda', 'timestamp': '2025-10-02T12:09:16.960857', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9e5ae16a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.619863136, 'message_signature': '9ecf732f96967e6272cb4b73f6ad9040d599b2e63f9df3cf55b84596306261da'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-sda', 'timestamp': '2025-10-02T12:09:16.960857', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9e5aeba6-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.619863136, 'message_signature': '4cf066b5b1a7fcdbd55361b5eb0036c7f115ad5257d434de09626cc8e90e7e15'}]}, 'timestamp': '2025-10-02 12:09:16.961374', '_unique_id': 'c7fdbf60626944788b05a521800bdbb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.961 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.962 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.962 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1675134395>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1675134395>]
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b79acf8b-e6d8-4726-b38a-65d34842de0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-0000002e-0adb6d19-d425-4600-9dd0-ca11095b3c59-tap88b42333-88', 'timestamp': '2025-10-02T12:09:16.963020', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'tap88b42333-88', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:e2:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88b42333-88'}, 'message_id': '9e5b3610-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.614874371, 'message_signature': '14a86bf04e836d1be8c9c9a2eb78583090ec2aac656ad9c2d9c860590bcb9a0f'}]}, 'timestamp': '2025-10-02 12:09:16.963306', '_unique_id': '2439ae4055f64defbe667715e2635fdc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.963 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.964 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.964 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.964 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11e96dc4-0bcc-49ad-a67b-2a7cf8dbca99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-vda', 'timestamp': '2025-10-02T12:09:16.964524', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9e5b7076-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.596904457, 'message_signature': '4d4cdbc0e68a61b7594b435b3edb90094df61d3285e8984ea0937e7f88fbe502'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-sda', 'timestamp': '2025-10-02T12:09:16.964524', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9e5b7ada-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.596904457, 'message_signature': '72641071c11de14a150fdacb3a757d11af4eb2d9e16e1c6d98b63201ba0538ad'}]}, 'timestamp': '2025-10-02 12:09:16.965037', '_unique_id': '7f04590870c2477cbd88f1ac77a6d928'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.965 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.966 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.966 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.read.requests volume: 770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.966 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.read.requests volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1398db4-0b0c-4963-afa6-8f0599b4f136', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 770, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-vda', 'timestamp': '2025-10-02T12:09:16.966280', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9e5bb3e2-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.596904457, 'message_signature': 'd2be08725890444b27ae85272c2db5799f72b42093090cd1430caedaa34233ac'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 15, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-sda', 'timestamp': '2025-10-02T12:09:16.966280', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9e5bbc0c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.596904457, 'message_signature': 'be911f74cade667a950ced820cde84c4bc50a74066c4cb8f1b6a23bb413d1138'}]}, 'timestamp': '2025-10-02 12:09:16.966732', '_unique_id': '9b20e9f519144078a38737cc7119646f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.967 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9209d97-c781-4a99-9af4-de07ad76330b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-vda', 'timestamp': '2025-10-02T12:09:16.967867', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9e5bf1d6-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.596904457, 'message_signature': 'b81457efca5d0d77d0080f7596895fdd9749f0f99f2e0917c8e457c5eecfa598'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-sda', 'timestamp': '2025-10-02T12:09:16.967867', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9e5bf99c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.596904457, 'message_signature': 'b002aad0970a17e9f7a93df41e27dac945f10b7baf2d750b86dfb9f2813227df'}]}, 'timestamp': '2025-10-02 12:09:16.968299', '_unique_id': 'a1c9df47c037400daae15649950fa235'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.968 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.969 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.969 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.read.latency volume: 1140954924 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.969 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.read.latency volume: 11360815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ab4ec45-b412-4f4a-ac24-a50a6cf617fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1140954924, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-vda', 'timestamp': '2025-10-02T12:09:16.969469', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9e5c31fa-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.596904457, 'message_signature': 'd116497ffaa910ac338c9b122abc836ac840d4b20ae9c227b9870630745ceac3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11360815, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-sda', 'timestamp': '2025-10-02T12:09:16.969469', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9e5c3cae-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.596904457, 'message_signature': '906a98141ab83dfde13091fe7018961958cfac18d5e942c3357f63e9b0c52d14'}]}, 'timestamp': '2025-10-02 12:09:16.969994', '_unique_id': 'd291b83cced745f9b0c230357c6236cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.970 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc9cc95e-4e02-46e2-a69f-6299daa1e635', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-0000002e-0adb6d19-d425-4600-9dd0-ca11095b3c59-tap88b42333-88', 'timestamp': '2025-10-02T12:09:16.971058', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'tap88b42333-88', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:e2:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88b42333-88'}, 'message_id': '9e5c6eea-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.614874371, 'message_signature': '0eb452c27d36f57ea3260b841876794d5bf0235ea6535dece0be683bcdf78d73'}]}, 'timestamp': '2025-10-02 12:09:16.971291', '_unique_id': 'ec879b8624094be883cc1a2708c9b011'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.971 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.972 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.972 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.972 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1675134395>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1675134395>]
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.972 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.972 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '823ff700-88a1-42fd-8001-30ef0ab2760b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-0000002e-0adb6d19-d425-4600-9dd0-ca11095b3c59-tap88b42333-88', 'timestamp': '2025-10-02T12:09:16.972617', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'tap88b42333-88', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:e2:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88b42333-88'}, 'message_id': '9e5cad10-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.614874371, 'message_signature': 'fc5586236218a23a978e797f58e1a2f916812f589163da2125924baf16996254'}]}, 'timestamp': '2025-10-02 12:09:16.972884', '_unique_id': 'fdc105e10f9d4ce1984d91f813cbf842'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.973 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.989 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/cpu volume: 11120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73369f57-e445-4a2b-9326-048fa047d8f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11120000000, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'timestamp': '2025-10-02T12:09:16.974001', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9e5f3b16-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.665522977, 'message_signature': 'ff118515b92f45a595ba38e53f3f519a9a2bbb141671d566c23c836bbd6c20e8'}]}, 'timestamp': '2025-10-02 12:09:16.989668', '_unique_id': 'cbc63d02a47949fb9c37b442a33b2ea2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.990 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.991 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.991 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0946436a-9e82-486d-9443-57d08dee9885', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-0000002e-0adb6d19-d425-4600-9dd0-ca11095b3c59-tap88b42333-88', 'timestamp': '2025-10-02T12:09:16.991313', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'tap88b42333-88', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:e2:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88b42333-88'}, 'message_id': '9e5f874c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.614874371, 'message_signature': 'af61050fa5ae946f9226292ab031d6625429d4d0e2d41202b791b17825cfc39e'}]}, 'timestamp': '2025-10-02 12:09:16.991599', '_unique_id': '740224a842874cb7a0dae8950858a653'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.992 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcfd7907-e6f6-466f-84eb-aaf39535bc01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-0000002e-0adb6d19-d425-4600-9dd0-ca11095b3c59-tap88b42333-88', 'timestamp': '2025-10-02T12:09:16.992771', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'tap88b42333-88', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:e2:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88b42333-88'}, 'message_id': '9e5fc036-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.614874371, 'message_signature': 'b4ed9b5308e1429cfd6efe91ee8f54bccce044b749ce9d49fc681de13e694801'}]}, 'timestamp': '2025-10-02 12:09:16.993035', '_unique_id': '46f25494465c4feaa370f72683526a00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.993 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/memory.usage volume: 40.421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32104898-6915-4c05-a5b5-b852d9142f85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.421875, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'timestamp': '2025-10-02T12:09:16.994259', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '9e5ff8b2-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.665522977, 'message_signature': 'b3dbf7014a45cdf1ab299d133153f1a9ed1e9a1f001c885938455854fb17baa6'}]}, 'timestamp': '2025-10-02 12:09:16.994470', '_unique_id': '9eccee2593b245288b542bcbafd883ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.994 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.995 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.995 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.995 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1675134395>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1675134395>]
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.995 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.996 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.996 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1675134395>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1675134395>]
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.996 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b0d97e0-0857-4940-9a51-06522e247b36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-0000002e-0adb6d19-d425-4600-9dd0-ca11095b3c59-tap88b42333-88', 'timestamp': '2025-10-02T12:09:16.996373', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'tap88b42333-88', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:e2:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88b42333-88'}, 'message_id': '9e604b8c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.614874371, 'message_signature': 'c594e5877bbf30d67f2939717b4e0ecc7d0807d4d9c611cac3bfd04c6f2fe878'}]}, 'timestamp': '2025-10-02 12:09:16.996620', '_unique_id': '13a68c95b9054ccabaf904bdaa0b2ff4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.997 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b484fc30-868d-4325-a0e1-2cf46964582b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-vda', 'timestamp': '2025-10-02T12:09:16.997983', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9e608b10-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.596904457, 'message_signature': '194c3cb3ea8f4be4af3c85af70d71e89b0c4c87616d562f23b5ab2b5ba29b79b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59-sda', 'timestamp': '2025-10-02T12:09:16.997983', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'instance-0000002e', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9e6094d4-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.596904457, 'message_signature': '43ac8ac361cfc22a90b2db8f7754b13c00f55ac741ced6d2095ba8997bdcd701'}]}, 'timestamp': '2025-10-02 12:09:16.998500', '_unique_id': '16f7d8ec2a974a338a50b2bc5376d896'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.998 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.999 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:16.999 12 DEBUG ceilometer.compute.pollsters [-] 0adb6d19-d425-4600-9dd0-ca11095b3c59/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95d0364a-b5ea-4efc-b01b-5674f6855986', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-0000002e-0adb6d19-d425-4600-9dd0-ca11095b3c59-tap88b42333-88', 'timestamp': '2025-10-02T12:09:16.999613', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1675134395', 'name': 'tap88b42333-88', 'instance_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:e2:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88b42333-88'}, 'message_id': '9e60caa8-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 4950.614874371, 'message_signature': '8919e5398684f99ab91a042b39382aee40b86f645b0d00071019731575d253be'}]}, 'timestamp': '2025-10-02 12:09:16.999854', '_unique_id': 'e12bebef033e4483a144db4301c746d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:09:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:09:17.000 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:09:17 np0005466012 podman[226024]: 2025-10-02 12:09:17.140445147 +0000 UTC m=+0.053048892 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:09:17 np0005466012 podman[226023]: 2025-10-02 12:09:17.173793426 +0000 UTC m=+0.085459036 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 08:09:17 np0005466012 nova_compute[192063]: 2025-10-02 12:09:17.339 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:17 np0005466012 nova_compute[192063]: 2025-10-02 12:09:17.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:17 np0005466012 nova_compute[192063]: 2025-10-02 12:09:17.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:09:17 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:17Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b5:e2:ea 10.100.0.10
Oct  2 08:09:17 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:17Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b5:e2:ea 10.100.0.10
Oct  2 08:09:18 np0005466012 nova_compute[192063]: 2025-10-02 12:09:18.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:19 np0005466012 nova_compute[192063]: 2025-10-02 12:09:19.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:19 np0005466012 nova_compute[192063]: 2025-10-02 12:09:19.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:09:20 np0005466012 nova_compute[192063]: 2025-10-02 12:09:20.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:20 np0005466012 nova_compute[192063]: 2025-10-02 12:09:20.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:21 np0005466012 nova_compute[192063]: 2025-10-02 12:09:21.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:22 np0005466012 nova_compute[192063]: 2025-10-02 12:09:22.866 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:22 np0005466012 nova_compute[192063]: 2025-10-02 12:09:22.866 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:09:22 np0005466012 nova_compute[192063]: 2025-10-02 12:09:22.867 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:09:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:23.034 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:23 np0005466012 nova_compute[192063]: 2025-10-02 12:09:23.054 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:23 np0005466012 nova_compute[192063]: 2025-10-02 12:09:23.054 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:23 np0005466012 nova_compute[192063]: 2025-10-02 12:09:23.054 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:09:23 np0005466012 nova_compute[192063]: 2025-10-02 12:09:23.054 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:23 np0005466012 nova_compute[192063]: 2025-10-02 12:09:23.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:24 np0005466012 nova_compute[192063]: 2025-10-02 12:09:24.096 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:24 np0005466012 nova_compute[192063]: 2025-10-02 12:09:24.115 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:24 np0005466012 nova_compute[192063]: 2025-10-02 12:09:24.115 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:09:24 np0005466012 nova_compute[192063]: 2025-10-02 12:09:24.116 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:09:24 np0005466012 nova_compute[192063]: 2025-10-02 12:09:24.116 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:09:24 np0005466012 nova_compute[192063]: 2025-10-02 12:09:24.136 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:09:25 np0005466012 nova_compute[192063]: 2025-10-02 12:09:25.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.162 2 DEBUG oslo_concurrency.lockutils [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "interface-0adb6d19-d425-4600-9dd0-ca11095b3c59-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.162 2 DEBUG oslo_concurrency.lockutils [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-0adb6d19-d425-4600-9dd0-ca11095b3c59-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.163 2 DEBUG nova.objects.instance [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'flavor' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.190 2 DEBUG nova.objects.instance [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'pci_requests' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.210 2 DEBUG nova.network.neutron [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.507 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.507 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.527 2 DEBUG nova.compute.manager [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.745 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.745 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.751 2 DEBUG nova.virt.hardware [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.751 2 INFO nova.compute.claims [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.900 2 DEBUG nova.compute.provider_tree [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.919 2 DEBUG nova.scheduler.client.report [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.940 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.941 2 DEBUG nova.compute.manager [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:09:26 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.978 2 DEBUG nova.policy [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:26.999 2 DEBUG nova.compute.manager [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.000 2 DEBUG nova.network.neutron [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.028 2 INFO nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.046 2 DEBUG nova.compute.manager [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.254 2 DEBUG nova.policy [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '725180cfb6174d38a53f3965d04a4916', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '53cd9990789640a5b5e28b5beb8b222b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.458 2 DEBUG nova.compute.manager [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.459 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.460 2 INFO nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Creating image(s)#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.461 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "/var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.461 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "/var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.462 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "/var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.480 2 DEBUG oslo_concurrency.processutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.558 2 DEBUG oslo_concurrency.processutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.560 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.561 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.585 2 DEBUG oslo_concurrency.processutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.650 2 DEBUG oslo_concurrency.processutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.651 2 DEBUG oslo_concurrency.processutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.690 2 DEBUG oslo_concurrency.processutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.691 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.691 2 DEBUG oslo_concurrency.processutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.747 2 DEBUG oslo_concurrency.processutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.748 2 DEBUG nova.virt.disk.api [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Checking if we can resize image /var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.749 2 DEBUG oslo_concurrency.processutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.801 2 DEBUG oslo_concurrency.processutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.802 2 DEBUG nova.virt.disk.api [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Cannot resize image /var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.802 2 DEBUG nova.objects.instance [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lazy-loading 'migration_context' on Instance uuid 272390c4-59b3-4d2c-bd09-9ceeffd7b19c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.816 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.816 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Ensure instance console log exists: /var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.817 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.817 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:27 np0005466012 nova_compute[192063]: 2025-10-02 12:09:27.817 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:28 np0005466012 nova_compute[192063]: 2025-10-02 12:09:28.001 2 DEBUG nova.network.neutron [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Successfully created port: ee2528cd-3da5-4a68-9377-ee66d47d9945 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:09:28 np0005466012 nova_compute[192063]: 2025-10-02 12:09:28.098 2 DEBUG nova.network.neutron [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Successfully created port: 4f715879-b984-448f-b777-8f1883f96f4d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:09:28 np0005466012 nova_compute[192063]: 2025-10-02 12:09:28.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:29 np0005466012 nova_compute[192063]: 2025-10-02 12:09:29.030 2 DEBUG nova.network.neutron [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Successfully updated port: ee2528cd-3da5-4a68-9377-ee66d47d9945 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:09:29 np0005466012 nova_compute[192063]: 2025-10-02 12:09:29.046 2 DEBUG oslo_concurrency.lockutils [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:29 np0005466012 nova_compute[192063]: 2025-10-02 12:09:29.047 2 DEBUG oslo_concurrency.lockutils [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquired lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:29 np0005466012 nova_compute[192063]: 2025-10-02 12:09:29.047 2 DEBUG nova.network.neutron [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:09:29 np0005466012 nova_compute[192063]: 2025-10-02 12:09:29.084 2 DEBUG nova.network.neutron [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Successfully updated port: 4f715879-b984-448f-b777-8f1883f96f4d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:09:29 np0005466012 nova_compute[192063]: 2025-10-02 12:09:29.098 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "refresh_cache-272390c4-59b3-4d2c-bd09-9ceeffd7b19c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:29 np0005466012 nova_compute[192063]: 2025-10-02 12:09:29.098 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquired lock "refresh_cache-272390c4-59b3-4d2c-bd09-9ceeffd7b19c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:29 np0005466012 nova_compute[192063]: 2025-10-02 12:09:29.098 2 DEBUG nova.network.neutron [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:09:29 np0005466012 nova_compute[192063]: 2025-10-02 12:09:29.142 2 DEBUG nova.compute.manager [req-d08e6f10-db48-4ecb-ad05-5b502953bd9c req-9defb4a3-db40-4437-b0d7-d362f3787ed4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-changed-ee2528cd-3da5-4a68-9377-ee66d47d9945 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:29 np0005466012 nova_compute[192063]: 2025-10-02 12:09:29.142 2 DEBUG nova.compute.manager [req-d08e6f10-db48-4ecb-ad05-5b502953bd9c req-9defb4a3-db40-4437-b0d7-d362f3787ed4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Refreshing instance network info cache due to event network-changed-ee2528cd-3da5-4a68-9377-ee66d47d9945. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:29 np0005466012 nova_compute[192063]: 2025-10-02 12:09:29.142 2 DEBUG oslo_concurrency.lockutils [req-d08e6f10-db48-4ecb-ad05-5b502953bd9c req-9defb4a3-db40-4437-b0d7-d362f3787ed4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:29 np0005466012 nova_compute[192063]: 2025-10-02 12:09:29.188 2 WARNING nova.network.neutron [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] 7d845a33-56e0-4850-9f27-8a54095796f2 already exists in list: networks containing: ['7d845a33-56e0-4850-9f27-8a54095796f2']. ignoring it#033[00m
Oct  2 08:09:29 np0005466012 nova_compute[192063]: 2025-10-02 12:09:29.236 2 DEBUG nova.network.neutron [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.063 2 DEBUG nova.network.neutron [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Updating instance_info_cache with network_info: [{"id": "4f715879-b984-448f-b777-8f1883f96f4d", "address": "fa:16:3e:af:fc:25", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f715879-b9", "ovs_interfaceid": "4f715879-b984-448f-b777-8f1883f96f4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.084 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Releasing lock "refresh_cache-272390c4-59b3-4d2c-bd09-9ceeffd7b19c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.085 2 DEBUG nova.compute.manager [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Instance network_info: |[{"id": "4f715879-b984-448f-b777-8f1883f96f4d", "address": "fa:16:3e:af:fc:25", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f715879-b9", "ovs_interfaceid": "4f715879-b984-448f-b777-8f1883f96f4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.087 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Start _get_guest_xml network_info=[{"id": "4f715879-b984-448f-b777-8f1883f96f4d", "address": "fa:16:3e:af:fc:25", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f715879-b9", "ovs_interfaceid": "4f715879-b984-448f-b777-8f1883f96f4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.092 2 WARNING nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.097 2 DEBUG nova.virt.libvirt.host [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.098 2 DEBUG nova.virt.libvirt.host [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.104 2 DEBUG nova.virt.libvirt.host [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.105 2 DEBUG nova.virt.libvirt.host [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.107 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.107 2 DEBUG nova.virt.hardware [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.108 2 DEBUG nova.virt.hardware [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.108 2 DEBUG nova.virt.hardware [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.108 2 DEBUG nova.virt.hardware [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.109 2 DEBUG nova.virt.hardware [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.109 2 DEBUG nova.virt.hardware [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.109 2 DEBUG nova.virt.hardware [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.109 2 DEBUG nova.virt.hardware [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.110 2 DEBUG nova.virt.hardware [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.110 2 DEBUG nova.virt.hardware [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.110 2 DEBUG nova.virt.hardware [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.114 2 DEBUG nova.virt.libvirt.vif [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:09:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1016764031',display_name='tempest-FloatingIPsAssociationTestJSON-server-1016764031',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1016764031',id=49,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='53cd9990789640a5b5e28b5beb8b222b',ramdisk_id='',reservation_id='r-ga000w72',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-132128146',owner_user_name='tempest-FloatingIPsAssociationTestJSON-132128146-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:09:27Z,user_data=None,user_id='725180cfb6174d38a53f3965d04a4916',uuid=272390c4-59b3-4d2c-bd09-9ceeffd7b19c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f715879-b984-448f-b777-8f1883f96f4d", "address": "fa:16:3e:af:fc:25", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f715879-b9", "ovs_interfaceid": "4f715879-b984-448f-b777-8f1883f96f4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.115 2 DEBUG nova.network.os_vif_util [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Converting VIF {"id": "4f715879-b984-448f-b777-8f1883f96f4d", "address": "fa:16:3e:af:fc:25", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f715879-b9", "ovs_interfaceid": "4f715879-b984-448f-b777-8f1883f96f4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.116 2 DEBUG nova.network.os_vif_util [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:fc:25,bridge_name='br-int',has_traffic_filtering=True,id=4f715879-b984-448f-b777-8f1883f96f4d,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f715879-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.117 2 DEBUG nova.objects.instance [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lazy-loading 'pci_devices' on Instance uuid 272390c4-59b3-4d2c-bd09-9ceeffd7b19c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.134 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <uuid>272390c4-59b3-4d2c-bd09-9ceeffd7b19c</uuid>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <name>instance-00000031</name>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <nova:name>tempest-FloatingIPsAssociationTestJSON-server-1016764031</nova:name>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:09:30</nova:creationTime>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:        <nova:user uuid="725180cfb6174d38a53f3965d04a4916">tempest-FloatingIPsAssociationTestJSON-132128146-project-member</nova:user>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:        <nova:project uuid="53cd9990789640a5b5e28b5beb8b222b">tempest-FloatingIPsAssociationTestJSON-132128146</nova:project>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:        <nova:port uuid="4f715879-b984-448f-b777-8f1883f96f4d">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <entry name="serial">272390c4-59b3-4d2c-bd09-9ceeffd7b19c</entry>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <entry name="uuid">272390c4-59b3-4d2c-bd09-9ceeffd7b19c</entry>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk.config"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:af:fc:25"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <target dev="tap4f715879-b9"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/console.log" append="off"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:09:30 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:09:30 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.135 2 DEBUG nova.compute.manager [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Preparing to wait for external event network-vif-plugged-4f715879-b984-448f-b777-8f1883f96f4d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.136 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.136 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.136 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.137 2 DEBUG nova.virt.libvirt.vif [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:09:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1016764031',display_name='tempest-FloatingIPsAssociationTestJSON-server-1016764031',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1016764031',id=49,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='53cd9990789640a5b5e28b5beb8b222b',ramdisk_id='',reservation_id='r-ga000w72',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-132128146',owner_user_name='tempest-FloatingIPsAssociationTestJSON-132128146-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:09:27Z,user_data=None,user_id='725180cfb6174d38a53f3965d04a4916',uuid=272390c4-59b3-4d2c-bd09-9ceeffd7b19c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f715879-b984-448f-b777-8f1883f96f4d", "address": "fa:16:3e:af:fc:25", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f715879-b9", "ovs_interfaceid": "4f715879-b984-448f-b777-8f1883f96f4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.137 2 DEBUG nova.network.os_vif_util [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Converting VIF {"id": "4f715879-b984-448f-b777-8f1883f96f4d", "address": "fa:16:3e:af:fc:25", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f715879-b9", "ovs_interfaceid": "4f715879-b984-448f-b777-8f1883f96f4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.138 2 DEBUG nova.network.os_vif_util [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:fc:25,bridge_name='br-int',has_traffic_filtering=True,id=4f715879-b984-448f-b777-8f1883f96f4d,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f715879-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.138 2 DEBUG os_vif [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:fc:25,bridge_name='br-int',has_traffic_filtering=True,id=4f715879-b984-448f-b777-8f1883f96f4d,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f715879-b9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.139 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.139 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.141 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f715879-b9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.142 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4f715879-b9, col_values=(('external_ids', {'iface-id': '4f715879-b984-448f-b777-8f1883f96f4d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:af:fc:25', 'vm-uuid': '272390c4-59b3-4d2c-bd09-9ceeffd7b19c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 NetworkManager[51207]: <info>  [1759406970.1446] manager: (tap4f715879-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.152 2 INFO os_vif [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:fc:25,bridge_name='br-int',has_traffic_filtering=True,id=4f715879-b984-448f-b777-8f1883f96f4d,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f715879-b9')#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.207 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.208 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.208 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] No VIF found with MAC fa:16:3e:af:fc:25, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.208 2 INFO nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Using config drive#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.607 2 INFO nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Creating config drive at /var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk.config#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.617 2 DEBUG oslo_concurrency.processutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpszqubw0y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.746 2 DEBUG oslo_concurrency.processutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpszqubw0y" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:09:30 np0005466012 kernel: tap4f715879-b9: entered promiscuous mode
Oct  2 08:09:30 np0005466012 NetworkManager[51207]: <info>  [1759406970.8088] manager: (tap4f715879-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Oct  2 08:09:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:30Z|00161|binding|INFO|Claiming lport 4f715879-b984-448f-b777-8f1883f96f4d for this chassis.
Oct  2 08:09:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:30Z|00162|binding|INFO|4f715879-b984-448f-b777-8f1883f96f4d: Claiming fa:16:3e:af:fc:25 10.100.0.5
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.814 2 DEBUG nova.network.neutron [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.818 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:fc:25 10.100.0.5'], port_security=['fa:16:3e:af:fc:25 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '272390c4-59b3-4d2c-bd09-9ceeffd7b19c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '53cd9990789640a5b5e28b5beb8b222b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '451fc8d0-64dd-41c6-91ef-df444df65e30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9845a444-54df-440f-9a26-b835473c9d1b, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4f715879-b984-448f-b777-8f1883f96f4d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.819 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4f715879-b984-448f-b777-8f1883f96f4d in datapath e3531c03-dcc1-4c2a-981f-8534850ce14f bound to our chassis#033[00m
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.821 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e3531c03-dcc1-4c2a-981f-8534850ce14f#033[00m
Oct  2 08:09:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:30Z|00163|binding|INFO|Setting lport 4f715879-b984-448f-b777-8f1883f96f4d ovn-installed in OVS
Oct  2 08:09:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:30Z|00164|binding|INFO|Setting lport 4f715879-b984-448f-b777-8f1883f96f4d up in Southbound
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.834 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6358b693-f6ca-4332-ba0c-dddd24f413ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.836 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape3531c03-d1 in ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.841 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape3531c03-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.841 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[636007ef-08d6-4a08-bfc7-de36a7b2407a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.842 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b0aa129b-ada5-4161-995c-7e7dc89475bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:30 np0005466012 systemd-udevd[226109]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:09:30 np0005466012 systemd-machined[152114]: New machine qemu-23-instance-00000031.
Oct  2 08:09:30 np0005466012 NetworkManager[51207]: <info>  [1759406970.8574] device (tap4f715879-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:09:30 np0005466012 NetworkManager[51207]: <info>  [1759406970.8585] device (tap4f715879-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.856 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[79423c2a-41be-4cec-a368-7a9d884cb7c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.869 2 DEBUG oslo_concurrency.lockutils [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Releasing lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.869 2 DEBUG oslo_concurrency.lockutils [req-d08e6f10-db48-4ecb-ad05-5b502953bd9c req-9defb4a3-db40-4437-b0d7-d362f3787ed4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.870 2 DEBUG nova.network.neutron [req-d08e6f10-db48-4ecb-ad05-5b502953bd9c req-9defb4a3-db40-4437-b0d7-d362f3787ed4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Refreshing network info cache for port ee2528cd-3da5-4a68-9377-ee66d47d9945 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:30 np0005466012 systemd[1]: Started Virtual Machine qemu-23-instance-00000031.
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.875 2 DEBUG nova.virt.libvirt.vif [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.875 2 DEBUG nova.network.os_vif_util [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.876 2 DEBUG nova.network.os_vif_util [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:8f:26,bridge_name='br-int',has_traffic_filtering=True,id=ee2528cd-3da5-4a68-9377-ee66d47d9945,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee2528cd-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.877 2 DEBUG os_vif [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:8f:26,bridge_name='br-int',has_traffic_filtering=True,id=ee2528cd-3da5-4a68-9377-ee66d47d9945,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee2528cd-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.879 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.880 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.880 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2d7203d7-120c-4d07-a6d1-2e1e36feae06]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.884 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee2528cd-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.884 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapee2528cd-3d, col_values=(('external_ids', {'iface-id': 'ee2528cd-3da5-4a68-9377-ee66d47d9945', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:8f:26', 'vm-uuid': '0adb6d19-d425-4600-9dd0-ca11095b3c59'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 NetworkManager[51207]: <info>  [1759406970.8867] manager: (tapee2528cd-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.896 2 INFO os_vif [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:8f:26,bridge_name='br-int',has_traffic_filtering=True,id=ee2528cd-3da5-4a68-9377-ee66d47d9945,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee2528cd-3d')#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.897 2 DEBUG nova.virt.libvirt.vif [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.897 2 DEBUG nova.network.os_vif_util [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.898 2 DEBUG nova.network.os_vif_util [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:8f:26,bridge_name='br-int',has_traffic_filtering=True,id=ee2528cd-3da5-4a68-9377-ee66d47d9945,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee2528cd-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.901 2 DEBUG nova.virt.libvirt.guest [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] attach device xml: <interface type="ethernet">
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:ee:8f:26"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]:  <target dev="tapee2528cd-3d"/>
Oct  2 08:09:30 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:09:30 np0005466012 nova_compute[192063]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:09:30 np0005466012 NetworkManager[51207]: <info>  [1759406970.9113] manager: (tapee2528cd-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Oct  2 08:09:30 np0005466012 kernel: tapee2528cd-3d: entered promiscuous mode
Oct  2 08:09:30 np0005466012 systemd-udevd[226112]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:09:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:30Z|00165|binding|INFO|Claiming lport ee2528cd-3da5-4a68-9377-ee66d47d9945 for this chassis.
Oct  2 08:09:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:30Z|00166|binding|INFO|ee2528cd-3da5-4a68-9377-ee66d47d9945: Claiming fa:16:3e:ee:8f:26 10.100.0.5
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.917 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[0e297d73-7f7a-4398-8f80-ae380b7511e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.921 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:8f:26 10.100.0.5'], port_security=['fa:16:3e:ee:8f:26 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e26b972b-3ab5-401c-9d8b-5161665ba680', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=ee2528cd-3da5-4a68-9377-ee66d47d9945) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 NetworkManager[51207]: <info>  [1759406970.9274] device (tapee2528cd-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:09:30 np0005466012 NetworkManager[51207]: <info>  [1759406970.9280] device (tapee2528cd-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:09:30 np0005466012 NetworkManager[51207]: <info>  [1759406970.9307] manager: (tape3531c03-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/81)
Oct  2 08:09:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:30Z|00167|binding|INFO|Setting lport ee2528cd-3da5-4a68-9377-ee66d47d9945 ovn-installed in OVS
Oct  2 08:09:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:30Z|00168|binding|INFO|Setting lport ee2528cd-3da5-4a68-9377-ee66d47d9945 up in Southbound
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.929 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4405d886-e6c7-4194-96bf-e2658c10c337]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 nova_compute[192063]: 2025-10-02 12:09:30.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.972 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[da0c94b4-b67d-4dcd-8fa9-44d13e4175fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:30.977 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae02ff1-c545-4b75-9123-bf9907bba391]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 NetworkManager[51207]: <info>  [1759406971.0033] device (tape3531c03-d0): carrier: link connected
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.008 2 DEBUG nova.virt.libvirt.driver [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.008 2 DEBUG nova.virt.libvirt.driver [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.009 2 DEBUG nova.virt.libvirt.driver [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:b5:e2:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.009 2 DEBUG nova.virt.libvirt.driver [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:ee:8f:26, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.012 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[4c929ac2-79d6-4a89-9046-3307ec5f01a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.030 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aa3a8ca1-7dbc-483b-af02-1b1074e3f581]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3531c03-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:c6:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496461, 'reachable_time': 35578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226147, 'error': None, 'target': 'ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.038 2 DEBUG nova.virt.libvirt.guest [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:31 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1675134395</nova:name>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:09:31</nova:creationTime>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:09:31 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:    <nova:port uuid="88b42333-8838-4199-ab64-5b879b907aa5">
Oct  2 08:09:31 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:    <nova:port uuid="ee2528cd-3da5-4a68-9377-ee66d47d9945">
Oct  2 08:09:31 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:31 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:09:31 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:09:31 np0005466012 nova_compute[192063]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.045 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[31f70d87-4f3f-4aa1-8051-325830c1a808]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe30:c6bb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496461, 'tstamp': 496461}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226148, 'error': None, 'target': 'ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.060 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0b72b1c2-da22-4ce5-91a3-0899f5ad7bfc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3531c03-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:c6:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496461, 'reachable_time': 35578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226149, 'error': None, 'target': 'ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.078 2 DEBUG oslo_concurrency.lockutils [None req-04362895-867a-40f4-b7eb-65bb97bea35f fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-0adb6d19-d425-4600-9dd0-ca11095b3c59-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 4.915s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.090 2 DEBUG nova.compute.manager [req-35954548-c013-48bd-af53-91dc7fe3729b req-cf28900c-628c-42c6-808b-77373a1a62a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Received event network-vif-plugged-4f715879-b984-448f-b777-8f1883f96f4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.090 2 DEBUG oslo_concurrency.lockutils [req-35954548-c013-48bd-af53-91dc7fe3729b req-cf28900c-628c-42c6-808b-77373a1a62a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.091 2 DEBUG oslo_concurrency.lockutils [req-35954548-c013-48bd-af53-91dc7fe3729b req-cf28900c-628c-42c6-808b-77373a1a62a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.091 2 DEBUG oslo_concurrency.lockutils [req-35954548-c013-48bd-af53-91dc7fe3729b req-cf28900c-628c-42c6-808b-77373a1a62a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.091 2 DEBUG nova.compute.manager [req-35954548-c013-48bd-af53-91dc7fe3729b req-cf28900c-628c-42c6-808b-77373a1a62a1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Processing event network-vif-plugged-4f715879-b984-448f-b777-8f1883f96f4d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.098 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[31b5aaf4-9545-48f6-a0ba-ca6f677cad93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.157 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[75af0fa0-b621-4de0-a534-daf6d08d3f56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.159 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3531c03-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.160 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.160 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3531c03-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:31 np0005466012 NetworkManager[51207]: <info>  [1759406971.1979] manager: (tape3531c03-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Oct  2 08:09:31 np0005466012 kernel: tape3531c03-d0: entered promiscuous mode
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.200 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape3531c03-d0, col_values=(('external_ids', {'iface-id': 'a22200c1-7efb-4203-8e94-7851356dbd00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:31Z|00169|binding|INFO|Releasing lport a22200c1-7efb-4203-8e94-7851356dbd00 from this chassis (sb_readonly=0)
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.203 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3531c03-dcc1-4c2a-981f-8534850ce14f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3531c03-dcc1-4c2a-981f-8534850ce14f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.203 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f264ed5c-98fc-4fc5-bac0-4a573c7af0c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.204 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-e3531c03-dcc1-4c2a-981f-8534850ce14f
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/e3531c03-dcc1-4c2a-981f-8534850ce14f.pid.haproxy
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID e3531c03-dcc1-4c2a-981f-8534850ce14f
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.205 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'env', 'PROCESS_TAG=haproxy-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e3531c03-dcc1-4c2a-981f-8534850ce14f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.247 2 DEBUG nova.compute.manager [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Received event network-changed-4f715879-b984-448f-b777-8f1883f96f4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.247 2 DEBUG nova.compute.manager [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Refreshing instance network info cache due to event network-changed-4f715879-b984-448f-b777-8f1883f96f4d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.247 2 DEBUG oslo_concurrency.lockutils [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-272390c4-59b3-4d2c-bd09-9ceeffd7b19c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.248 2 DEBUG oslo_concurrency.lockutils [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-272390c4-59b3-4d2c-bd09-9ceeffd7b19c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.248 2 DEBUG nova.network.neutron [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Refreshing network info cache for port 4f715879-b984-448f-b777-8f1883f96f4d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:31 np0005466012 podman[226188]: 2025-10-02 12:09:31.620003019 +0000 UTC m=+0.052755521 container create 70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:09:31 np0005466012 systemd[1]: Started libpod-conmon-70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417.scope.
Oct  2 08:09:31 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:09:31 np0005466012 podman[226188]: 2025-10-02 12:09:31.590812583 +0000 UTC m=+0.023565115 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:09:31 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd928830dd570f0a133165d63af68b2ed97cda1b4528f07d9f2b1272ece7fb35/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:09:31 np0005466012 podman[226188]: 2025-10-02 12:09:31.706136158 +0000 UTC m=+0.138888680 container init 70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  2 08:09:31 np0005466012 podman[226188]: 2025-10-02 12:09:31.711929764 +0000 UTC m=+0.144682266 container start 70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:09:31 np0005466012 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[226215]: [NOTICE]   (226248) : New worker (226255) forked
Oct  2 08:09:31 np0005466012 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[226215]: [NOTICE]   (226248) : Loading success.
Oct  2 08:09:31 np0005466012 podman[226201]: 2025-10-02 12:09:31.741396956 +0000 UTC m=+0.085802071 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:09:31 np0005466012 podman[226202]: 2025-10-02 12:09:31.746492724 +0000 UTC m=+0.092215584 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.754 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406971.7532792, 272390c4-59b3-4d2c-bd09-9ceeffd7b19c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.754 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] VM Started (Lifecycle Event)#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.755 2 DEBUG nova.compute.manager [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.758 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.761 2 INFO nova.virt.libvirt.driver [-] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Instance spawned successfully.#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.761 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.779 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.783 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.789 103246 INFO neutron.agent.ovn.metadata.agent [-] Port ee2528cd-3da5-4a68-9377-ee66d47d9945 in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 unbound from our chassis#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.791 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d845a33-56e0-4850-9f27-8a54095796f2#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.794 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.795 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.796 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.796 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.797 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.797 2 DEBUG nova.virt.libvirt.driver [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.806 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0580ca7c-af7e-4da8-8833-199bb87cf597]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.811 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.812 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406971.7535267, 272390c4-59b3-4d2c-bd09-9ceeffd7b19c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.812 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.829 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[84660325-3162-46b7-a4de-a1d6b645b5c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.832 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[dd5cb2f0-c6bf-42f7-b3ac-f91b6e7cbea2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.838 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.841 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759406971.7584195, 272390c4-59b3-4d2c-bd09-9ceeffd7b19c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.841 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.853 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[04b165d1-3598-4f90-b82e-b4eb14ae4da8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.871 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f67bc938-efe0-4f1f-825b-f1d3c8f2739e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493821, 'reachable_time': 17654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226271, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.872 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.876 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.883 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[acd2b77b-18ec-4d4a-b37e-3ad1386bfd38]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493835, 'tstamp': 493835}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226272, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493839, 'tstamp': 493839}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226272, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.885 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.888 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d845a33-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.888 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.889 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d845a33-50, col_values=(('external_ids', {'iface-id': '1c321c19-d630-4a6f-8ba8-7bac90af9bae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:31.889 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.903 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.912 2 INFO nova.compute.manager [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Took 4.45 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.912 2 DEBUG nova.compute.manager [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:09:31 np0005466012 nova_compute[192063]: 2025-10-02 12:09:31.993 2 INFO nova.compute.manager [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Took 5.30 seconds to build instance.#033[00m
Oct  2 08:09:32 np0005466012 nova_compute[192063]: 2025-10-02 12:09:32.010 2 DEBUG oslo_concurrency.lockutils [None req-cff9e319-f126-42da-b7c3-517dce0cf09c 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:32 np0005466012 nova_compute[192063]: 2025-10-02 12:09:32.077 2 DEBUG oslo_concurrency.lockutils [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "interface-0adb6d19-d425-4600-9dd0-ca11095b3c59-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:32 np0005466012 nova_compute[192063]: 2025-10-02 12:09:32.077 2 DEBUG oslo_concurrency.lockutils [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-0adb6d19-d425-4600-9dd0-ca11095b3c59-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:32 np0005466012 nova_compute[192063]: 2025-10-02 12:09:32.078 2 DEBUG nova.objects.instance [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'flavor' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:32 np0005466012 nova_compute[192063]: 2025-10-02 12:09:32.663 2 DEBUG nova.objects.instance [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'pci_requests' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:32 np0005466012 nova_compute[192063]: 2025-10-02 12:09:32.679 2 DEBUG nova.network.neutron [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:09:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:32Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ee:8f:26 10.100.0.5
Oct  2 08:09:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:32Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ee:8f:26 10.100.0.5
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.414 2 DEBUG nova.compute.manager [req-5ff11b97-a447-4c53-8b37-44b1cc81a08e req-94b4d675-564c-4eca-8d05-79a2ba7d7274 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-plugged-ee2528cd-3da5-4a68-9377-ee66d47d9945 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.414 2 DEBUG oslo_concurrency.lockutils [req-5ff11b97-a447-4c53-8b37-44b1cc81a08e req-94b4d675-564c-4eca-8d05-79a2ba7d7274 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.415 2 DEBUG oslo_concurrency.lockutils [req-5ff11b97-a447-4c53-8b37-44b1cc81a08e req-94b4d675-564c-4eca-8d05-79a2ba7d7274 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.415 2 DEBUG oslo_concurrency.lockutils [req-5ff11b97-a447-4c53-8b37-44b1cc81a08e req-94b4d675-564c-4eca-8d05-79a2ba7d7274 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.415 2 DEBUG nova.compute.manager [req-5ff11b97-a447-4c53-8b37-44b1cc81a08e req-94b4d675-564c-4eca-8d05-79a2ba7d7274 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] No waiting events found dispatching network-vif-plugged-ee2528cd-3da5-4a68-9377-ee66d47d9945 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.415 2 WARNING nova.compute.manager [req-5ff11b97-a447-4c53-8b37-44b1cc81a08e req-94b4d675-564c-4eca-8d05-79a2ba7d7274 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received unexpected event network-vif-plugged-ee2528cd-3da5-4a68-9377-ee66d47d9945 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.613 2 DEBUG nova.network.neutron [req-d08e6f10-db48-4ecb-ad05-5b502953bd9c req-9defb4a3-db40-4437-b0d7-d362f3787ed4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updated VIF entry in instance network info cache for port ee2528cd-3da5-4a68-9377-ee66d47d9945. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.613 2 DEBUG nova.network.neutron [req-d08e6f10-db48-4ecb-ad05-5b502953bd9c req-9defb4a3-db40-4437-b0d7-d362f3787ed4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.632 2 DEBUG oslo_concurrency.lockutils [req-d08e6f10-db48-4ecb-ad05-5b502953bd9c req-9defb4a3-db40-4437-b0d7-d362f3787ed4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.638 2 DEBUG nova.policy [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.749 2 DEBUG nova.network.neutron [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Updated VIF entry in instance network info cache for port 4f715879-b984-448f-b777-8f1883f96f4d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.749 2 DEBUG nova.network.neutron [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Updating instance_info_cache with network_info: [{"id": "4f715879-b984-448f-b777-8f1883f96f4d", "address": "fa:16:3e:af:fc:25", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f715879-b9", "ovs_interfaceid": "4f715879-b984-448f-b777-8f1883f96f4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.769 2 DEBUG oslo_concurrency.lockutils [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-272390c4-59b3-4d2c-bd09-9ceeffd7b19c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.769 2 DEBUG nova.compute.manager [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-plugged-ee2528cd-3da5-4a68-9377-ee66d47d9945 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.770 2 DEBUG oslo_concurrency.lockutils [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.770 2 DEBUG oslo_concurrency.lockutils [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.770 2 DEBUG oslo_concurrency.lockutils [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.770 2 DEBUG nova.compute.manager [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] No waiting events found dispatching network-vif-plugged-ee2528cd-3da5-4a68-9377-ee66d47d9945 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.770 2 WARNING nova.compute.manager [req-514b437f-fd84-4670-b4b0-3915ff88d6cf req-8e574008-5764-4d08-9f2c-fc9307a8fb96 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received unexpected event network-vif-plugged-ee2528cd-3da5-4a68-9377-ee66d47d9945 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.842 2 DEBUG nova.compute.manager [req-327ce847-7c70-47fd-ac58-b371d66470b4 req-7aa7fa90-837f-4532-a20a-f2cb20d78ec0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Received event network-vif-plugged-4f715879-b984-448f-b777-8f1883f96f4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.842 2 DEBUG oslo_concurrency.lockutils [req-327ce847-7c70-47fd-ac58-b371d66470b4 req-7aa7fa90-837f-4532-a20a-f2cb20d78ec0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.842 2 DEBUG oslo_concurrency.lockutils [req-327ce847-7c70-47fd-ac58-b371d66470b4 req-7aa7fa90-837f-4532-a20a-f2cb20d78ec0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.843 2 DEBUG oslo_concurrency.lockutils [req-327ce847-7c70-47fd-ac58-b371d66470b4 req-7aa7fa90-837f-4532-a20a-f2cb20d78ec0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.843 2 DEBUG nova.compute.manager [req-327ce847-7c70-47fd-ac58-b371d66470b4 req-7aa7fa90-837f-4532-a20a-f2cb20d78ec0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] No waiting events found dispatching network-vif-plugged-4f715879-b984-448f-b777-8f1883f96f4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:33 np0005466012 nova_compute[192063]: 2025-10-02 12:09:33.843 2 WARNING nova.compute.manager [req-327ce847-7c70-47fd-ac58-b371d66470b4 req-7aa7fa90-837f-4532-a20a-f2cb20d78ec0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Received unexpected event network-vif-plugged-4f715879-b984-448f-b777-8f1883f96f4d for instance with vm_state active and task_state None.#033[00m
Oct  2 08:09:34 np0005466012 podman[226273]: 2025-10-02 12:09:34.13066396 +0000 UTC m=+0.051224170 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:09:34 np0005466012 nova_compute[192063]: 2025-10-02 12:09:34.640 2 DEBUG nova.network.neutron [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Successfully created port: e4d9a52e-74d6-4407-a156-6fa4bcad133c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:09:35 np0005466012 nova_compute[192063]: 2025-10-02 12:09:35.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:35 np0005466012 nova_compute[192063]: 2025-10-02 12:09:35.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:36 np0005466012 nova_compute[192063]: 2025-10-02 12:09:36.691 2 DEBUG nova.network.neutron [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Successfully updated port: e4d9a52e-74d6-4407-a156-6fa4bcad133c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:09:36 np0005466012 nova_compute[192063]: 2025-10-02 12:09:36.704 2 DEBUG oslo_concurrency.lockutils [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:36 np0005466012 nova_compute[192063]: 2025-10-02 12:09:36.705 2 DEBUG oslo_concurrency.lockutils [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquired lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:36 np0005466012 nova_compute[192063]: 2025-10-02 12:09:36.705 2 DEBUG nova.network.neutron [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:09:37 np0005466012 nova_compute[192063]: 2025-10-02 12:09:37.231 2 WARNING nova.network.neutron [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] 7d845a33-56e0-4850-9f27-8a54095796f2 already exists in list: networks containing: ['7d845a33-56e0-4850-9f27-8a54095796f2']. ignoring it#033[00m
Oct  2 08:09:37 np0005466012 nova_compute[192063]: 2025-10-02 12:09:37.232 2 WARNING nova.network.neutron [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] 7d845a33-56e0-4850-9f27-8a54095796f2 already exists in list: networks containing: ['7d845a33-56e0-4850-9f27-8a54095796f2']. ignoring it#033[00m
Oct  2 08:09:37 np0005466012 nova_compute[192063]: 2025-10-02 12:09:37.854 2 DEBUG nova.compute.manager [req-ea8f0b53-21ab-48b6-8ba9-13ce7d631047 req-946527ab-86a8-414e-9cb9-5c90d12f2c7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-changed-e4d9a52e-74d6-4407-a156-6fa4bcad133c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:37 np0005466012 nova_compute[192063]: 2025-10-02 12:09:37.854 2 DEBUG nova.compute.manager [req-ea8f0b53-21ab-48b6-8ba9-13ce7d631047 req-946527ab-86a8-414e-9cb9-5c90d12f2c7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Refreshing instance network info cache due to event network-changed-e4d9a52e-74d6-4407-a156-6fa4bcad133c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:37 np0005466012 nova_compute[192063]: 2025-10-02 12:09:37.854 2 DEBUG oslo_concurrency.lockutils [req-ea8f0b53-21ab-48b6-8ba9-13ce7d631047 req-946527ab-86a8-414e-9cb9-5c90d12f2c7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:38 np0005466012 podman[226292]: 2025-10-02 12:09:38.157083311 +0000 UTC m=+0.072004579 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.471 2 DEBUG nova.network.neutron [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "address": "fa:16:3e:2a:43:1c", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4d9a52e-74", "ovs_interfaceid": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.493 2 DEBUG oslo_concurrency.lockutils [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Releasing lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.494 2 DEBUG oslo_concurrency.lockutils [req-ea8f0b53-21ab-48b6-8ba9-13ce7d631047 req-946527ab-86a8-414e-9cb9-5c90d12f2c7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.494 2 DEBUG nova.network.neutron [req-ea8f0b53-21ab-48b6-8ba9-13ce7d631047 req-946527ab-86a8-414e-9cb9-5c90d12f2c7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Refreshing network info cache for port e4d9a52e-74d6-4407-a156-6fa4bcad133c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.498 2 DEBUG nova.virt.libvirt.vif [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "address": "fa:16:3e:2a:43:1c", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4d9a52e-74", "ovs_interfaceid": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.498 2 DEBUG nova.network.os_vif_util [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "address": "fa:16:3e:2a:43:1c", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4d9a52e-74", "ovs_interfaceid": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.499 2 DEBUG nova.network.os_vif_util [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:43:1c,bridge_name='br-int',has_traffic_filtering=True,id=e4d9a52e-74d6-4407-a156-6fa4bcad133c,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4d9a52e-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.499 2 DEBUG os_vif [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:43:1c,bridge_name='br-int',has_traffic_filtering=True,id=e4d9a52e-74d6-4407-a156-6fa4bcad133c,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4d9a52e-74') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.500 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.501 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.504 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape4d9a52e-74, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.504 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape4d9a52e-74, col_values=(('external_ids', {'iface-id': 'e4d9a52e-74d6-4407-a156-6fa4bcad133c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2a:43:1c', 'vm-uuid': '0adb6d19-d425-4600-9dd0-ca11095b3c59'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:40 np0005466012 NetworkManager[51207]: <info>  [1759406980.5071] manager: (tape4d9a52e-74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.515 2 INFO os_vif [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:43:1c,bridge_name='br-int',has_traffic_filtering=True,id=e4d9a52e-74d6-4407-a156-6fa4bcad133c,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4d9a52e-74')#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.516 2 DEBUG nova.virt.libvirt.vif [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "address": "fa:16:3e:2a:43:1c", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4d9a52e-74", "ovs_interfaceid": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.516 2 DEBUG nova.network.os_vif_util [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "address": "fa:16:3e:2a:43:1c", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4d9a52e-74", "ovs_interfaceid": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.517 2 DEBUG nova.network.os_vif_util [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:43:1c,bridge_name='br-int',has_traffic_filtering=True,id=e4d9a52e-74d6-4407-a156-6fa4bcad133c,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4d9a52e-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.519 2 DEBUG nova.virt.libvirt.guest [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] attach device xml: <interface type="ethernet">
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:2a:43:1c"/>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  <target dev="tape4d9a52e-74"/>
Oct  2 08:09:40 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:09:40 np0005466012 nova_compute[192063]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:09:40 np0005466012 NetworkManager[51207]: <info>  [1759406980.5301] manager: (tape4d9a52e-74): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Oct  2 08:09:40 np0005466012 kernel: tape4d9a52e-74: entered promiscuous mode
Oct  2 08:09:40 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:40Z|00170|binding|INFO|Claiming lport e4d9a52e-74d6-4407-a156-6fa4bcad133c for this chassis.
Oct  2 08:09:40 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:40Z|00171|binding|INFO|e4d9a52e-74d6-4407-a156-6fa4bcad133c: Claiming fa:16:3e:2a:43:1c 10.100.0.7
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:40 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:40Z|00172|binding|INFO|Setting lport e4d9a52e-74d6-4407-a156-6fa4bcad133c ovn-installed in OVS
Oct  2 08:09:40 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:40Z|00173|binding|INFO|Setting lport e4d9a52e-74d6-4407-a156-6fa4bcad133c up in Southbound
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.560 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:43:1c 10.100.0.7'], port_security=['fa:16:3e:2a:43:1c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e26b972b-3ab5-401c-9d8b-5161665ba680', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=e4d9a52e-74d6-4407-a156-6fa4bcad133c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.561 103246 INFO neutron.agent.ovn.metadata.agent [-] Port e4d9a52e-74d6-4407-a156-6fa4bcad133c in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 bound to our chassis#033[00m
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.563 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d845a33-56e0-4850-9f27-8a54095796f2#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:40 np0005466012 systemd-udevd[226319]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:40 np0005466012 NetworkManager[51207]: <info>  [1759406980.5846] device (tape4d9a52e-74): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:09:40 np0005466012 NetworkManager[51207]: <info>  [1759406980.5853] device (tape4d9a52e-74): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.586 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6575a560-a860-4104-ad4e-9bd30b471d7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.631 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[25e9c55b-621b-457e-92e3-12385f398113]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.636 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[dbaa845b-addd-4c59-bc85-620061d1be31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.640 2 DEBUG nova.virt.libvirt.driver [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.641 2 DEBUG nova.virt.libvirt.driver [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.641 2 DEBUG nova.virt.libvirt.driver [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:b5:e2:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.641 2 DEBUG nova.virt.libvirt.driver [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:ee:8f:26, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.641 2 DEBUG nova.virt.libvirt.driver [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:2a:43:1c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.664 2 DEBUG nova.virt.libvirt.guest [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1675134395</nova:name>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:09:40</nova:creationTime>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:09:40 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:    <nova:port uuid="88b42333-8838-4199-ab64-5b879b907aa5">
Oct  2 08:09:40 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:    <nova:port uuid="ee2528cd-3da5-4a68-9377-ee66d47d9945">
Oct  2 08:09:40 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:    <nova:port uuid="e4d9a52e-74d6-4407-a156-6fa4bcad133c">
Oct  2 08:09:40 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:40 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:09:40 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:09:40 np0005466012 nova_compute[192063]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.674 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[10edb96a-4c56-4425-8112-badb0fabaa17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.689 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c60d54ff-1cb0-4612-9d7f-c261c69436b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493821, 'reachable_time': 17654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226326, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.702 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[49de1bd3-8177-4026-a1ae-fdf86c21229a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493835, 'tstamp': 493835}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226327, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493839, 'tstamp': 493839}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226327, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.703 2 DEBUG oslo_concurrency.lockutils [None req-c2450435-ebaf-4d82-850e-77cd88b7031c fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-0adb6d19-d425-4600-9dd0-ca11095b3c59-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.704 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:40 np0005466012 nova_compute[192063]: 2025-10-02 12:09:40.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.707 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d845a33-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.707 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.708 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d845a33-50, col_values=(('external_ids', {'iface-id': '1c321c19-d630-4a6f-8ba8-7bac90af9bae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:40.708 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.001 2 DEBUG nova.compute.manager [req-099998a8-75cf-441b-87dc-d4a1b797b25d req-9ec632af-3995-44b4-9323-3415378d8377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-plugged-e4d9a52e-74d6-4407-a156-6fa4bcad133c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.002 2 DEBUG oslo_concurrency.lockutils [req-099998a8-75cf-441b-87dc-d4a1b797b25d req-9ec632af-3995-44b4-9323-3415378d8377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.002 2 DEBUG oslo_concurrency.lockutils [req-099998a8-75cf-441b-87dc-d4a1b797b25d req-9ec632af-3995-44b4-9323-3415378d8377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.003 2 DEBUG oslo_concurrency.lockutils [req-099998a8-75cf-441b-87dc-d4a1b797b25d req-9ec632af-3995-44b4-9323-3415378d8377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.003 2 DEBUG nova.compute.manager [req-099998a8-75cf-441b-87dc-d4a1b797b25d req-9ec632af-3995-44b4-9323-3415378d8377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] No waiting events found dispatching network-vif-plugged-e4d9a52e-74d6-4407-a156-6fa4bcad133c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.004 2 WARNING nova.compute.manager [req-099998a8-75cf-441b-87dc-d4a1b797b25d req-9ec632af-3995-44b4-9323-3415378d8377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received unexpected event network-vif-plugged-e4d9a52e-74d6-4407-a156-6fa4bcad133c for instance with vm_state active and task_state None.#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.004 2 DEBUG nova.compute.manager [req-099998a8-75cf-441b-87dc-d4a1b797b25d req-9ec632af-3995-44b4-9323-3415378d8377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-plugged-e4d9a52e-74d6-4407-a156-6fa4bcad133c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.005 2 DEBUG oslo_concurrency.lockutils [req-099998a8-75cf-441b-87dc-d4a1b797b25d req-9ec632af-3995-44b4-9323-3415378d8377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.005 2 DEBUG oslo_concurrency.lockutils [req-099998a8-75cf-441b-87dc-d4a1b797b25d req-9ec632af-3995-44b4-9323-3415378d8377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.006 2 DEBUG oslo_concurrency.lockutils [req-099998a8-75cf-441b-87dc-d4a1b797b25d req-9ec632af-3995-44b4-9323-3415378d8377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.006 2 DEBUG nova.compute.manager [req-099998a8-75cf-441b-87dc-d4a1b797b25d req-9ec632af-3995-44b4-9323-3415378d8377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] No waiting events found dispatching network-vif-plugged-e4d9a52e-74d6-4407-a156-6fa4bcad133c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.007 2 WARNING nova.compute.manager [req-099998a8-75cf-441b-87dc-d4a1b797b25d req-9ec632af-3995-44b4-9323-3415378d8377 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received unexpected event network-vif-plugged-e4d9a52e-74d6-4407-a156-6fa4bcad133c for instance with vm_state active and task_state None.#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.481 2 DEBUG nova.compute.manager [req-330bed42-bc51-4bd4-a560-e4e0bfa5d7a0 req-f405f6f4-da07-42eb-a8a0-b51ec3b076a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Received event network-changed-4f715879-b984-448f-b777-8f1883f96f4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.482 2 DEBUG nova.compute.manager [req-330bed42-bc51-4bd4-a560-e4e0bfa5d7a0 req-f405f6f4-da07-42eb-a8a0-b51ec3b076a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Refreshing instance network info cache due to event network-changed-4f715879-b984-448f-b777-8f1883f96f4d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.483 2 DEBUG oslo_concurrency.lockutils [req-330bed42-bc51-4bd4-a560-e4e0bfa5d7a0 req-f405f6f4-da07-42eb-a8a0-b51ec3b076a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-272390c4-59b3-4d2c-bd09-9ceeffd7b19c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.483 2 DEBUG oslo_concurrency.lockutils [req-330bed42-bc51-4bd4-a560-e4e0bfa5d7a0 req-f405f6f4-da07-42eb-a8a0-b51ec3b076a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-272390c4-59b3-4d2c-bd09-9ceeffd7b19c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:42 np0005466012 nova_compute[192063]: 2025-10-02 12:09:42.483 2 DEBUG nova.network.neutron [req-330bed42-bc51-4bd4-a560-e4e0bfa5d7a0 req-f405f6f4-da07-42eb-a8a0-b51ec3b076a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Refreshing network info cache for port 4f715879-b984-448f-b777-8f1883f96f4d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:43 np0005466012 nova_compute[192063]: 2025-10-02 12:09:43.057 2 DEBUG nova.network.neutron [req-ea8f0b53-21ab-48b6-8ba9-13ce7d631047 req-946527ab-86a8-414e-9cb9-5c90d12f2c7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updated VIF entry in instance network info cache for port e4d9a52e-74d6-4407-a156-6fa4bcad133c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:43 np0005466012 nova_compute[192063]: 2025-10-02 12:09:43.057 2 DEBUG nova.network.neutron [req-ea8f0b53-21ab-48b6-8ba9-13ce7d631047 req-946527ab-86a8-414e-9cb9-5c90d12f2c7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "address": "fa:16:3e:2a:43:1c", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4d9a52e-74", "ovs_interfaceid": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:43 np0005466012 nova_compute[192063]: 2025-10-02 12:09:43.074 2 DEBUG oslo_concurrency.lockutils [req-ea8f0b53-21ab-48b6-8ba9-13ce7d631047 req-946527ab-86a8-414e-9cb9-5c90d12f2c7e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:43Z|00174|binding|INFO|Releasing lport 1c321c19-d630-4a6f-8ba8-7bac90af9bae from this chassis (sb_readonly=0)
Oct  2 08:09:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:43Z|00175|binding|INFO|Releasing lport a22200c1-7efb-4203-8e94-7851356dbd00 from this chassis (sb_readonly=0)
Oct  2 08:09:43 np0005466012 nova_compute[192063]: 2025-10-02 12:09:43.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:43Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2a:43:1c 10.100.0.7
Oct  2 08:09:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:43Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2a:43:1c 10.100.0.7
Oct  2 08:09:44 np0005466012 nova_compute[192063]: 2025-10-02 12:09:44.023 2 DEBUG oslo_concurrency.lockutils [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "interface-0adb6d19-d425-4600-9dd0-ca11095b3c59-1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:44 np0005466012 nova_compute[192063]: 2025-10-02 12:09:44.024 2 DEBUG oslo_concurrency.lockutils [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-0adb6d19-d425-4600-9dd0-ca11095b3c59-1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:44 np0005466012 nova_compute[192063]: 2025-10-02 12:09:44.025 2 DEBUG nova.objects.instance [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'flavor' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:44 np0005466012 nova_compute[192063]: 2025-10-02 12:09:44.267 2 DEBUG nova.network.neutron [req-330bed42-bc51-4bd4-a560-e4e0bfa5d7a0 req-f405f6f4-da07-42eb-a8a0-b51ec3b076a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Updated VIF entry in instance network info cache for port 4f715879-b984-448f-b777-8f1883f96f4d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:44 np0005466012 nova_compute[192063]: 2025-10-02 12:09:44.268 2 DEBUG nova.network.neutron [req-330bed42-bc51-4bd4-a560-e4e0bfa5d7a0 req-f405f6f4-da07-42eb-a8a0-b51ec3b076a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Updating instance_info_cache with network_info: [{"id": "4f715879-b984-448f-b777-8f1883f96f4d", "address": "fa:16:3e:af:fc:25", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f715879-b9", "ovs_interfaceid": "4f715879-b984-448f-b777-8f1883f96f4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:44 np0005466012 nova_compute[192063]: 2025-10-02 12:09:44.300 2 DEBUG oslo_concurrency.lockutils [req-330bed42-bc51-4bd4-a560-e4e0bfa5d7a0 req-f405f6f4-da07-42eb-a8a0-b51ec3b076a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-272390c4-59b3-4d2c-bd09-9ceeffd7b19c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:44 np0005466012 nova_compute[192063]: 2025-10-02 12:09:44.673 2 DEBUG nova.objects.instance [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'pci_requests' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:44 np0005466012 nova_compute[192063]: 2025-10-02 12:09:44.690 2 DEBUG nova.network.neutron [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:09:45 np0005466012 nova_compute[192063]: 2025-10-02 12:09:45.022 2 DEBUG nova.policy [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:09:45 np0005466012 podman[226343]: 2025-10-02 12:09:45.165876621 +0000 UTC m=+0.065973117 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:09:45 np0005466012 podman[226344]: 2025-10-02 12:09:45.167803093 +0000 UTC m=+0.062319849 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:09:45 np0005466012 nova_compute[192063]: 2025-10-02 12:09:45.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:45 np0005466012 nova_compute[192063]: 2025-10-02 12:09:45.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:45 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:45Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:af:fc:25 10.100.0.5
Oct  2 08:09:45 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:45Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:af:fc:25 10.100.0.5
Oct  2 08:09:46 np0005466012 nova_compute[192063]: 2025-10-02 12:09:46.882 2 DEBUG nova.network.neutron [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Successfully updated port: 1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:09:46 np0005466012 nova_compute[192063]: 2025-10-02 12:09:46.898 2 DEBUG oslo_concurrency.lockutils [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:46 np0005466012 nova_compute[192063]: 2025-10-02 12:09:46.898 2 DEBUG oslo_concurrency.lockutils [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquired lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:46 np0005466012 nova_compute[192063]: 2025-10-02 12:09:46.899 2 DEBUG nova.network.neutron [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:09:47 np0005466012 nova_compute[192063]: 2025-10-02 12:09:47.000 2 DEBUG nova.compute.manager [req-4a92366e-055a-4d12-895b-7810ca87374a req-abbaff6c-5a2a-415f-9cec-1a5005a39dca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-changed-1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:47 np0005466012 nova_compute[192063]: 2025-10-02 12:09:47.000 2 DEBUG nova.compute.manager [req-4a92366e-055a-4d12-895b-7810ca87374a req-abbaff6c-5a2a-415f-9cec-1a5005a39dca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Refreshing instance network info cache due to event network-changed-1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:47 np0005466012 nova_compute[192063]: 2025-10-02 12:09:47.001 2 DEBUG oslo_concurrency.lockutils [req-4a92366e-055a-4d12-895b-7810ca87374a req-abbaff6c-5a2a-415f-9cec-1a5005a39dca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:47 np0005466012 nova_compute[192063]: 2025-10-02 12:09:47.121 2 WARNING nova.network.neutron [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] 7d845a33-56e0-4850-9f27-8a54095796f2 already exists in list: networks containing: ['7d845a33-56e0-4850-9f27-8a54095796f2']. ignoring it#033[00m
Oct  2 08:09:47 np0005466012 nova_compute[192063]: 2025-10-02 12:09:47.121 2 WARNING nova.network.neutron [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] 7d845a33-56e0-4850-9f27-8a54095796f2 already exists in list: networks containing: ['7d845a33-56e0-4850-9f27-8a54095796f2']. ignoring it#033[00m
Oct  2 08:09:47 np0005466012 nova_compute[192063]: 2025-10-02 12:09:47.122 2 WARNING nova.network.neutron [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] 7d845a33-56e0-4850-9f27-8a54095796f2 already exists in list: networks containing: ['7d845a33-56e0-4850-9f27-8a54095796f2']. ignoring it#033[00m
Oct  2 08:09:48 np0005466012 podman[226384]: 2025-10-02 12:09:48.136763051 +0000 UTC m=+0.057951992 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid)
Oct  2 08:09:48 np0005466012 podman[226385]: 2025-10-02 12:09:48.13712068 +0000 UTC m=+0.053256744 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:09:50 np0005466012 nova_compute[192063]: 2025-10-02 12:09:50.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:50 np0005466012 nova_compute[192063]: 2025-10-02 12:09:50.431 2 DEBUG nova.compute.manager [req-77635c69-26f8-484b-824b-a8e387d64089 req-69ad2fa1-901a-46b3-a713-c25a97dffd37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Received event network-changed-4f715879-b984-448f-b777-8f1883f96f4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:50 np0005466012 nova_compute[192063]: 2025-10-02 12:09:50.432 2 DEBUG nova.compute.manager [req-77635c69-26f8-484b-824b-a8e387d64089 req-69ad2fa1-901a-46b3-a713-c25a97dffd37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Refreshing instance network info cache due to event network-changed-4f715879-b984-448f-b777-8f1883f96f4d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:09:50 np0005466012 nova_compute[192063]: 2025-10-02 12:09:50.432 2 DEBUG oslo_concurrency.lockutils [req-77635c69-26f8-484b-824b-a8e387d64089 req-69ad2fa1-901a-46b3-a713-c25a97dffd37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-272390c4-59b3-4d2c-bd09-9ceeffd7b19c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:50 np0005466012 nova_compute[192063]: 2025-10-02 12:09:50.433 2 DEBUG oslo_concurrency.lockutils [req-77635c69-26f8-484b-824b-a8e387d64089 req-69ad2fa1-901a-46b3-a713-c25a97dffd37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-272390c4-59b3-4d2c-bd09-9ceeffd7b19c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:50 np0005466012 nova_compute[192063]: 2025-10-02 12:09:50.433 2 DEBUG nova.network.neutron [req-77635c69-26f8-484b-824b-a8e387d64089 req-69ad2fa1-901a-46b3-a713-c25a97dffd37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Refreshing network info cache for port 4f715879-b984-448f-b777-8f1883f96f4d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:50 np0005466012 nova_compute[192063]: 2025-10-02 12:09:50.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.691 2 DEBUG oslo_concurrency.lockutils [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.691 2 DEBUG oslo_concurrency.lockutils [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.692 2 DEBUG oslo_concurrency.lockutils [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.693 2 DEBUG oslo_concurrency.lockutils [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.693 2 DEBUG oslo_concurrency.lockutils [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.718 2 DEBUG nova.network.neutron [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "address": "fa:16:3e:2a:43:1c", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4d9a52e-74", "ovs_interfaceid": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.873 2 INFO nova.compute.manager [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Terminating instance#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.918 2 DEBUG oslo_concurrency.lockutils [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Releasing lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.920 2 DEBUG nova.network.neutron [req-77635c69-26f8-484b-824b-a8e387d64089 req-69ad2fa1-901a-46b3-a713-c25a97dffd37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Updated VIF entry in instance network info cache for port 4f715879-b984-448f-b777-8f1883f96f4d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.921 2 DEBUG nova.network.neutron [req-77635c69-26f8-484b-824b-a8e387d64089 req-69ad2fa1-901a-46b3-a713-c25a97dffd37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Updating instance_info_cache with network_info: [{"id": "4f715879-b984-448f-b777-8f1883f96f4d", "address": "fa:16:3e:af:fc:25", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f715879-b9", "ovs_interfaceid": "4f715879-b984-448f-b777-8f1883f96f4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.922 2 DEBUG oslo_concurrency.lockutils [req-4a92366e-055a-4d12-895b-7810ca87374a req-abbaff6c-5a2a-415f-9cec-1a5005a39dca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.923 2 DEBUG nova.network.neutron [req-4a92366e-055a-4d12-895b-7810ca87374a req-abbaff6c-5a2a-415f-9cec-1a5005a39dca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Refreshing network info cache for port 1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.927 2 DEBUG nova.virt.libvirt.vif [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.927 2 DEBUG nova.network.os_vif_util [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.928 2 DEBUG nova.network.os_vif_util [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:9a:48,bridge_name='br-int',has_traffic_filtering=True,id=1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1fd711ff-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.929 2 DEBUG os_vif [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:9a:48,bridge_name='br-int',has_traffic_filtering=True,id=1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1fd711ff-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.933 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.933 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.936 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1fd711ff-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.936 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1fd711ff-37, col_values=(('external_ids', {'iface-id': '1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:9a:48', 'vm-uuid': '0adb6d19-d425-4600-9dd0-ca11095b3c59'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:52 np0005466012 NetworkManager[51207]: <info>  [1759406992.9391] manager: (tap1fd711ff-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.951 2 DEBUG oslo_concurrency.lockutils [req-77635c69-26f8-484b-824b-a8e387d64089 req-69ad2fa1-901a-46b3-a713-c25a97dffd37 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-272390c4-59b3-4d2c-bd09-9ceeffd7b19c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.954 2 INFO os_vif [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:9a:48,bridge_name='br-int',has_traffic_filtering=True,id=1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1fd711ff-37')#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.955 2 DEBUG nova.virt.libvirt.vif [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.955 2 DEBUG nova.network.os_vif_util [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.956 2 DEBUG nova.network.os_vif_util [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:9a:48,bridge_name='br-int',has_traffic_filtering=True,id=1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1fd711ff-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.960 2 DEBUG nova.virt.libvirt.guest [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] attach device xml: <interface type="ethernet">
Oct  2 08:09:52 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:24:9a:48"/>
Oct  2 08:09:52 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:09:52 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:09:52 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:09:52 np0005466012 nova_compute[192063]:  <target dev="tap1fd711ff-37"/>
Oct  2 08:09:52 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:09:52 np0005466012 nova_compute[192063]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:09:52 np0005466012 kernel: tap1fd711ff-37: entered promiscuous mode
Oct  2 08:09:52 np0005466012 NetworkManager[51207]: <info>  [1759406992.9730] manager: (tap1fd711ff-37): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:52 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:52Z|00176|binding|INFO|Claiming lport 1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e for this chassis.
Oct  2 08:09:52 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:52Z|00177|binding|INFO|1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e: Claiming fa:16:3e:24:9a:48 10.100.0.4
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.981 2 DEBUG nova.compute.manager [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:52 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:52Z|00178|binding|INFO|Setting lport 1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e ovn-installed in OVS
Oct  2 08:09:52 np0005466012 nova_compute[192063]: 2025-10-02 12:09:52.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 kernel: tap4f715879-b9 (unregistering): left promiscuous mode
Oct  2 08:09:53 np0005466012 NetworkManager[51207]: <info>  [1759406993.0153] device (tap4f715879-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:53Z|00179|binding|INFO|Releasing lport 4f715879-b984-448f-b777-8f1883f96f4d from this chassis (sb_readonly=1)
Oct  2 08:09:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:53Z|00180|binding|INFO|Removing iface tap4f715879-b9 ovn-installed in OVS
Oct  2 08:09:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:53Z|00181|if_status|INFO|Dropped 2 log messages in last 141 seconds (most recently, 141 seconds ago) due to excessive rate
Oct  2 08:09:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:53Z|00182|if_status|INFO|Not setting lport 4f715879-b984-448f-b777-8f1883f96f4d down as sb is readonly
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 systemd-udevd[226435]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 NetworkManager[51207]: <info>  [1759406993.0469] device (tap1fd711ff-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:09:53 np0005466012 NetworkManager[51207]: <info>  [1759406993.0480] device (tap1fd711ff-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:09:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:53Z|00183|binding|INFO|Setting lport 4f715879-b984-448f-b777-8f1883f96f4d down in Southbound
Oct  2 08:09:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:53Z|00184|binding|INFO|Setting lport 1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e up in Southbound
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.065 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:9a:48 10.100.0.4'], port_security=['fa:16:3e:24:9a:48 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-2064859050', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-2064859050', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e26b972b-3ab5-401c-9d8b-5161665ba680', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.066 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 bound to our chassis#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.067 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d845a33-56e0-4850-9f27-8a54095796f2#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.081 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a01488fa-c076-4ffd-82aa-542dee05fcf5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000031.scope: Deactivated successfully.
Oct  2 08:09:53 np0005466012 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000031.scope: Consumed 13.620s CPU time.
Oct  2 08:09:53 np0005466012 systemd-machined[152114]: Machine qemu-23-instance-00000031 terminated.
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.111 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[48ab12dd-8a72-4ff2-b2fe-1a3b9b2e879f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.115 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ac12364e-4665-4ef7-bb97-1716693bd3b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.145 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ac03f61f-2e24-46f8-8215-abe58cc04d83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.161 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[134161c0-0c65-4658-a632-ad3a55c57c58]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 10, 'rx_bytes': 1084, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 10, 'rx_bytes': 1084, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493821, 'reachable_time': 17654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226446, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.177 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[366a8f82-0722-4bf6-9ac8-fb4a00dd8cba]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493835, 'tstamp': 493835}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226447, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493839, 'tstamp': 493839}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226447, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.181 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.186 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:fc:25 10.100.0.5'], port_security=['fa:16:3e:af:fc:25 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '272390c4-59b3-4d2c-bd09-9ceeffd7b19c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '53cd9990789640a5b5e28b5beb8b222b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '451fc8d0-64dd-41c6-91ef-df444df65e30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9845a444-54df-440f-9a26-b835473c9d1b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4f715879-b984-448f-b777-8f1883f96f4d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.189 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d845a33-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.189 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.189 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d845a33-50, col_values=(('external_ids', {'iface-id': '1c321c19-d630-4a6f-8ba8-7bac90af9bae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.190 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.191 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4f715879-b984-448f-b777-8f1883f96f4d in datapath e3531c03-dcc1-4c2a-981f-8534850ce14f unbound from our chassis#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.192 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e3531c03-dcc1-4c2a-981f-8534850ce14f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.193 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d559d47f-17e8-4951-b499-63c667c36316]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.193 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f namespace which is not needed anymore#033[00m
Oct  2 08:09:53 np0005466012 kernel: tap4f715879-b9: entered promiscuous mode
Oct  2 08:09:53 np0005466012 NetworkManager[51207]: <info>  [1759406993.2029] manager: (tap4f715879-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Oct  2 08:09:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:53Z|00185|binding|INFO|Claiming lport 4f715879-b984-448f-b777-8f1883f96f4d for this chassis.
Oct  2 08:09:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:53Z|00186|binding|INFO|4f715879-b984-448f-b777-8f1883f96f4d: Claiming fa:16:3e:af:fc:25 10.100.0.5
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 kernel: tap4f715879-b9 (unregistering): left promiscuous mode
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.224 2 DEBUG nova.virt.libvirt.driver [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:53Z|00187|binding|INFO|Setting lport 4f715879-b984-448f-b777-8f1883f96f4d ovn-installed in OVS
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.224 2 DEBUG nova.virt.libvirt.driver [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.225 2 DEBUG nova.virt.libvirt.driver [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:b5:e2:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.225 2 DEBUG nova.virt.libvirt.driver [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:ee:8f:26, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.225 2 DEBUG nova.virt.libvirt.driver [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:2a:43:1c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.225 2 DEBUG nova.virt.libvirt.driver [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:24:9a:48, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:53Z|00188|binding|INFO|Releasing lport 4f715879-b984-448f-b777-8f1883f96f4d from this chassis (sb_readonly=0)
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.265 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:fc:25 10.100.0.5'], port_security=['fa:16:3e:af:fc:25 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '272390c4-59b3-4d2c-bd09-9ceeffd7b19c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '53cd9990789640a5b5e28b5beb8b222b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '451fc8d0-64dd-41c6-91ef-df444df65e30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9845a444-54df-440f-9a26-b835473c9d1b, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4f715879-b984-448f-b777-8f1883f96f4d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.265 2 INFO nova.virt.libvirt.driver [-] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Instance destroyed successfully.#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.265 2 DEBUG nova.objects.instance [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lazy-loading 'resources' on Instance uuid 272390c4-59b3-4d2c-bd09-9ceeffd7b19c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.291 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:fc:25 10.100.0.5'], port_security=['fa:16:3e:af:fc:25 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '272390c4-59b3-4d2c-bd09-9ceeffd7b19c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '53cd9990789640a5b5e28b5beb8b222b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '451fc8d0-64dd-41c6-91ef-df444df65e30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9845a444-54df-440f-9a26-b835473c9d1b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4f715879-b984-448f-b777-8f1883f96f4d) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.295 2 DEBUG nova.virt.libvirt.guest [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:53 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1675134395</nova:name>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:09:53</nova:creationTime>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    <nova:port uuid="88b42333-8838-4199-ab64-5b879b907aa5">
Oct  2 08:09:53 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    <nova:port uuid="ee2528cd-3da5-4a68-9377-ee66d47d9945">
Oct  2 08:09:53 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    <nova:port uuid="e4d9a52e-74d6-4407-a156-6fa4bcad133c">
Oct  2 08:09:53 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    <nova:port uuid="1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e">
Oct  2 08:09:53 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:53 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:09:53 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:09:53 np0005466012 nova_compute[192063]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.321 2 DEBUG nova.virt.libvirt.vif [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:09:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1016764031',display_name='tempest-FloatingIPsAssociationTestJSON-server-1016764031',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1016764031',id=49,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='53cd9990789640a5b5e28b5beb8b222b',ramdisk_id='',reservation_id='r-ga000w72',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-132128146',owner_user_name='tempest-FloatingIPsAssociationTestJSON-132128146-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:31Z,user_data=None,user_id='725180cfb6174d38a53f3965d04a4916',uuid=272390c4-59b3-4d2c-bd09-9ceeffd7b19c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f715879-b984-448f-b777-8f1883f96f4d", "address": "fa:16:3e:af:fc:25", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f715879-b9", "ovs_interfaceid": "4f715879-b984-448f-b777-8f1883f96f4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.321 2 DEBUG nova.network.os_vif_util [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Converting VIF {"id": "4f715879-b984-448f-b777-8f1883f96f4d", "address": "fa:16:3e:af:fc:25", "network": {"id": "e3531c03-dcc1-4c2a-981f-8534850ce14f", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1609714742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "53cd9990789640a5b5e28b5beb8b222b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f715879-b9", "ovs_interfaceid": "4f715879-b984-448f-b777-8f1883f96f4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.322 2 DEBUG nova.network.os_vif_util [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:af:fc:25,bridge_name='br-int',has_traffic_filtering=True,id=4f715879-b984-448f-b777-8f1883f96f4d,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f715879-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.322 2 DEBUG os_vif [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:fc:25,bridge_name='br-int',has_traffic_filtering=True,id=4f715879-b984-448f-b777-8f1883f96f4d,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f715879-b9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.324 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f715879-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.329 2 INFO os_vif [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:fc:25,bridge_name='br-int',has_traffic_filtering=True,id=4f715879-b984-448f-b777-8f1883f96f4d,network=Network(e3531c03-dcc1-4c2a-981f-8534850ce14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f715879-b9')#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.330 2 INFO nova.virt.libvirt.driver [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Deleting instance files /var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c_del#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.331 2 INFO nova.virt.libvirt.driver [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Deletion of /var/lib/nova/instances/272390c4-59b3-4d2c-bd09-9ceeffd7b19c_del complete#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.360 2 DEBUG oslo_concurrency.lockutils [None req-3ba16269-b7e2-44cd-9635-be7f10c95c44 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-0adb6d19-d425-4600-9dd0-ca11095b3c59-1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 9.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:53 np0005466012 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[226215]: [NOTICE]   (226248) : haproxy version is 2.8.14-c23fe91
Oct  2 08:09:53 np0005466012 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[226215]: [NOTICE]   (226248) : path to executable is /usr/sbin/haproxy
Oct  2 08:09:53 np0005466012 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[226215]: [WARNING]  (226248) : Exiting Master process...
Oct  2 08:09:53 np0005466012 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[226215]: [ALERT]    (226248) : Current worker (226255) exited with code 143 (Terminated)
Oct  2 08:09:53 np0005466012 neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f[226215]: [WARNING]  (226248) : All workers exited. Exiting... (0)
Oct  2 08:09:53 np0005466012 systemd[1]: libpod-70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417.scope: Deactivated successfully.
Oct  2 08:09:53 np0005466012 podman[226474]: 2025-10-02 12:09:53.405696208 +0000 UTC m=+0.101721500 container died 70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.479 2 INFO nova.compute.manager [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Took 0.50 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.479 2 DEBUG oslo.service.loopingcall [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.480 2 DEBUG nova.compute.manager [-] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.480 2 DEBUG nova.network.neutron [-] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:09:53 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417-userdata-shm.mount: Deactivated successfully.
Oct  2 08:09:53 np0005466012 systemd[1]: var-lib-containers-storage-overlay-fd928830dd570f0a133165d63af68b2ed97cda1b4528f07d9f2b1272ece7fb35-merged.mount: Deactivated successfully.
Oct  2 08:09:53 np0005466012 podman[226474]: 2025-10-02 12:09:53.504327452 +0000 UTC m=+0.200352744 container cleanup 70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:09:53 np0005466012 systemd[1]: libpod-conmon-70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417.scope: Deactivated successfully.
Oct  2 08:09:53 np0005466012 podman[226502]: 2025-10-02 12:09:53.578066477 +0000 UTC m=+0.053439420 container remove 70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.583 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[641ef1f8-8e80-45e6-9bab-c7f5e1eef73a]: (4, ('Thu Oct  2 12:09:53 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f (70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417)\n70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417\nThu Oct  2 12:09:53 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f (70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417)\n70c70d9dfb52176a112e2af494f9d8b78926939380eed62f6e01e35da7bef417\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.584 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8ec1bb31-f5fa-49a9-ba81-958288faaae4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.585 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3531c03-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 kernel: tape3531c03-d0: left promiscuous mode
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.643 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[de15b406-32db-4c3d-ba90-8c5089b2d23c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.682 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8e56ed34-50cb-45bf-8d37-cbb7de16752a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.683 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca0a36c-a1b3-4ec9-acad-a66de7cc92dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.698 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e59b6d93-2c5b-42dc-99dd-3745bb0a48fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496452, 'reachable_time': 28298, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226518, 'error': None, 'target': 'ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.701 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e3531c03-dcc1-4c2a-981f-8534850ce14f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.701 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[da259625-b5cd-4837-b4a3-cd117394e15a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.701 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4f715879-b984-448f-b777-8f1883f96f4d in datapath e3531c03-dcc1-4c2a-981f-8534850ce14f unbound from our chassis#033[00m
Oct  2 08:09:53 np0005466012 systemd[1]: run-netns-ovnmeta\x2de3531c03\x2ddcc1\x2d4c2a\x2d981f\x2d8534850ce14f.mount: Deactivated successfully.
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.703 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e3531c03-dcc1-4c2a-981f-8534850ce14f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.703 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc95420-2160-4abf-bc79-92c8f7692a01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.704 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4f715879-b984-448f-b777-8f1883f96f4d in datapath e3531c03-dcc1-4c2a-981f-8534850ce14f unbound from our chassis#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.705 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e3531c03-dcc1-4c2a-981f-8534850ce14f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:09:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:53.705 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c73ea264-a03a-46f4-9ead-9b1d88c65c54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.906 2 DEBUG nova.compute.manager [req-870e2218-805e-4224-ba23-8fae84fed8f5 req-7a570e39-6c71-4ca7-9b92-ce76ca0aad2b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-plugged-1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.907 2 DEBUG oslo_concurrency.lockutils [req-870e2218-805e-4224-ba23-8fae84fed8f5 req-7a570e39-6c71-4ca7-9b92-ce76ca0aad2b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.907 2 DEBUG oslo_concurrency.lockutils [req-870e2218-805e-4224-ba23-8fae84fed8f5 req-7a570e39-6c71-4ca7-9b92-ce76ca0aad2b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.907 2 DEBUG oslo_concurrency.lockutils [req-870e2218-805e-4224-ba23-8fae84fed8f5 req-7a570e39-6c71-4ca7-9b92-ce76ca0aad2b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.908 2 DEBUG nova.compute.manager [req-870e2218-805e-4224-ba23-8fae84fed8f5 req-7a570e39-6c71-4ca7-9b92-ce76ca0aad2b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] No waiting events found dispatching network-vif-plugged-1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.908 2 WARNING nova.compute.manager [req-870e2218-805e-4224-ba23-8fae84fed8f5 req-7a570e39-6c71-4ca7-9b92-ce76ca0aad2b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received unexpected event network-vif-plugged-1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e for instance with vm_state active and task_state None.#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.978 2 DEBUG nova.compute.manager [req-dcf70a92-10aa-4e51-9d41-c52cd72998ac req-521c4206-37fe-42df-a27c-ea0a0bda8ccc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Received event network-vif-unplugged-4f715879-b984-448f-b777-8f1883f96f4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.979 2 DEBUG oslo_concurrency.lockutils [req-dcf70a92-10aa-4e51-9d41-c52cd72998ac req-521c4206-37fe-42df-a27c-ea0a0bda8ccc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.979 2 DEBUG oslo_concurrency.lockutils [req-dcf70a92-10aa-4e51-9d41-c52cd72998ac req-521c4206-37fe-42df-a27c-ea0a0bda8ccc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.979 2 DEBUG oslo_concurrency.lockutils [req-dcf70a92-10aa-4e51-9d41-c52cd72998ac req-521c4206-37fe-42df-a27c-ea0a0bda8ccc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.980 2 DEBUG nova.compute.manager [req-dcf70a92-10aa-4e51-9d41-c52cd72998ac req-521c4206-37fe-42df-a27c-ea0a0bda8ccc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] No waiting events found dispatching network-vif-unplugged-4f715879-b984-448f-b777-8f1883f96f4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:53 np0005466012 nova_compute[192063]: 2025-10-02 12:09:53.980 2 DEBUG nova.compute.manager [req-dcf70a92-10aa-4e51-9d41-c52cd72998ac req-521c4206-37fe-42df-a27c-ea0a0bda8ccc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Received event network-vif-unplugged-4f715879-b984-448f-b777-8f1883f96f4d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:09:54 np0005466012 nova_compute[192063]: 2025-10-02 12:09:54.706 2 DEBUG nova.network.neutron [-] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:54 np0005466012 nova_compute[192063]: 2025-10-02 12:09:54.799 2 DEBUG nova.compute.manager [req-ab2c72b3-3080-490a-88b2-edda6be1babf req-773cb370-43a3-4de7-bd35-23422211817c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Received event network-vif-deleted-4f715879-b984-448f-b777-8f1883f96f4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:54 np0005466012 nova_compute[192063]: 2025-10-02 12:09:54.799 2 INFO nova.compute.manager [req-ab2c72b3-3080-490a-88b2-edda6be1babf req-773cb370-43a3-4de7-bd35-23422211817c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Neutron deleted interface 4f715879-b984-448f-b777-8f1883f96f4d; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:09:54 np0005466012 nova_compute[192063]: 2025-10-02 12:09:54.800 2 DEBUG nova.network.neutron [req-ab2c72b3-3080-490a-88b2-edda6be1babf req-773cb370-43a3-4de7-bd35-23422211817c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:54 np0005466012 nova_compute[192063]: 2025-10-02 12:09:54.937 2 INFO nova.compute.manager [-] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Took 1.46 seconds to deallocate network for instance.#033[00m
Oct  2 08:09:54 np0005466012 nova_compute[192063]: 2025-10-02 12:09:54.942 2 DEBUG nova.compute.manager [req-ab2c72b3-3080-490a-88b2-edda6be1babf req-773cb370-43a3-4de7-bd35-23422211817c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Detach interface failed, port_id=4f715879-b984-448f-b777-8f1883f96f4d, reason: Instance 272390c4-59b3-4d2c-bd09-9ceeffd7b19c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:09:54 np0005466012 nova_compute[192063]: 2025-10-02 12:09:54.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.024 2 DEBUG nova.network.neutron [req-4a92366e-055a-4d12-895b-7810ca87374a req-abbaff6c-5a2a-415f-9cec-1a5005a39dca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updated VIF entry in instance network info cache for port 1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.025 2 DEBUG nova.network.neutron [req-4a92366e-055a-4d12-895b-7810ca87374a req-abbaff6c-5a2a-415f-9cec-1a5005a39dca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "address": "fa:16:3e:2a:43:1c", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4d9a52e-74", "ovs_interfaceid": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.078 2 DEBUG oslo_concurrency.lockutils [req-4a92366e-055a-4d12-895b-7810ca87374a req-abbaff6c-5a2a-415f-9cec-1a5005a39dca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.108 2 DEBUG oslo_concurrency.lockutils [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.109 2 DEBUG oslo_concurrency.lockutils [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.194 2 DEBUG nova.compute.provider_tree [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.258 2 DEBUG nova.scheduler.client.report [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:09:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:55Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:9a:48 10.100.0.4
Oct  2 08:09:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:55Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:9a:48 10.100.0.4
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.457 2 DEBUG oslo_concurrency.lockutils [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.527 2 INFO nova.scheduler.client.report [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Deleted allocations for instance 272390c4-59b3-4d2c-bd09-9ceeffd7b19c#033[00m
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.644 2 DEBUG oslo_concurrency.lockutils [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "interface-0adb6d19-d425-4600-9dd0-ca11095b3c59-ee2528cd-3da5-4a68-9377-ee66d47d9945" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.644 2 DEBUG oslo_concurrency.lockutils [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-0adb6d19-d425-4600-9dd0-ca11095b3c59-ee2528cd-3da5-4a68-9377-ee66d47d9945" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.888 2 DEBUG nova.objects.instance [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'flavor' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:55 np0005466012 nova_compute[192063]: 2025-10-02 12:09:55.971 2 DEBUG oslo_concurrency.lockutils [None req-554a00c2-11c6-4a2c-8321-6520b8871876 725180cfb6174d38a53f3965d04a4916 53cd9990789640a5b5e28b5beb8b222b - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.043 2 DEBUG nova.compute.manager [req-b1a2ad29-4dd1-4eed-9493-06381fb4f918 req-0533a17a-6e08-4ef6-9e4c-e2353121edc0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-plugged-1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.044 2 DEBUG oslo_concurrency.lockutils [req-b1a2ad29-4dd1-4eed-9493-06381fb4f918 req-0533a17a-6e08-4ef6-9e4c-e2353121edc0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.044 2 DEBUG oslo_concurrency.lockutils [req-b1a2ad29-4dd1-4eed-9493-06381fb4f918 req-0533a17a-6e08-4ef6-9e4c-e2353121edc0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.044 2 DEBUG oslo_concurrency.lockutils [req-b1a2ad29-4dd1-4eed-9493-06381fb4f918 req-0533a17a-6e08-4ef6-9e4c-e2353121edc0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.044 2 DEBUG nova.compute.manager [req-b1a2ad29-4dd1-4eed-9493-06381fb4f918 req-0533a17a-6e08-4ef6-9e4c-e2353121edc0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] No waiting events found dispatching network-vif-plugged-1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.044 2 WARNING nova.compute.manager [req-b1a2ad29-4dd1-4eed-9493-06381fb4f918 req-0533a17a-6e08-4ef6-9e4c-e2353121edc0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received unexpected event network-vif-plugged-1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e for instance with vm_state active and task_state None.#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.047 2 DEBUG nova.virt.libvirt.vif [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.048 2 DEBUG nova.network.os_vif_util [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.048 2 DEBUG nova.network.os_vif_util [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:8f:26,bridge_name='br-int',has_traffic_filtering=True,id=ee2528cd-3da5-4a68-9377-ee66d47d9945,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee2528cd-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.050 2 DEBUG nova.virt.libvirt.guest [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ee:8f:26"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapee2528cd-3d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.052 2 DEBUG nova.virt.libvirt.guest [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ee:8f:26"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapee2528cd-3d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.054 2 DEBUG nova.virt.libvirt.driver [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Attempting to detach device tapee2528cd-3d from instance 0adb6d19-d425-4600-9dd0-ca11095b3c59 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.054 2 DEBUG nova.virt.libvirt.guest [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:ee:8f:26"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <target dev="tapee2528cd-3d"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:09:56 np0005466012 nova_compute[192063]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.060 2 DEBUG nova.virt.libvirt.guest [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ee:8f:26"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapee2528cd-3d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.063 2 DEBUG nova.virt.libvirt.guest [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ee:8f:26"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapee2528cd-3d"/></interface>not found in domain: <domain type='kvm' id='22'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <name>instance-0000002e</name>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <uuid>0adb6d19-d425-4600-9dd0-ca11095b3c59</uuid>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1675134395</nova:name>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:09:53</nova:creationTime>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:port uuid="88b42333-8838-4199-ab64-5b879b907aa5">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:port uuid="ee2528cd-3da5-4a68-9377-ee66d47d9945">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:port uuid="e4d9a52e-74d6-4407-a156-6fa4bcad133c">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:port uuid="1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:09:56 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <memory unit='KiB'>131072</memory>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <resource>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <partition>/machine</partition>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </resource>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <sysinfo type='smbios'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <entry name='serial'>0adb6d19-d425-4600-9dd0-ca11095b3c59</entry>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <entry name='uuid'>0adb6d19-d425-4600-9dd0-ca11095b3c59</entry>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <boot dev='hd'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <smbios mode='sysinfo'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <vmcoreinfo state='on'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <feature policy='require' name='x2apic'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <feature policy='require' name='vme'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <clock offset='utc'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <timer name='hpet' present='no'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <on_reboot>restart</on_reboot>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <on_crash>destroy</on_crash>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <disk type='file' device='disk'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk' index='2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <backingStore type='file' index='3'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:        <format type='raw'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:        <backingStore/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      </backingStore>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target dev='vda' bus='virtio'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='virtio-disk0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <disk type='file' device='cdrom'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.config' index='1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <backingStore/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target dev='sda' bus='sata'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <readonly/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='sata0-0-0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pcie.0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='1' port='0x10'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='2' port='0x11'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='3' port='0x12'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.3'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='4' port='0x13'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.4'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='5' port='0x14'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.5'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='6' port='0x15'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.6'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='7' port='0x16'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.7'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='8' port='0x17'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.8'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='9' port='0x18'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.9'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='10' port='0x19'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.10'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='11' port='0x1a'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.11'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='12' port='0x1b'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.12'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='13' port='0x1c'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.13'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='14' port='0x1d'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.14'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='15' port='0x1e'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.15'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='16' port='0x1f'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.16'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='17' port='0x20'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.17'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='18' port='0x21'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.18'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='19' port='0x22'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.19'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='20' port='0x23'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.20'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='21' port='0x24'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.21'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='22' port='0x25'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.22'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='23' port='0x26'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.23'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='24' port='0x27'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.24'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='25' port='0x28'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.25'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-pci-bridge'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.26'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='usb'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='sata' index='0'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='ide'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:b5:e2:ea'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target dev='tap88b42333-88'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='net0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:ee:8f:26'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target dev='tapee2528cd-3d'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='net1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:2a:43:1c'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target dev='tape4d9a52e-74'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='net2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:24:9a:48'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target dev='tap1fd711ff-37'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='net3'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <serial type='pty'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <source path='/dev/pts/0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/console.log' append='off'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target type='isa-serial' port='0'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:        <model name='isa-serial'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      </target>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <console type='pty' tty='/dev/pts/0'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <source path='/dev/pts/0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/console.log' append='off'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target type='serial' port='0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </console>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <input type='tablet' bus='usb'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='input0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <input type='mouse' bus='ps2'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='input1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <input type='keyboard' bus='ps2'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='input2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <listen type='address' address='::0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <audio id='1' type='none'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='video0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <watchdog model='itco' action='reset'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='watchdog0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </watchdog>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <memballoon model='virtio'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <stats period='10'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='balloon0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <rng model='virtio'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='rng0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <label>system_u:system_r:svirt_t:s0:c753,c1000</label>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c753,c1000</imagelabel>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <label>+107:+107</label>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:09:56 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:09:56 np0005466012 nova_compute[192063]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.065 2 INFO nova.virt.libvirt.driver [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully detached device tapee2528cd-3d from instance 0adb6d19-d425-4600-9dd0-ca11095b3c59 from the persistent domain config.#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.065 2 DEBUG nova.virt.libvirt.driver [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] (1/8): Attempting to detach device tapee2528cd-3d with device alias net1 from instance 0adb6d19-d425-4600-9dd0-ca11095b3c59 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.065 2 DEBUG nova.virt.libvirt.guest [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:ee:8f:26"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <target dev="tapee2528cd-3d"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:09:56 np0005466012 nova_compute[192063]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:09:56 np0005466012 kernel: tapee2528cd-3d (unregistering): left promiscuous mode
Oct  2 08:09:56 np0005466012 NetworkManager[51207]: <info>  [1759406996.1612] device (tapee2528cd-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:09:56 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:56Z|00189|binding|INFO|Releasing lport ee2528cd-3da5-4a68-9377-ee66d47d9945 from this chassis (sb_readonly=0)
Oct  2 08:09:56 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:56Z|00190|binding|INFO|Setting lport ee2528cd-3da5-4a68-9377-ee66d47d9945 down in Southbound
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:56 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:56Z|00191|binding|INFO|Removing iface tapee2528cd-3d ovn-installed in OVS
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.188 2 DEBUG nova.virt.libvirt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Received event <DeviceRemovedEvent: 1759406996.187012, 0adb6d19-d425-4600-9dd0-ca11095b3c59 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.189 2 DEBUG nova.virt.libvirt.driver [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Start waiting for the detach event from libvirt for device tapee2528cd-3d with device alias net1 for instance 0adb6d19-d425-4600-9dd0-ca11095b3c59 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.189 2 DEBUG nova.virt.libvirt.guest [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ee:8f:26"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapee2528cd-3d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.193 2 DEBUG nova.virt.libvirt.guest [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ee:8f:26"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapee2528cd-3d"/></interface>not found in domain: <domain type='kvm' id='22'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <name>instance-0000002e</name>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <uuid>0adb6d19-d425-4600-9dd0-ca11095b3c59</uuid>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1675134395</nova:name>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:09:53</nova:creationTime>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:port uuid="88b42333-8838-4199-ab64-5b879b907aa5">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:port uuid="ee2528cd-3da5-4a68-9377-ee66d47d9945">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:port uuid="e4d9a52e-74d6-4407-a156-6fa4bcad133c">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:port uuid="1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:09:56 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <memory unit='KiB'>131072</memory>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <resource>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <partition>/machine</partition>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </resource>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <sysinfo type='smbios'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <entry name='serial'>0adb6d19-d425-4600-9dd0-ca11095b3c59</entry>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <entry name='uuid'>0adb6d19-d425-4600-9dd0-ca11095b3c59</entry>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <boot dev='hd'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <smbios mode='sysinfo'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <vmcoreinfo state='on'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <feature policy='require' name='x2apic'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <feature policy='require' name='vme'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <clock offset='utc'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <timer name='hpet' present='no'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <on_reboot>restart</on_reboot>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <on_crash>destroy</on_crash>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <disk type='file' device='disk'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk' index='2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <backingStore type='file' index='3'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:        <format type='raw'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:        <backingStore/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      </backingStore>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target dev='vda' bus='virtio'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='virtio-disk0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <disk type='file' device='cdrom'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.config' index='1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <backingStore/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target dev='sda' bus='sata'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <readonly/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='sata0-0-0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pcie.0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='1' port='0x10'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='2' port='0x11'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='3' port='0x12'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.3'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='4' port='0x13'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.4'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='5' port='0x14'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.5'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='6' port='0x15'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.6'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='7' port='0x16'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.7'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='8' port='0x17'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.8'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='9' port='0x18'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.9'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='10' port='0x19'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.10'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='11' port='0x1a'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.11'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='12' port='0x1b'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.12'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='13' port='0x1c'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.13'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='14' port='0x1d'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.14'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='15' port='0x1e'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.15'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='16' port='0x1f'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.16'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='17' port='0x20'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.17'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='18' port='0x21'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.18'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='19' port='0x22'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.19'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='20' port='0x23'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.20'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='21' port='0x24'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.21'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='22' port='0x25'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.22'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='23' port='0x26'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.23'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='24' port='0x27'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.24'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target chassis='25' port='0x28'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.25'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model name='pcie-pci-bridge'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='pci.26'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='usb'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <controller type='sata' index='0'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='ide'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:b5:e2:ea'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target dev='tap88b42333-88'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='net0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:2a:43:1c'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target dev='tape4d9a52e-74'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='net2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:24:9a:48'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target dev='tap1fd711ff-37'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='net3'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <serial type='pty'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <source path='/dev/pts/0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/console.log' append='off'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target type='isa-serial' port='0'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:        <model name='isa-serial'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      </target>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <console type='pty' tty='/dev/pts/0'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <source path='/dev/pts/0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/console.log' append='off'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <target type='serial' port='0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </console>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <input type='tablet' bus='usb'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='input0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <input type='mouse' bus='ps2'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='input1'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <input type='keyboard' bus='ps2'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='input2'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <listen type='address' address='::0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <audio id='1' type='none'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='video0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <watchdog model='itco' action='reset'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='watchdog0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </watchdog>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <memballoon model='virtio'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <stats period='10'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='balloon0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <rng model='virtio'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <alias name='rng0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <label>system_u:system_r:svirt_t:s0:c753,c1000</label>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c753,c1000</imagelabel>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <label>+107:+107</label>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:09:56 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:09:56 np0005466012 nova_compute[192063]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.193 2 INFO nova.virt.libvirt.driver [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully detached device tapee2528cd-3d from instance 0adb6d19-d425-4600-9dd0-ca11095b3c59 from the live domain config.#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.195 2 DEBUG nova.virt.libvirt.vif [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.195 2 DEBUG nova.network.os_vif_util [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.196 2 DEBUG nova.network.os_vif_util [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:8f:26,bridge_name='br-int',has_traffic_filtering=True,id=ee2528cd-3da5-4a68-9377-ee66d47d9945,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee2528cd-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.197 2 DEBUG os_vif [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:8f:26,bridge_name='br-int',has_traffic_filtering=True,id=ee2528cd-3da5-4a68-9377-ee66d47d9945,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee2528cd-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.198 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee2528cd-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.204 2 INFO os_vif [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:8f:26,bridge_name='br-int',has_traffic_filtering=True,id=ee2528cd-3da5-4a68-9377-ee66d47d9945,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee2528cd-3d')#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.204 2 DEBUG nova.virt.libvirt.guest [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1675134395</nova:name>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:09:56</nova:creationTime>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:port uuid="88b42333-8838-4199-ab64-5b879b907aa5">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:port uuid="e4d9a52e-74d6-4407-a156-6fa4bcad133c">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    <nova:port uuid="1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e">
Oct  2 08:09:56 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:56 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:09:56 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:09:56 np0005466012 nova_compute[192063]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.260 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:8f:26 10.100.0.5'], port_security=['fa:16:3e:ee:8f:26 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e26b972b-3ab5-401c-9d8b-5161665ba680', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=ee2528cd-3da5-4a68-9377-ee66d47d9945) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.260 2 DEBUG nova.compute.manager [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Received event network-vif-plugged-4f715879-b984-448f-b777-8f1883f96f4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.261 2 DEBUG oslo_concurrency.lockutils [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.261 2 DEBUG oslo_concurrency.lockutils [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.261 2 DEBUG oslo_concurrency.lockutils [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "272390c4-59b3-4d2c-bd09-9ceeffd7b19c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.261 2 DEBUG nova.compute.manager [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] No waiting events found dispatching network-vif-plugged-4f715879-b984-448f-b777-8f1883f96f4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.261 2 WARNING nova.compute.manager [req-986fecea-4218-411d-b17c-609a90d91302 req-9f18987b-fd5e-4af1-b438-f5a948c9e2f4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Received unexpected event network-vif-plugged-4f715879-b984-448f-b777-8f1883f96f4d for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.261 103246 INFO neutron.agent.ovn.metadata.agent [-] Port ee2528cd-3da5-4a68-9377-ee66d47d9945 in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 unbound from our chassis#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.263 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d845a33-56e0-4850-9f27-8a54095796f2#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.277 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d1f7a50d-df75-4387-bfec-ebe8be93af88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.304 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[9a610f5e-f709-4d4a-a5e0-9c912d57f9ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.306 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[d852ae22-2ffe-458f-90d4-94decc46ba9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.334 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[a12c6141-81ff-4763-a88f-8229b56b1034]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.353 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dbdbb643-ded7-4d0b-b1c2-4f864b0971ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 12, 'rx_bytes': 1084, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 12, 'rx_bytes': 1084, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493821, 'reachable_time': 17654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226530, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.370 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dcff3d95-1d47-4f13-85ff-8fd811bbb660]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493835, 'tstamp': 493835}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226531, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493839, 'tstamp': 493839}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226531, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.371 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:56 np0005466012 nova_compute[192063]: 2025-10-02 12:09:56.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.375 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d845a33-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.375 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.375 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d845a33-50, col_values=(('external_ids', {'iface-id': '1c321c19-d630-4a6f-8ba8-7bac90af9bae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:56.376 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.559 2 DEBUG nova.compute.manager [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-deleted-ee2528cd-3da5-4a68-9377-ee66d47d9945 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.559 2 INFO nova.compute.manager [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Neutron deleted interface ee2528cd-3da5-4a68-9377-ee66d47d9945; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.560 2 DEBUG nova.network.neutron [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "address": "fa:16:3e:2a:43:1c", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4d9a52e-74", "ovs_interfaceid": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.625 2 DEBUG oslo_concurrency.lockutils [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.626 2 DEBUG oslo_concurrency.lockutils [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquired lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.626 2 DEBUG nova.network.neutron [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.889 2 DEBUG nova.objects.instance [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lazy-loading 'system_metadata' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.931 2 DEBUG nova.objects.instance [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lazy-loading 'flavor' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.957 2 DEBUG nova.virt.libvirt.vif [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.958 2 DEBUG nova.network.os_vif_util [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converting VIF {"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.958 2 DEBUG nova.network.os_vif_util [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:8f:26,bridge_name='br-int',has_traffic_filtering=True,id=ee2528cd-3da5-4a68-9377-ee66d47d9945,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee2528cd-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.961 2 DEBUG nova.virt.libvirt.guest [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ee:8f:26"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapee2528cd-3d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.963 2 DEBUG nova.virt.libvirt.guest [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ee:8f:26"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapee2528cd-3d"/></interface>not found in domain: <domain type='kvm' id='22'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <name>instance-0000002e</name>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <uuid>0adb6d19-d425-4600-9dd0-ca11095b3c59</uuid>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1675134395</nova:name>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:09:56</nova:creationTime>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:port uuid="88b42333-8838-4199-ab64-5b879b907aa5">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:port uuid="e4d9a52e-74d6-4407-a156-6fa4bcad133c">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:port uuid="1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:09:57 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <memory unit='KiB'>131072</memory>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <resource>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <partition>/machine</partition>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </resource>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <sysinfo type='smbios'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <entry name='serial'>0adb6d19-d425-4600-9dd0-ca11095b3c59</entry>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <entry name='uuid'>0adb6d19-d425-4600-9dd0-ca11095b3c59</entry>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <boot dev='hd'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <smbios mode='sysinfo'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <vmcoreinfo state='on'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <feature policy='require' name='x2apic'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <feature policy='require' name='vme'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <clock offset='utc'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <timer name='hpet' present='no'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <on_reboot>restart</on_reboot>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <on_crash>destroy</on_crash>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <disk type='file' device='disk'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk' index='2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <backingStore type='file' index='3'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:        <format type='raw'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:        <backingStore/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      </backingStore>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target dev='vda' bus='virtio'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='virtio-disk0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <disk type='file' device='cdrom'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.config' index='1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <backingStore/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target dev='sda' bus='sata'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <readonly/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='sata0-0-0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pcie.0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='1' port='0x10'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='2' port='0x11'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='3' port='0x12'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.3'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='4' port='0x13'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.4'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='5' port='0x14'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.5'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='6' port='0x15'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.6'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='7' port='0x16'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.7'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='8' port='0x17'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.8'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='9' port='0x18'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.9'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='10' port='0x19'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.10'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='11' port='0x1a'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.11'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='12' port='0x1b'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.12'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='13' port='0x1c'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.13'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='14' port='0x1d'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.14'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='15' port='0x1e'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.15'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='16' port='0x1f'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.16'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='17' port='0x20'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.17'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='18' port='0x21'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.18'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='19' port='0x22'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.19'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='20' port='0x23'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.20'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='21' port='0x24'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.21'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='22' port='0x25'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.22'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='23' port='0x26'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.23'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='24' port='0x27'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.24'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='25' port='0x28'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.25'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-pci-bridge'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.26'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='usb'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='sata' index='0'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='ide'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:b5:e2:ea'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target dev='tap88b42333-88'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='net0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:2a:43:1c'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target dev='tape4d9a52e-74'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='net2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:24:9a:48'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target dev='tap1fd711ff-37'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='net3'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <serial type='pty'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <source path='/dev/pts/0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/console.log' append='off'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target type='isa-serial' port='0'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:        <model name='isa-serial'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      </target>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <console type='pty' tty='/dev/pts/0'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <source path='/dev/pts/0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/console.log' append='off'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target type='serial' port='0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </console>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <input type='tablet' bus='usb'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='input0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <input type='mouse' bus='ps2'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='input1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <input type='keyboard' bus='ps2'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='input2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <listen type='address' address='::0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <audio id='1' type='none'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='video0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <watchdog model='itco' action='reset'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='watchdog0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </watchdog>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <memballoon model='virtio'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <stats period='10'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='balloon0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <rng model='virtio'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='rng0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <label>system_u:system_r:svirt_t:s0:c753,c1000</label>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c753,c1000</imagelabel>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <label>+107:+107</label>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:09:57 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:09:57 np0005466012 nova_compute[192063]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.964 2 DEBUG nova.virt.libvirt.guest [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ee:8f:26"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapee2528cd-3d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.967 2 DEBUG nova.virt.libvirt.guest [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ee:8f:26"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapee2528cd-3d"/></interface>not found in domain: <domain type='kvm' id='22'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <name>instance-0000002e</name>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <uuid>0adb6d19-d425-4600-9dd0-ca11095b3c59</uuid>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1675134395</nova:name>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:09:56</nova:creationTime>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:port uuid="88b42333-8838-4199-ab64-5b879b907aa5">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:port uuid="e4d9a52e-74d6-4407-a156-6fa4bcad133c">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:port uuid="1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:09:57 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <memory unit='KiB'>131072</memory>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <resource>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <partition>/machine</partition>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </resource>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <sysinfo type='smbios'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <entry name='serial'>0adb6d19-d425-4600-9dd0-ca11095b3c59</entry>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <entry name='uuid'>0adb6d19-d425-4600-9dd0-ca11095b3c59</entry>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <boot dev='hd'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <smbios mode='sysinfo'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <vmcoreinfo state='on'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <feature policy='require' name='x2apic'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <feature policy='require' name='vme'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <clock offset='utc'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <timer name='hpet' present='no'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <on_reboot>restart</on_reboot>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <on_crash>destroy</on_crash>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <disk type='file' device='disk'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk' index='2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <backingStore type='file' index='3'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:        <format type='raw'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:        <backingStore/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      </backingStore>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target dev='vda' bus='virtio'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='virtio-disk0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <disk type='file' device='cdrom'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.config' index='1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <backingStore/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target dev='sda' bus='sata'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <readonly/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='sata0-0-0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pcie.0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='1' port='0x10'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='2' port='0x11'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='3' port='0x12'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.3'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='4' port='0x13'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.4'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='5' port='0x14'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.5'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='6' port='0x15'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.6'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='7' port='0x16'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.7'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='8' port='0x17'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.8'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='9' port='0x18'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.9'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='10' port='0x19'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.10'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='11' port='0x1a'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.11'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='12' port='0x1b'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.12'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='13' port='0x1c'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.13'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='14' port='0x1d'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.14'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='15' port='0x1e'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.15'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='16' port='0x1f'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.16'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='17' port='0x20'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.17'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='18' port='0x21'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.18'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='19' port='0x22'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.19'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='20' port='0x23'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.20'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='21' port='0x24'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.21'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='22' port='0x25'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.22'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='23' port='0x26'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.23'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='24' port='0x27'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.24'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target chassis='25' port='0x28'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.25'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model name='pcie-pci-bridge'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='pci.26'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='usb'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <controller type='sata' index='0'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='ide'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:b5:e2:ea'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target dev='tap88b42333-88'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='net0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:2a:43:1c'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target dev='tape4d9a52e-74'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='net2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:24:9a:48'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target dev='tap1fd711ff-37'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='net3'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <serial type='pty'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <source path='/dev/pts/0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/console.log' append='off'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target type='isa-serial' port='0'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:        <model name='isa-serial'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      </target>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <console type='pty' tty='/dev/pts/0'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <source path='/dev/pts/0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/console.log' append='off'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <target type='serial' port='0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </console>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <input type='tablet' bus='usb'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='input0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <input type='mouse' bus='ps2'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='input1'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <input type='keyboard' bus='ps2'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='input2'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <listen type='address' address='::0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <audio id='1' type='none'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='video0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <watchdog model='itco' action='reset'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='watchdog0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </watchdog>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <memballoon model='virtio'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <stats period='10'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='balloon0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <rng model='virtio'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <alias name='rng0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <label>system_u:system_r:svirt_t:s0:c753,c1000</label>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c753,c1000</imagelabel>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <label>+107:+107</label>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:09:57 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:09:57 np0005466012 nova_compute[192063]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.967 2 WARNING nova.virt.libvirt.driver [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Detaching interface fa:16:3e:ee:8f:26 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapee2528cd-3d' not found.#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.968 2 DEBUG nova.virt.libvirt.vif [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.968 2 DEBUG nova.network.os_vif_util [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converting VIF {"id": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "address": "fa:16:3e:ee:8f:26", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee2528cd-3d", "ovs_interfaceid": "ee2528cd-3da5-4a68-9377-ee66d47d9945", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.968 2 DEBUG nova.network.os_vif_util [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:8f:26,bridge_name='br-int',has_traffic_filtering=True,id=ee2528cd-3da5-4a68-9377-ee66d47d9945,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee2528cd-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.969 2 DEBUG os_vif [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:8f:26,bridge_name='br-int',has_traffic_filtering=True,id=ee2528cd-3da5-4a68-9377-ee66d47d9945,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee2528cd-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.970 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee2528cd-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.971 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.972 2 INFO os_vif [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:8f:26,bridge_name='br-int',has_traffic_filtering=True,id=ee2528cd-3da5-4a68-9377-ee66d47d9945,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee2528cd-3d')#033[00m
Oct  2 08:09:57 np0005466012 nova_compute[192063]: 2025-10-02 12:09:57.972 2 DEBUG nova.virt.libvirt.guest [req-e2465026-fb0d-44aa-9224-433f82725935 req-7729a65d-d99c-4a4c-a37b-a379112b3840 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1675134395</nova:name>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:09:57</nova:creationTime>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:port uuid="88b42333-8838-4199-ab64-5b879b907aa5">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:port uuid="e4d9a52e-74d6-4407-a156-6fa4bcad133c">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    <nova:port uuid="1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e">
Oct  2 08:09:57 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:09:57 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:09:57 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:09:57 np0005466012 nova_compute[192063]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:09:58 np0005466012 nova_compute[192063]: 2025-10-02 12:09:58.154 2 DEBUG nova.compute.manager [req-5254bb48-8b99-412a-a8ee-7b5f1f50bedf req-f2d03e93-3ee3-4182-aac4-6e96cd441754 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-unplugged-ee2528cd-3da5-4a68-9377-ee66d47d9945 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:58 np0005466012 nova_compute[192063]: 2025-10-02 12:09:58.155 2 DEBUG oslo_concurrency.lockutils [req-5254bb48-8b99-412a-a8ee-7b5f1f50bedf req-f2d03e93-3ee3-4182-aac4-6e96cd441754 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:58 np0005466012 nova_compute[192063]: 2025-10-02 12:09:58.155 2 DEBUG oslo_concurrency.lockutils [req-5254bb48-8b99-412a-a8ee-7b5f1f50bedf req-f2d03e93-3ee3-4182-aac4-6e96cd441754 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:58 np0005466012 nova_compute[192063]: 2025-10-02 12:09:58.156 2 DEBUG oslo_concurrency.lockutils [req-5254bb48-8b99-412a-a8ee-7b5f1f50bedf req-f2d03e93-3ee3-4182-aac4-6e96cd441754 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:58 np0005466012 nova_compute[192063]: 2025-10-02 12:09:58.156 2 DEBUG nova.compute.manager [req-5254bb48-8b99-412a-a8ee-7b5f1f50bedf req-f2d03e93-3ee3-4182-aac4-6e96cd441754 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] No waiting events found dispatching network-vif-unplugged-ee2528cd-3da5-4a68-9377-ee66d47d9945 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:58 np0005466012 nova_compute[192063]: 2025-10-02 12:09:58.157 2 WARNING nova.compute.manager [req-5254bb48-8b99-412a-a8ee-7b5f1f50bedf req-f2d03e93-3ee3-4182-aac4-6e96cd441754 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received unexpected event network-vif-unplugged-ee2528cd-3da5-4a68-9377-ee66d47d9945 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:09:58 np0005466012 nova_compute[192063]: 2025-10-02 12:09:58.157 2 DEBUG nova.compute.manager [req-5254bb48-8b99-412a-a8ee-7b5f1f50bedf req-f2d03e93-3ee3-4182-aac4-6e96cd441754 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-plugged-ee2528cd-3da5-4a68-9377-ee66d47d9945 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:58 np0005466012 nova_compute[192063]: 2025-10-02 12:09:58.157 2 DEBUG oslo_concurrency.lockutils [req-5254bb48-8b99-412a-a8ee-7b5f1f50bedf req-f2d03e93-3ee3-4182-aac4-6e96cd441754 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:58 np0005466012 nova_compute[192063]: 2025-10-02 12:09:58.157 2 DEBUG oslo_concurrency.lockutils [req-5254bb48-8b99-412a-a8ee-7b5f1f50bedf req-f2d03e93-3ee3-4182-aac4-6e96cd441754 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:58 np0005466012 nova_compute[192063]: 2025-10-02 12:09:58.157 2 DEBUG oslo_concurrency.lockutils [req-5254bb48-8b99-412a-a8ee-7b5f1f50bedf req-f2d03e93-3ee3-4182-aac4-6e96cd441754 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:58 np0005466012 nova_compute[192063]: 2025-10-02 12:09:58.157 2 DEBUG nova.compute.manager [req-5254bb48-8b99-412a-a8ee-7b5f1f50bedf req-f2d03e93-3ee3-4182-aac4-6e96cd441754 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] No waiting events found dispatching network-vif-plugged-ee2528cd-3da5-4a68-9377-ee66d47d9945 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:09:58 np0005466012 nova_compute[192063]: 2025-10-02 12:09:58.158 2 WARNING nova.compute.manager [req-5254bb48-8b99-412a-a8ee-7b5f1f50bedf req-f2d03e93-3ee3-4182-aac4-6e96cd441754 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received unexpected event network-vif-plugged-ee2528cd-3da5-4a68-9377-ee66d47d9945 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.188 103246 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port c6411337-6bef-48b5-9c0c-311e4ec6f5b9 with type ""#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.189 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:9a:48 10.100.0.4'], port_security=['fa:16:3e:24:9a:48 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-2064859050', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-2064859050', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e26b972b-3ab5-401c-9d8b-5161665ba680', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.190 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 unbound from our chassis#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.191 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d845a33-56e0-4850-9f27-8a54095796f2#033[00m
Oct  2 08:09:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:59Z|00192|binding|INFO|Removing iface tap1fd711ff-37 ovn-installed in OVS
Oct  2 08:09:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:59Z|00193|binding|INFO|Removing lport 1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e ovn-installed in OVS
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.208 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a4e973e2-4f79-4eb2-98ca-b856ee96a416]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.219 2 INFO nova.network.neutron [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Port ee2528cd-3da5-4a68-9377-ee66d47d9945 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.237 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac04cdb-f2cd-4f0b-8e8d-139b074ae6bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.239 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb57c62-4db5-4274-b870-16039cad6095]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.266 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ae8c9d8a-80b8-4b83-a549-dbc213603f26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.282 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cd18efcd-fac8-4306-b0e2-09c70deab43f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 14, 'rx_bytes': 1084, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 14, 'rx_bytes': 1084, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493821, 'reachable_time': 17654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226537, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.297 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dedbe486-28d6-41bd-9119-b6bd8efeb91b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493835, 'tstamp': 493835}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226538, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493839, 'tstamp': 493839}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226538, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.299 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.301 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d845a33-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.301 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.302 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d845a33-50, col_values=(('external_ids', {'iface-id': '1c321c19-d630-4a6f-8ba8-7bac90af9bae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.302 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.348 2 DEBUG nova.compute.manager [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-deleted-1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.348 2 INFO nova.compute.manager [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Neutron deleted interface 1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.349 2 DEBUG nova.network.neutron [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "address": "fa:16:3e:2a:43:1c", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4d9a52e-74", "ovs_interfaceid": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.381 2 DEBUG nova.objects.instance [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lazy-loading 'system_metadata' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.397 2 DEBUG oslo_concurrency.lockutils [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.398 2 DEBUG oslo_concurrency.lockutils [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.398 2 DEBUG oslo_concurrency.lockutils [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.399 2 DEBUG oslo_concurrency.lockutils [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.399 2 DEBUG oslo_concurrency.lockutils [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.414 2 DEBUG nova.objects.instance [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lazy-loading 'flavor' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.419 2 INFO nova.compute.manager [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Terminating instance#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.431 2 DEBUG nova.compute.manager [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.441 2 DEBUG nova.virt.libvirt.vif [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.442 2 DEBUG nova.network.os_vif_util [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converting VIF {"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.443 2 DEBUG nova.network.os_vif_util [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:9a:48,bridge_name='br-int',has_traffic_filtering=True,id=1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1fd711ff-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:59 np0005466012 kernel: tap88b42333-88 (unregistering): left promiscuous mode
Oct  2 08:09:59 np0005466012 NetworkManager[51207]: <info>  [1759406999.4701] device (tap88b42333-88): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:09:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:59Z|00194|binding|INFO|Releasing lport 88b42333-8838-4199-ab64-5b879b907aa5 from this chassis (sb_readonly=0)
Oct  2 08:09:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:59Z|00195|binding|INFO|Setting lport 88b42333-8838-4199-ab64-5b879b907aa5 down in Southbound
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:59Z|00196|binding|INFO|Removing iface tap88b42333-88 ovn-installed in OVS
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.485 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:e2:ea 10.100.0.10'], port_security=['fa:16:3e:b5:e2:ea 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b489bfac-287c-4bcc-881b-f0347b4a08b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.246'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=88b42333-8838-4199-ab64-5b879b907aa5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.485 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 88b42333-8838-4199-ab64-5b879b907aa5 in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 unbound from our chassis#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.486 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d845a33-56e0-4850-9f27-8a54095796f2#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 kernel: tape4d9a52e-74 (unregistering): left promiscuous mode
Oct  2 08:09:59 np0005466012 NetworkManager[51207]: <info>  [1759406999.5031] device (tape4d9a52e-74): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.504 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[26d050d6-2198-46e8-af25-e1bc726baec1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:59Z|00197|binding|INFO|Releasing lport e4d9a52e-74d6-4407-a156-6fa4bcad133c from this chassis (sb_readonly=0)
Oct  2 08:09:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:59Z|00198|binding|INFO|Setting lport e4d9a52e-74d6-4407-a156-6fa4bcad133c down in Southbound
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 ovn_controller[94284]: 2025-10-02T12:09:59Z|00199|binding|INFO|Removing iface tape4d9a52e-74 ovn-installed in OVS
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.527 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:43:1c 10.100.0.7'], port_security=['fa:16:3e:2a:43:1c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0adb6d19-d425-4600-9dd0-ca11095b3c59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e26b972b-3ab5-401c-9d8b-5161665ba680', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=e4d9a52e-74d6-4407-a156-6fa4bcad133c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.530 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[412b0538-f8dc-4a06-9d71-4f315051a351]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.534 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[2cc8f6c9-67e2-4e4f-977d-230124d6f57b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 kernel: tap1fd711ff-37 (unregistering): left promiscuous mode
Oct  2 08:09:59 np0005466012 NetworkManager[51207]: <info>  [1759406999.5476] device (tap1fd711ff-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.565 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[e3186502-b8b4-4582-a6fa-303c75edf4e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.583 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4d122286-7fb8-4f7e-9d8b-7fd13544fb30]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 16, 'rx_bytes': 1084, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 16, 'rx_bytes': 1084, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493821, 'reachable_time': 17654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226563, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.598 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[30f92274-0184-4512-8a1a-4930e75a6c32]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493835, 'tstamp': 493835}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226564, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493839, 'tstamp': 493839}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226564, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.599 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.609 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d845a33-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.610 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.610 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d845a33-50, col_values=(('external_ids', {'iface-id': '1c321c19-d630-4a6f-8ba8-7bac90af9bae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.611 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.611 103246 INFO neutron.agent.ovn.metadata.agent [-] Port e4d9a52e-74d6-4407-a156-6fa4bcad133c in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 unbound from our chassis#033[00m
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.613 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d845a33-56e0-4850-9f27-8a54095796f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:09:59 np0005466012 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.614 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5b54caba-6072-4a66-9e00-ad6fb6b0f2a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:09:59 np0005466012 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002e.scope: Consumed 15.363s CPU time.
Oct  2 08:09:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:09:59.614 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 namespace which is not needed anymore#033[00m
Oct  2 08:09:59 np0005466012 systemd-machined[152114]: Machine qemu-22-instance-0000002e terminated.
Oct  2 08:09:59 np0005466012 virtqemud[191783]: cannot parse process status data
Oct  2 08:09:59 np0005466012 NetworkManager[51207]: <info>  [1759406999.6489] manager: (tap88b42333-88): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 NetworkManager[51207]: <info>  [1759406999.6622] manager: (tape4d9a52e-74): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 NetworkManager[51207]: <info>  [1759406999.6742] manager: (tap1fd711ff-37): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.707 2 DEBUG nova.virt.libvirt.guest [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:9a:48"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1fd711ff-37"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.710 2 DEBUG nova.virt.libvirt.guest [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:9a:48"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1fd711ff-37"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.712 2 DEBUG nova.virt.libvirt.driver [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Attempting to detach device tap1fd711ff-37 from instance 0adb6d19-d425-4600-9dd0-ca11095b3c59 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.712 2 DEBUG nova.virt.libvirt.guest [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:24:9a:48"/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <target dev="tap1fd711ff-37"/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:09:59 np0005466012 nova_compute[192063]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.713 2 INFO nova.virt.libvirt.driver [-] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Instance destroyed successfully.#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.713 2 DEBUG nova.objects.instance [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'resources' on Instance uuid 0adb6d19-d425-4600-9dd0-ca11095b3c59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.734 2 DEBUG nova.virt.libvirt.vif [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.735 2 DEBUG nova.network.os_vif_util [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.735 2 DEBUG nova.network.os_vif_util [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b5:e2:ea,bridge_name='br-int',has_traffic_filtering=True,id=88b42333-8838-4199-ab64-5b879b907aa5,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88b42333-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.735 2 DEBUG os_vif [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b5:e2:ea,bridge_name='br-int',has_traffic_filtering=True,id=88b42333-8838-4199-ab64-5b879b907aa5,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88b42333-88') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.737 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88b42333-88, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.746 2 INFO os_vif [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b5:e2:ea,bridge_name='br-int',has_traffic_filtering=True,id=88b42333-8838-4199-ab64-5b879b907aa5,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88b42333-88')#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.747 2 DEBUG nova.virt.libvirt.vif [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "address": "fa:16:3e:2a:43:1c", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4d9a52e-74", "ovs_interfaceid": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.747 2 DEBUG nova.network.os_vif_util [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "address": "fa:16:3e:2a:43:1c", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4d9a52e-74", "ovs_interfaceid": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.747 2 DEBUG nova.network.os_vif_util [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2a:43:1c,bridge_name='br-int',has_traffic_filtering=True,id=e4d9a52e-74d6-4407-a156-6fa4bcad133c,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4d9a52e-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.748 2 DEBUG os_vif [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2a:43:1c,bridge_name='br-int',has_traffic_filtering=True,id=e4d9a52e-74d6-4407-a156-6fa4bcad133c,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4d9a52e-74') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.749 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape4d9a52e-74, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.754 2 INFO os_vif [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2a:43:1c,bridge_name='br-int',has_traffic_filtering=True,id=e4d9a52e-74d6-4407-a156-6fa4bcad133c,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4d9a52e-74')#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.755 2 DEBUG nova.virt.libvirt.vif [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.755 2 DEBUG nova.network.os_vif_util [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.756 2 DEBUG nova.network.os_vif_util [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:9a:48,bridge_name='br-int',has_traffic_filtering=True,id=1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1fd711ff-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.756 2 DEBUG os_vif [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:9a:48,bridge_name='br-int',has_traffic_filtering=True,id=1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1fd711ff-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.757 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1fd711ff-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.760 2 INFO os_vif [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:9a:48,bridge_name='br-int',has_traffic_filtering=True,id=1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1fd711ff-37')#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.761 2 INFO nova.virt.libvirt.driver [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Deleting instance files /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59_del#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.761 2 INFO nova.virt.libvirt.driver [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Deletion of /var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59_del complete#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.804 2 DEBUG nova.virt.libvirt.guest [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:9a:48"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1fd711ff-37"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.807 2 DEBUG nova.virt.libvirt.guest [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:24:9a:48"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1fd711ff-37"/></interface>not found in domain: <domain type='kvm'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <name>instance-0000002e</name>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <uuid>0adb6d19-d425-4600-9dd0-ca11095b3c59</uuid>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <nova:name>tempest-AttachInterfacesTestJSON-server-1675134395</nova:name>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:09:03</nova:creationTime>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:09:59 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:        <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:        <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:        <nova:port uuid="88b42333-8838-4199-ab64-5b879b907aa5">
Oct  2 08:09:59 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <memory unit='KiB'>131072</memory>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <sysinfo type='smbios'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <entry name='serial'>0adb6d19-d425-4600-9dd0-ca11095b3c59</entry>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <entry name='uuid'>0adb6d19-d425-4600-9dd0-ca11095b3c59</entry>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <boot dev='hd'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <smbios mode='sysinfo'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <vmcoreinfo state='on'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <cpu mode='custom' match='exact' check='partial'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <model fallback='allow'>Nehalem</model>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <clock offset='utc'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <timer name='hpet' present='no'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <on_reboot>restart</on_reboot>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <on_crash>destroy</on_crash>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <disk type='file' device='disk'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target dev='vda' bus='virtio'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <disk type='file' device='cdrom'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/disk.config'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target dev='sda' bus='sata'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <readonly/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='0' model='pcie-root'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='1' port='0x10'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='2' port='0x11'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='3' port='0x12'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='4' port='0x13'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='5' port='0x14'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='6' port='0x15'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='7' port='0x16'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='8' port='0x17'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='9' port='0x18'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='10' port='0x19'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='11' port='0x1a'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='12' port='0x1b'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='13' port='0x1c'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='14' port='0x1d'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='15' port='0x1e'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='16' port='0x1f'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='17' port='0x20'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='18' port='0x21'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='19' port='0x22'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='20' port='0x23'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='21' port='0x24'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='22' port='0x25'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='23' port='0x26'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='24' port='0x27'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target chassis='25' port='0x28'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model name='pcie-pci-bridge'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <controller type='sata' index='0'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:b5:e2:ea'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target dev='tap88b42333-88'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:2a:43:1c'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target dev='tape4d9a52e-74'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <serial type='pty'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/console.log' append='off'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target type='isa-serial' port='0'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:        <model name='isa-serial'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      </target>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <console type='pty'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/0adb6d19-d425-4600-9dd0-ca11095b3c59/console.log' append='off'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <target type='serial' port='0'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </console>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <input type='tablet' bus='usb'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <input type='mouse' bus='ps2'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <input type='keyboard' bus='ps2'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <graphics type='vnc' port='-1' autoport='yes' listen='::0'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <listen type='address' address='::0'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <audio id='1' type='none'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <watchdog model='itco' action='reset'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <memballoon model='virtio'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <stats period='10'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    <rng model='virtio'>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:09:59 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:09:59 np0005466012 nova_compute[192063]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.807 2 INFO nova.virt.libvirt.driver [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Successfully detached device tap1fd711ff-37 from instance 0adb6d19-d425-4600-9dd0-ca11095b3c59 from the persistent domain config.
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.807 2 DEBUG nova.virt.libvirt.driver [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] (1/8): Attempting to detach device tap1fd711ff-37 with device alias None from instance 0adb6d19-d425-4600-9dd0-ca11095b3c59 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.807 2 DEBUG nova.virt.libvirt.guest [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:24:9a:48"/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]:  <target dev="tap1fd711ff-37"/>
Oct  2 08:09:59 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:09:59 np0005466012 nova_compute[192063]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.809 2 DEBUG nova.virt.libvirt.driver [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Libvirt returned error while detaching device tap1fd711ff-37 from instance 0adb6d19-d425-4600-9dd0-ca11095b3c59. Libvirt error code: 55, error message: Requested operation is not valid: domain is not running. _detach_sync /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2667
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.809 2 WARNING nova.virt.libvirt.driver [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Unexpected libvirt error while detaching device tap1fd711ff-37 from instance 0adb6d19-d425-4600-9dd0-ca11095b3c59: Requested operation is not valid: domain is not running: libvirt.libvirtError: Requested operation is not valid: domain is not running
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.810 2 DEBUG nova.virt.libvirt.vif [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1675134395',display_name='tempest-AttachInterfacesTestJSON-server-1675134395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1675134395',id=46,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDdqza4DpitGNBFsk1Q1ZBqpWuxQf7qVLJPqXf+pHWH82X9+qdU/o7hAWgqY/ErxL3Rl1Xw+jsIYYSyqwATuq53eGoroF7gTmYiqDZwAwDjP4y2AyNbqs3iUfBisyTXBdQ==',key_name='tempest-keypair-143378812',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-d07ak9yh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:09:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=0adb6d19-d425-4600-9dd0-ca11095b3c59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.810 2 DEBUG nova.network.os_vif_util [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converting VIF {"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.811 2 DEBUG nova.network.os_vif_util [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:9a:48,bridge_name='br-int',has_traffic_filtering=True,id=1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1fd711ff-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.811 2 DEBUG os_vif [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:9a:48,bridge_name='br-int',has_traffic_filtering=True,id=1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1fd711ff-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.813 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1fd711ff-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.813 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.817 2 INFO os_vif [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:9a:48,bridge_name='br-int',has_traffic_filtering=True,id=1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1fd711ff-37')
Oct  2 08:09:59 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[225935]: [NOTICE]   (225939) : haproxy version is 2.8.14-c23fe91
Oct  2 08:09:59 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[225935]: [NOTICE]   (225939) : path to executable is /usr/sbin/haproxy
Oct  2 08:09:59 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[225935]: [WARNING]  (225939) : Exiting Master process...
Oct  2 08:09:59 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[225935]: [ALERT]    (225939) : Current worker (225941) exited with code 143 (Terminated)
Oct  2 08:09:59 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[225935]: [WARNING]  (225939) : All workers exited. Exiting... (0)
Oct  2 08:09:59 np0005466012 systemd[1]: libpod-efe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757.scope: Deactivated successfully.
Oct  2 08:09:59 np0005466012 podman[226621]: 2025-10-02 12:09:59.836836647 +0000 UTC m=+0.132309892 container died efe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.882 2 INFO nova.compute.manager [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.883 2 DEBUG oslo.service.loopingcall [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.883 2 DEBUG nova.compute.manager [-] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.883 2 DEBUG nova.network.neutron [-] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server [req-a783a831-fef5-4b75-9227-d0843bd16598 req-2943bd98-9e1f-48e6-9576-243fa5c50cb1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Exception during message handling: libvirt.libvirtError: Requested operation is not valid: domain is not running
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     self.force_reraise()
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     raise self.value
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 11064, in external_instance_event
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     self._process_instance_vif_deleted_event(context,
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10871, in _process_instance_vif_deleted_event
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     self.driver.detach_interface(context, instance, vif)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 2943, in detach_interface
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     self._detach_with_retry(
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 2473, in _detach_with_retry
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     self._detach_from_live_with_retry(
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 2529, in _detach_from_live_with_retry
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     self._detach_from_live_and_wait_for_event(
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 2591, in _detach_from_live_and_wait_for_event
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     self._detach_sync(
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 2663, in _detach_sync
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     guest.detach_device(dev, persistent=persistent, live=live)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py", line 466, in detach_device
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     self._domain.detachDeviceFlags(device_xml, flags=flags)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 193, in doit
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     result = proxy_call(self._autowrap, f, *args, **kwargs)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 151, in proxy_call
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     rv = execute(f, *args, **kwargs)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 132, in execute
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     six.reraise(c, e, tb)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/six.py", line 709, in reraise
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     raise value
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 86, in tworker
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     rv = meth(*args, **kwargs)
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server   File "/usr/lib64/python3.9/site-packages/libvirt.py", line 1570, in detachDeviceFlags
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server     raise libvirtError('virDomainDetachDeviceFlags() failed')
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server libvirt.libvirtError: Requested operation is not valid: domain is not running
Oct  2 08:09:59 np0005466012 nova_compute[192063]: 2025-10-02 12:09:59.887 2 ERROR oslo_messaging.rpc.server #033[00m
Oct  2 08:09:59 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-efe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757-userdata-shm.mount: Deactivated successfully.
Oct  2 08:09:59 np0005466012 systemd[1]: var-lib-containers-storage-overlay-7dcb56641f67c59054348ab1508cb4642b6521b7bb43d114c4d6c83e0761d7ea-merged.mount: Deactivated successfully.
Oct  2 08:10:00 np0005466012 podman[226621]: 2025-10-02 12:10:00.011403377 +0000 UTC m=+0.306876602 container cleanup efe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:10:00 np0005466012 systemd[1]: libpod-conmon-efe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757.scope: Deactivated successfully.
Oct  2 08:10:00 np0005466012 nova_compute[192063]: 2025-10-02 12:10:00.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:00 np0005466012 podman[226662]: 2025-10-02 12:10:00.289445551 +0000 UTC m=+0.250167805 container remove efe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:10:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:00.294 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[062c4169-6c88-47bb-a327-ec36bea510f8]: (4, ('Thu Oct  2 12:09:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 (efe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757)\nefe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757\nThu Oct  2 12:10:00 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 (efe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757)\nefe5f0470f75210a09bc800dac27f8e51b4a25e88f386fad896771cdaa26a757\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:00.296 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aced7971-f3cf-4738-9c2d-abf4a203e562]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:00.297 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:00 np0005466012 nova_compute[192063]: 2025-10-02 12:10:00.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:00 np0005466012 kernel: tap7d845a33-50: left promiscuous mode
Oct  2 08:10:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:00.313 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8f240c3b-8e7c-404b-afb2-0ed0bd490ac9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:00 np0005466012 nova_compute[192063]: 2025-10-02 12:10:00.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:00.346 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b3928a10-a779-49c5-b8b1-0ebca8fe0d1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:00.347 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[86a3848a-1f6d-441d-85a1-6395089adcbf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:00.362 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d9d0b76d-bf49-4696-aae3-776d2b66a5c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493811, 'reachable_time': 23776, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226677, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:00.364 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:10:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:00.364 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[c292ad26-8c2b-468d-91da-32d494e3647a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:00 np0005466012 systemd[1]: run-netns-ovnmeta\x2d7d845a33\x2d56e0\x2d4850\x2d9f27\x2d8a54095796f2.mount: Deactivated successfully.
Oct  2 08:10:00 np0005466012 nova_compute[192063]: 2025-10-02 12:10:00.829 2 DEBUG nova.compute.manager [req-14cdfbf0-7d8b-46f3-89eb-f6f8849fc2da req-3fe77884-de3a-41c3-b737-82f88c83fd0e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-deleted-e4d9a52e-74d6-4407-a156-6fa4bcad133c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:00 np0005466012 nova_compute[192063]: 2025-10-02 12:10:00.830 2 INFO nova.compute.manager [req-14cdfbf0-7d8b-46f3-89eb-f6f8849fc2da req-3fe77884-de3a-41c3-b737-82f88c83fd0e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Neutron deleted interface e4d9a52e-74d6-4407-a156-6fa4bcad133c; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:10:00 np0005466012 nova_compute[192063]: 2025-10-02 12:10:00.830 2 DEBUG nova.network.neutron [req-14cdfbf0-7d8b-46f3-89eb-f6f8849fc2da req-3fe77884-de3a-41c3-b737-82f88c83fd0e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:00 np0005466012 nova_compute[192063]: 2025-10-02 12:10:00.953 2 DEBUG nova.compute.manager [req-14cdfbf0-7d8b-46f3-89eb-f6f8849fc2da req-3fe77884-de3a-41c3-b737-82f88c83fd0e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Detach interface failed, port_id=e4d9a52e-74d6-4407-a156-6fa4bcad133c, reason: Instance 0adb6d19-d425-4600-9dd0-ca11095b3c59 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:10:01 np0005466012 nova_compute[192063]: 2025-10-02 12:10:01.160 2 DEBUG nova.network.neutron [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [{"id": "88b42333-8838-4199-ab64-5b879b907aa5", "address": "fa:16:3e:b5:e2:ea", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88b42333-88", "ovs_interfaceid": "88b42333-8838-4199-ab64-5b879b907aa5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "address": "fa:16:3e:2a:43:1c", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4d9a52e-74", "ovs_interfaceid": "e4d9a52e-74d6-4407-a156-6fa4bcad133c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "address": "fa:16:3e:24:9a:48", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd711ff-37", "ovs_interfaceid": "1fd711ff-378b-4ebf-bb64-6cc5b6cf9e1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:01 np0005466012 nova_compute[192063]: 2025-10-02 12:10:01.279 2 DEBUG oslo_concurrency.lockutils [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Releasing lock "refresh_cache-0adb6d19-d425-4600-9dd0-ca11095b3c59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:10:01 np0005466012 nova_compute[192063]: 2025-10-02 12:10:01.515 2 DEBUG oslo_concurrency.lockutils [None req-e93dfd74-ba48-445f-aeb2-4b21495e652e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-0adb6d19-d425-4600-9dd0-ca11095b3c59-ee2528cd-3da5-4a68-9377-ee66d47d9945" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:02.120 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:02.121 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:02.121 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:02 np0005466012 podman[226678]: 2025-10-02 12:10:02.148354488 +0000 UTC m=+0.061585529 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:10:02 np0005466012 podman[226679]: 2025-10-02 12:10:02.216989796 +0000 UTC m=+0.130116224 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.277 2 DEBUG nova.network.neutron [-] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.653 2 DEBUG nova.compute.manager [req-7c747ecb-1eaa-459b-bfcc-b69826584ce4 req-f4cff5ac-77da-42fc-9884-e1e30d632865 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-unplugged-88b42333-8838-4199-ab64-5b879b907aa5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.654 2 DEBUG oslo_concurrency.lockutils [req-7c747ecb-1eaa-459b-bfcc-b69826584ce4 req-f4cff5ac-77da-42fc-9884-e1e30d632865 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.654 2 DEBUG oslo_concurrency.lockutils [req-7c747ecb-1eaa-459b-bfcc-b69826584ce4 req-f4cff5ac-77da-42fc-9884-e1e30d632865 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.654 2 DEBUG oslo_concurrency.lockutils [req-7c747ecb-1eaa-459b-bfcc-b69826584ce4 req-f4cff5ac-77da-42fc-9884-e1e30d632865 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.654 2 DEBUG nova.compute.manager [req-7c747ecb-1eaa-459b-bfcc-b69826584ce4 req-f4cff5ac-77da-42fc-9884-e1e30d632865 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] No waiting events found dispatching network-vif-unplugged-88b42333-8838-4199-ab64-5b879b907aa5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.654 2 DEBUG nova.compute.manager [req-7c747ecb-1eaa-459b-bfcc-b69826584ce4 req-f4cff5ac-77da-42fc-9884-e1e30d632865 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-unplugged-88b42333-8838-4199-ab64-5b879b907aa5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.655 2 DEBUG nova.compute.manager [req-7c747ecb-1eaa-459b-bfcc-b69826584ce4 req-f4cff5ac-77da-42fc-9884-e1e30d632865 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-plugged-88b42333-8838-4199-ab64-5b879b907aa5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.655 2 DEBUG oslo_concurrency.lockutils [req-7c747ecb-1eaa-459b-bfcc-b69826584ce4 req-f4cff5ac-77da-42fc-9884-e1e30d632865 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.655 2 DEBUG oslo_concurrency.lockutils [req-7c747ecb-1eaa-459b-bfcc-b69826584ce4 req-f4cff5ac-77da-42fc-9884-e1e30d632865 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.655 2 DEBUG oslo_concurrency.lockutils [req-7c747ecb-1eaa-459b-bfcc-b69826584ce4 req-f4cff5ac-77da-42fc-9884-e1e30d632865 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.655 2 DEBUG nova.compute.manager [req-7c747ecb-1eaa-459b-bfcc-b69826584ce4 req-f4cff5ac-77da-42fc-9884-e1e30d632865 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] No waiting events found dispatching network-vif-plugged-88b42333-8838-4199-ab64-5b879b907aa5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.656 2 WARNING nova.compute.manager [req-7c747ecb-1eaa-459b-bfcc-b69826584ce4 req-f4cff5ac-77da-42fc-9884-e1e30d632865 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received unexpected event network-vif-plugged-88b42333-8838-4199-ab64-5b879b907aa5 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.657 2 INFO nova.compute.manager [-] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Took 2.77 seconds to deallocate network for instance.#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.994 2 DEBUG nova.compute.manager [req-766a4ea3-96f0-49fa-a040-798cfaf5eb5d req-5e87df62-57ff-424f-9f7e-0e7c3f6598e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-unplugged-e4d9a52e-74d6-4407-a156-6fa4bcad133c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.995 2 DEBUG oslo_concurrency.lockutils [req-766a4ea3-96f0-49fa-a040-798cfaf5eb5d req-5e87df62-57ff-424f-9f7e-0e7c3f6598e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.995 2 DEBUG oslo_concurrency.lockutils [req-766a4ea3-96f0-49fa-a040-798cfaf5eb5d req-5e87df62-57ff-424f-9f7e-0e7c3f6598e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.995 2 DEBUG oslo_concurrency.lockutils [req-766a4ea3-96f0-49fa-a040-798cfaf5eb5d req-5e87df62-57ff-424f-9f7e-0e7c3f6598e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.995 2 DEBUG nova.compute.manager [req-766a4ea3-96f0-49fa-a040-798cfaf5eb5d req-5e87df62-57ff-424f-9f7e-0e7c3f6598e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] No waiting events found dispatching network-vif-unplugged-e4d9a52e-74d6-4407-a156-6fa4bcad133c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.996 2 DEBUG nova.compute.manager [req-766a4ea3-96f0-49fa-a040-798cfaf5eb5d req-5e87df62-57ff-424f-9f7e-0e7c3f6598e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-unplugged-e4d9a52e-74d6-4407-a156-6fa4bcad133c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.996 2 DEBUG nova.compute.manager [req-766a4ea3-96f0-49fa-a040-798cfaf5eb5d req-5e87df62-57ff-424f-9f7e-0e7c3f6598e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-plugged-e4d9a52e-74d6-4407-a156-6fa4bcad133c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.996 2 DEBUG oslo_concurrency.lockutils [req-766a4ea3-96f0-49fa-a040-798cfaf5eb5d req-5e87df62-57ff-424f-9f7e-0e7c3f6598e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.996 2 DEBUG oslo_concurrency.lockutils [req-766a4ea3-96f0-49fa-a040-798cfaf5eb5d req-5e87df62-57ff-424f-9f7e-0e7c3f6598e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.997 2 DEBUG oslo_concurrency.lockutils [req-766a4ea3-96f0-49fa-a040-798cfaf5eb5d req-5e87df62-57ff-424f-9f7e-0e7c3f6598e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.997 2 DEBUG nova.compute.manager [req-766a4ea3-96f0-49fa-a040-798cfaf5eb5d req-5e87df62-57ff-424f-9f7e-0e7c3f6598e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] No waiting events found dispatching network-vif-plugged-e4d9a52e-74d6-4407-a156-6fa4bcad133c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:10:02 np0005466012 nova_compute[192063]: 2025-10-02 12:10:02.997 2 WARNING nova.compute.manager [req-766a4ea3-96f0-49fa-a040-798cfaf5eb5d req-5e87df62-57ff-424f-9f7e-0e7c3f6598e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received unexpected event network-vif-plugged-e4d9a52e-74d6-4407-a156-6fa4bcad133c for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:10:03 np0005466012 nova_compute[192063]: 2025-10-02 12:10:03.170 2 DEBUG nova.compute.manager [req-850792be-85e6-4375-a1ea-0f767ff08dec req-24db0e56-37bd-4ccf-b9a4-c4345b756980 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Received event network-vif-deleted-88b42333-8838-4199-ab64-5b879b907aa5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:03 np0005466012 nova_compute[192063]: 2025-10-02 12:10:03.361 2 DEBUG oslo_concurrency.lockutils [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:03 np0005466012 nova_compute[192063]: 2025-10-02 12:10:03.362 2 DEBUG oslo_concurrency.lockutils [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:03 np0005466012 nova_compute[192063]: 2025-10-02 12:10:03.421 2 DEBUG nova.compute.provider_tree [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:10:03 np0005466012 nova_compute[192063]: 2025-10-02 12:10:03.476 2 DEBUG nova.scheduler.client.report [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:10:03 np0005466012 nova_compute[192063]: 2025-10-02 12:10:03.600 2 DEBUG oslo_concurrency.lockutils [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:03 np0005466012 nova_compute[192063]: 2025-10-02 12:10:03.645 2 INFO nova.scheduler.client.report [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Deleted allocations for instance 0adb6d19-d425-4600-9dd0-ca11095b3c59#033[00m
Oct  2 08:10:03 np0005466012 nova_compute[192063]: 2025-10-02 12:10:03.935 2 DEBUG oslo_concurrency.lockutils [None req-74e1d835-0afe-4851-a42d-f6c1e0f8b33e fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "0adb6d19-d425-4600-9dd0-ca11095b3c59" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.537s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:04 np0005466012 nova_compute[192063]: 2025-10-02 12:10:04.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:05 np0005466012 podman[226727]: 2025-10-02 12:10:05.136664726 +0000 UTC m=+0.058848625 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:10:05 np0005466012 nova_compute[192063]: 2025-10-02 12:10:05.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:08 np0005466012 nova_compute[192063]: 2025-10-02 12:10:08.263 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406993.2608402, 272390c4-59b3-4d2c-bd09-9ceeffd7b19c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:10:08 np0005466012 nova_compute[192063]: 2025-10-02 12:10:08.264 2 INFO nova.compute.manager [-] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:10:08 np0005466012 nova_compute[192063]: 2025-10-02 12:10:08.306 2 DEBUG nova.compute.manager [None req-8e1c5e48-dd55-41fd-b0ff-7e57208fbc7d - - - - - -] [instance: 272390c4-59b3-4d2c-bd09-9ceeffd7b19c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:10:09 np0005466012 podman[226745]: 2025-10-02 12:10:09.144454846 +0000 UTC m=+0.066142042 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:10:09 np0005466012 nova_compute[192063]: 2025-10-02 12:10:09.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:10 np0005466012 nova_compute[192063]: 2025-10-02 12:10:10.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:12 np0005466012 nova_compute[192063]: 2025-10-02 12:10:12.660 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:12 np0005466012 nova_compute[192063]: 2025-10-02 12:10:12.661 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:12 np0005466012 nova_compute[192063]: 2025-10-02 12:10:12.709 2 DEBUG nova.compute.manager [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:10:12 np0005466012 nova_compute[192063]: 2025-10-02 12:10:12.873 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:12 np0005466012 nova_compute[192063]: 2025-10-02 12:10:12.873 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:12 np0005466012 nova_compute[192063]: 2025-10-02 12:10:12.881 2 DEBUG nova.virt.hardware [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:10:12 np0005466012 nova_compute[192063]: 2025-10-02 12:10:12.881 2 INFO nova.compute.claims [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.311 2 DEBUG nova.compute.provider_tree [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.350 2 DEBUG nova.scheduler.client.report [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.372 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.499s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.373 2 DEBUG nova.compute.manager [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.459 2 DEBUG nova.compute.manager [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.459 2 DEBUG nova.network.neutron [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.505 2 INFO nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.522 2 DEBUG nova.compute.manager [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.679 2 DEBUG nova.compute.manager [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.680 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.681 2 INFO nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Creating image(s)#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.681 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.681 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.682 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.694 2 DEBUG oslo_concurrency.processutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.748 2 DEBUG oslo_concurrency.processutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.749 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.749 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.759 2 DEBUG oslo_concurrency.processutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.812 2 DEBUG oslo_concurrency.processutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.814 2 DEBUG oslo_concurrency.processutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.910 2 DEBUG oslo_concurrency.processutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk 1073741824" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.911 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.911 2 DEBUG oslo_concurrency.processutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.972 2 DEBUG oslo_concurrency.processutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.973 2 DEBUG nova.virt.disk.api [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Checking if we can resize image /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:10:13 np0005466012 nova_compute[192063]: 2025-10-02 12:10:13.973 2 DEBUG oslo_concurrency.processutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.032 2 DEBUG oslo_concurrency.processutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.033 2 DEBUG nova.virt.disk.api [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Cannot resize image /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.033 2 DEBUG nova.objects.instance [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'migration_context' on Instance uuid 02e1c250-b902-42fe-a5cf-af66aa02e2bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.068 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.068 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Ensure instance console log exists: /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.069 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.069 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.070 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.293 2 DEBUG nova.policy [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.709 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759406999.7080185, 0adb6d19-d425-4600-9dd0-ca11095b3c59 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.710 2 INFO nova.compute.manager [-] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.759 2 DEBUG nova.compute.manager [None req-ebcbd982-6aad-4c75-bf15-a4e90fa27713 - - - - - -] [instance: 0adb6d19-d425-4600-9dd0-ca11095b3c59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.842 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.843 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.870 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:14 np0005466012 nova_compute[192063]: 2025-10-02 12:10:14.870 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:15 np0005466012 nova_compute[192063]: 2025-10-02 12:10:15.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:15 np0005466012 nova_compute[192063]: 2025-10-02 12:10:15.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:15 np0005466012 nova_compute[192063]: 2025-10-02 12:10:15.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:15 np0005466012 nova_compute[192063]: 2025-10-02 12:10:15.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:15 np0005466012 nova_compute[192063]: 2025-10-02 12:10:15.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:16 np0005466012 podman[226782]: 2025-10-02 12:10:16.139383383 +0000 UTC m=+0.054801797 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:10:16 np0005466012 podman[226783]: 2025-10-02 12:10:16.144626294 +0000 UTC m=+0.058177937 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64)
Oct  2 08:10:16 np0005466012 nova_compute[192063]: 2025-10-02 12:10:16.539 2 DEBUG nova.network.neutron [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Successfully created port: a2b38e2a-6b92-4e68-ae24-ea094847d75b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:10:16 np0005466012 nova_compute[192063]: 2025-10-02 12:10:16.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:17.065 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:10:17 np0005466012 nova_compute[192063]: 2025-10-02 12:10:17.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:17.067 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:10:17 np0005466012 nova_compute[192063]: 2025-10-02 12:10:17.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:17 np0005466012 nova_compute[192063]: 2025-10-02 12:10:17.872 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:17 np0005466012 nova_compute[192063]: 2025-10-02 12:10:17.872 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:17 np0005466012 nova_compute[192063]: 2025-10-02 12:10:17.873 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:17 np0005466012 nova_compute[192063]: 2025-10-02 12:10:17.873 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.017 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.018 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5731MB free_disk=73.42773056030273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.018 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.019 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.331 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 02e1c250-b902-42fe-a5cf-af66aa02e2bc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.332 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.332 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.359 2 DEBUG nova.network.neutron [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Successfully updated port: a2b38e2a-6b92-4e68-ae24-ea094847d75b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.382 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.383 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquired lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.383 2 DEBUG nova.network.neutron [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.398 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.422 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.464 2 DEBUG nova.compute.manager [req-99929415-0337-4302-be8d-ed17ff50bcb2 req-9bff6267-5447-4145-8cb6-eff3e1c5c41c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-changed-a2b38e2a-6b92-4e68-ae24-ea094847d75b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.464 2 DEBUG nova.compute.manager [req-99929415-0337-4302-be8d-ed17ff50bcb2 req-9bff6267-5447-4145-8cb6-eff3e1c5c41c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Refreshing instance network info cache due to event network-changed-a2b38e2a-6b92-4e68-ae24-ea094847d75b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.465 2 DEBUG oslo_concurrency.lockutils [req-99929415-0337-4302-be8d-ed17ff50bcb2 req-9bff6267-5447-4145-8cb6-eff3e1c5c41c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.479 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.480 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:18 np0005466012 nova_compute[192063]: 2025-10-02 12:10:18.664 2 DEBUG nova.network.neutron [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:10:19 np0005466012 podman[226820]: 2025-10-02 12:10:19.138678706 +0000 UTC m=+0.057205261 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid)
Oct  2 08:10:19 np0005466012 podman[226821]: 2025-10-02 12:10:19.140026633 +0000 UTC m=+0.051793736 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:10:19 np0005466012 nova_compute[192063]: 2025-10-02 12:10:19.481 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:19 np0005466012 nova_compute[192063]: 2025-10-02 12:10:19.481 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:10:19 np0005466012 nova_compute[192063]: 2025-10-02 12:10:19.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:20 np0005466012 nova_compute[192063]: 2025-10-02 12:10:20.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:21.069 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:22 np0005466012 nova_compute[192063]: 2025-10-02 12:10:22.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:10:22 np0005466012 nova_compute[192063]: 2025-10-02 12:10:22.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:10:22 np0005466012 nova_compute[192063]: 2025-10-02 12:10:22.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:10:22 np0005466012 nova_compute[192063]: 2025-10-02 12:10:22.841 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:10:22 np0005466012 nova_compute[192063]: 2025-10-02 12:10:22.841 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.399 2 DEBUG nova.network.neutron [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updating instance_info_cache with network_info: [{"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.426 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Releasing lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.427 2 DEBUG nova.compute.manager [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Instance network_info: |[{"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.427 2 DEBUG oslo_concurrency.lockutils [req-99929415-0337-4302-be8d-ed17ff50bcb2 req-9bff6267-5447-4145-8cb6-eff3e1c5c41c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.427 2 DEBUG nova.network.neutron [req-99929415-0337-4302-be8d-ed17ff50bcb2 req-9bff6267-5447-4145-8cb6-eff3e1c5c41c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Refreshing network info cache for port a2b38e2a-6b92-4e68-ae24-ea094847d75b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.431 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Start _get_guest_xml network_info=[{"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.435 2 WARNING nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.443 2 DEBUG nova.virt.libvirt.host [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.444 2 DEBUG nova.virt.libvirt.host [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.448 2 DEBUG nova.virt.libvirt.host [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.449 2 DEBUG nova.virt.libvirt.host [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.450 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.451 2 DEBUG nova.virt.hardware [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.451 2 DEBUG nova.virt.hardware [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.451 2 DEBUG nova.virt.hardware [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.452 2 DEBUG nova.virt.hardware [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.452 2 DEBUG nova.virt.hardware [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.452 2 DEBUG nova.virt.hardware [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.453 2 DEBUG nova.virt.hardware [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.453 2 DEBUG nova.virt.hardware [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.453 2 DEBUG nova.virt.hardware [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.453 2 DEBUG nova.virt.hardware [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.454 2 DEBUG nova.virt.hardware [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.457 2 DEBUG nova.virt.libvirt.vif [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:10:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2087694336',display_name='tempest-tempest.common.compute-instance-2087694336',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2087694336',id=52,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-lc6yv2ji',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:10:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=02e1c250-b902-42fe-a5cf-af66aa02e2bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.458 2 DEBUG nova.network.os_vif_util [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.459 2 DEBUG nova.network.os_vif_util [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:21:3e,bridge_name='br-int',has_traffic_filtering=True,id=a2b38e2a-6b92-4e68-ae24-ea094847d75b,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2b38e2a-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.460 2 DEBUG nova.objects.instance [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'pci_devices' on Instance uuid 02e1c250-b902-42fe-a5cf-af66aa02e2bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.492 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  <uuid>02e1c250-b902-42fe-a5cf-af66aa02e2bc</uuid>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  <name>instance-00000034</name>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <nova:name>tempest-tempest.common.compute-instance-2087694336</nova:name>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:10:23</nova:creationTime>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:10:23 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:        <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:        <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:        <nova:port uuid="a2b38e2a-6b92-4e68-ae24-ea094847d75b">
Oct  2 08:10:23 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <entry name="serial">02e1c250-b902-42fe-a5cf-af66aa02e2bc</entry>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <entry name="uuid">02e1c250-b902-42fe-a5cf-af66aa02e2bc</entry>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.config"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:e9:21:3e"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <target dev="tapa2b38e2a-6b"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/console.log" append="off"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:10:23 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:10:23 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:10:23 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:10:23 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.494 2 DEBUG nova.compute.manager [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Preparing to wait for external event network-vif-plugged-a2b38e2a-6b92-4e68-ae24-ea094847d75b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.494 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.495 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.495 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.496 2 DEBUG nova.virt.libvirt.vif [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:10:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2087694336',display_name='tempest-tempest.common.compute-instance-2087694336',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2087694336',id=52,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-lc6yv2ji',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:10:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=02e1c250-b902-42fe-a5cf-af66aa02e2bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.497 2 DEBUG nova.network.os_vif_util [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.498 2 DEBUG nova.network.os_vif_util [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:21:3e,bridge_name='br-int',has_traffic_filtering=True,id=a2b38e2a-6b92-4e68-ae24-ea094847d75b,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2b38e2a-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.498 2 DEBUG os_vif [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:21:3e,bridge_name='br-int',has_traffic_filtering=True,id=a2b38e2a-6b92-4e68-ae24-ea094847d75b,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2b38e2a-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.500 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.501 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.505 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2b38e2a-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.505 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa2b38e2a-6b, col_values=(('external_ids', {'iface-id': 'a2b38e2a-6b92-4e68-ae24-ea094847d75b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:21:3e', 'vm-uuid': '02e1c250-b902-42fe-a5cf-af66aa02e2bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:23 np0005466012 NetworkManager[51207]: <info>  [1759407023.5093] manager: (tapa2b38e2a-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.516 2 INFO os_vif [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:21:3e,bridge_name='br-int',has_traffic_filtering=True,id=a2b38e2a-6b92-4e68-ae24-ea094847d75b,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2b38e2a-6b')#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.584 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.585 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.586 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:e9:21:3e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:10:23 np0005466012 nova_compute[192063]: 2025-10-02 12:10:23.587 2 INFO nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Using config drive#033[00m
Oct  2 08:10:24 np0005466012 nova_compute[192063]: 2025-10-02 12:10:24.333 2 INFO nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Creating config drive at /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.config#033[00m
Oct  2 08:10:24 np0005466012 nova_compute[192063]: 2025-10-02 12:10:24.342 2 DEBUG oslo_concurrency.processutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzvbhr9yt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:24 np0005466012 nova_compute[192063]: 2025-10-02 12:10:24.471 2 DEBUG oslo_concurrency.processutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzvbhr9yt" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:24 np0005466012 kernel: tapa2b38e2a-6b: entered promiscuous mode
Oct  2 08:10:24 np0005466012 ovn_controller[94284]: 2025-10-02T12:10:24Z|00200|binding|INFO|Claiming lport a2b38e2a-6b92-4e68-ae24-ea094847d75b for this chassis.
Oct  2 08:10:24 np0005466012 ovn_controller[94284]: 2025-10-02T12:10:24Z|00201|binding|INFO|a2b38e2a-6b92-4e68-ae24-ea094847d75b: Claiming fa:16:3e:e9:21:3e 10.100.0.5
Oct  2 08:10:24 np0005466012 NetworkManager[51207]: <info>  [1759407024.5370] manager: (tapa2b38e2a-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Oct  2 08:10:24 np0005466012 nova_compute[192063]: 2025-10-02 12:10:24.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.566 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:21:3e 10.100.0.5'], port_security=['fa:16:3e:e9:21:3e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3b3d6cff-45b6-4476-af05-0164bc00fd3b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=a2b38e2a-6b92-4e68-ae24-ea094847d75b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.567 103246 INFO neutron.agent.ovn.metadata.agent [-] Port a2b38e2a-6b92-4e68-ae24-ea094847d75b in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 bound to our chassis#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.568 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d845a33-56e0-4850-9f27-8a54095796f2#033[00m
Oct  2 08:10:24 np0005466012 systemd-udevd[226886]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.578 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[db1bed67-279a-420d-a66a-be0be11e92b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.579 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7d845a33-51 in ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.580 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7d845a33-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.581 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2c38d214-fc2e-401f-9754-76abaab24af7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.581 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[83e25dd4-eac9-4461-94b3-ac072858f8ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 NetworkManager[51207]: <info>  [1759407024.5921] device (tapa2b38e2a-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:10:24 np0005466012 NetworkManager[51207]: <info>  [1759407024.5931] device (tapa2b38e2a-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.592 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[c87d2021-82db-402f-ab98-d51dc2e60bad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 nova_compute[192063]: 2025-10-02 12:10:24.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:24 np0005466012 systemd-machined[152114]: New machine qemu-24-instance-00000034.
Oct  2 08:10:24 np0005466012 nova_compute[192063]: 2025-10-02 12:10:24.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:24 np0005466012 ovn_controller[94284]: 2025-10-02T12:10:24Z|00202|binding|INFO|Setting lport a2b38e2a-6b92-4e68-ae24-ea094847d75b ovn-installed in OVS
Oct  2 08:10:24 np0005466012 ovn_controller[94284]: 2025-10-02T12:10:24Z|00203|binding|INFO|Setting lport a2b38e2a-6b92-4e68-ae24-ea094847d75b up in Southbound
Oct  2 08:10:24 np0005466012 nova_compute[192063]: 2025-10-02 12:10:24.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:24 np0005466012 systemd[1]: Started Virtual Machine qemu-24-instance-00000034.
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.618 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8ae3b563-111d-4108-b3d8-b3afeff3219c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.645 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[d602e402-0a33-4bac-a2e1-a4859c450644]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 NetworkManager[51207]: <info>  [1759407024.6587] manager: (tap7d845a33-50): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.657 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1cbcee90-0674-4aee-8d8f-0fa56ce4989f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.689 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[562b0b04-e33c-4e31-adf6-f3805382b500]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.693 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[e0163d04-3d30-4d11-a492-c546174b17ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 NetworkManager[51207]: <info>  [1759407024.7180] device (tap7d845a33-50): carrier: link connected
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.723 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[b1449e19-843a-4aff-8f12-dca64fef3069]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.737 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[13706fe7-9b5a-4ddb-830f-5322e981f5f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501832, 'reachable_time': 23587, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226922, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.752 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f064e9-b7ae-4bd0-8011-32bef814eb5a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:9016'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501832, 'tstamp': 501832}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226923, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.770 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7ed30402-5bad-4568-8210-c84596e194f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501832, 'reachable_time': 23587, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226924, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.801 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8110df37-5cc6-4dfb-9166-c5845e325431]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.870 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[020dba92-9498-48db-ac0f-3c857da585ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.872 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.873 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.874 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d845a33-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:24 np0005466012 NetworkManager[51207]: <info>  [1759407024.9021] manager: (tap7d845a33-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Oct  2 08:10:24 np0005466012 kernel: tap7d845a33-50: entered promiscuous mode
Oct  2 08:10:24 np0005466012 nova_compute[192063]: 2025-10-02 12:10:24.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.905 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d845a33-50, col_values=(('external_ids', {'iface-id': '1c321c19-d630-4a6f-8ba8-7bac90af9bae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:10:24 np0005466012 ovn_controller[94284]: 2025-10-02T12:10:24Z|00204|binding|INFO|Releasing lport 1c321c19-d630-4a6f-8ba8-7bac90af9bae from this chassis (sb_readonly=0)
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.908 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7d845a33-56e0-4850-9f27-8a54095796f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7d845a33-56e0-4850-9f27-8a54095796f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.908 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[48367b92-9e6e-4d03-8f18-480da1f86570]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.909 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-7d845a33-56e0-4850-9f27-8a54095796f2
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/7d845a33-56e0-4850-9f27-8a54095796f2.pid.haproxy
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 7d845a33-56e0-4850-9f27-8a54095796f2
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:10:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:10:24.910 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'env', 'PROCESS_TAG=haproxy-7d845a33-56e0-4850-9f27-8a54095796f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7d845a33-56e0-4850-9f27-8a54095796f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:10:24 np0005466012 nova_compute[192063]: 2025-10-02 12:10:24.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.146 2 DEBUG nova.compute.manager [req-3ab28770-2862-4968-9ac9-874e20ea7b37 req-648671fe-de2d-4f29-91cd-5efe4185b461 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-vif-plugged-a2b38e2a-6b92-4e68-ae24-ea094847d75b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.147 2 DEBUG oslo_concurrency.lockutils [req-3ab28770-2862-4968-9ac9-874e20ea7b37 req-648671fe-de2d-4f29-91cd-5efe4185b461 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.147 2 DEBUG oslo_concurrency.lockutils [req-3ab28770-2862-4968-9ac9-874e20ea7b37 req-648671fe-de2d-4f29-91cd-5efe4185b461 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.148 2 DEBUG oslo_concurrency.lockutils [req-3ab28770-2862-4968-9ac9-874e20ea7b37 req-648671fe-de2d-4f29-91cd-5efe4185b461 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.148 2 DEBUG nova.compute.manager [req-3ab28770-2862-4968-9ac9-874e20ea7b37 req-648671fe-de2d-4f29-91cd-5efe4185b461 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Processing event network-vif-plugged-a2b38e2a-6b92-4e68-ae24-ea094847d75b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.338 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407025.338338, 02e1c250-b902-42fe-a5cf-af66aa02e2bc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.340 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] VM Started (Lifecycle Event)#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.342 2 DEBUG nova.compute.manager [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.346 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.350 2 INFO nova.virt.libvirt.driver [-] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Instance spawned successfully.#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.350 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:10:25 np0005466012 podman[226963]: 2025-10-02 12:10:25.302239465 +0000 UTC m=+0.032474176 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:10:25 np0005466012 podman[226963]: 2025-10-02 12:10:25.403046798 +0000 UTC m=+0.133281539 container create e714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.469 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.472 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.566 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.567 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.567 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.568 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.568 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.569 2 DEBUG nova.virt.libvirt.driver [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.574 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.575 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407025.339177, 02e1c250-b902-42fe-a5cf-af66aa02e2bc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.575 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:10:25 np0005466012 systemd[1]: Started libpod-conmon-e714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f.scope.
Oct  2 08:10:25 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:10:25 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0f52dfe8d7f8aecede4c357c2240aca35bccc3ab5c814290bdc5b8e87df8ebc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.622 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.627 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407025.3449028, 02e1c250-b902-42fe-a5cf-af66aa02e2bc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.628 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:10:25 np0005466012 podman[226963]: 2025-10-02 12:10:25.72449975 +0000 UTC m=+0.454734501 container init e714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.726 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.729 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:10:25 np0005466012 podman[226963]: 2025-10-02 12:10:25.734229043 +0000 UTC m=+0.464463754 container start e714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:10:25 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226979]: [NOTICE]   (226983) : New worker (226985) forked
Oct  2 08:10:25 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226979]: [NOTICE]   (226983) : Loading success.
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.756 2 INFO nova.compute.manager [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Took 12.08 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.756 2 DEBUG nova.compute.manager [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:10:25 np0005466012 nova_compute[192063]: 2025-10-02 12:10:25.761 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:10:26 np0005466012 nova_compute[192063]: 2025-10-02 12:10:26.184 2 INFO nova.compute.manager [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Took 13.36 seconds to build instance.#033[00m
Oct  2 08:10:26 np0005466012 nova_compute[192063]: 2025-10-02 12:10:26.212 2 DEBUG oslo_concurrency.lockutils [None req-ed0a1db3-4036-4af3-940d-0f42ecf4357a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.552s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:26 np0005466012 nova_compute[192063]: 2025-10-02 12:10:26.245 2 DEBUG nova.network.neutron [req-99929415-0337-4302-be8d-ed17ff50bcb2 req-9bff6267-5447-4145-8cb6-eff3e1c5c41c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updated VIF entry in instance network info cache for port a2b38e2a-6b92-4e68-ae24-ea094847d75b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:10:26 np0005466012 nova_compute[192063]: 2025-10-02 12:10:26.245 2 DEBUG nova.network.neutron [req-99929415-0337-4302-be8d-ed17ff50bcb2 req-9bff6267-5447-4145-8cb6-eff3e1c5c41c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updating instance_info_cache with network_info: [{"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:10:26 np0005466012 nova_compute[192063]: 2025-10-02 12:10:26.267 2 DEBUG oslo_concurrency.lockutils [req-99929415-0337-4302-be8d-ed17ff50bcb2 req-9bff6267-5447-4145-8cb6-eff3e1c5c41c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:10:27 np0005466012 nova_compute[192063]: 2025-10-02 12:10:27.303 2 DEBUG nova.compute.manager [req-8b3d3961-c9b1-4b3a-a3fe-d10a7f35a414 req-4dded4de-cb3a-4d9f-9dad-af60c9318885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-vif-plugged-a2b38e2a-6b92-4e68-ae24-ea094847d75b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:10:27 np0005466012 nova_compute[192063]: 2025-10-02 12:10:27.304 2 DEBUG oslo_concurrency.lockutils [req-8b3d3961-c9b1-4b3a-a3fe-d10a7f35a414 req-4dded4de-cb3a-4d9f-9dad-af60c9318885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:27 np0005466012 nova_compute[192063]: 2025-10-02 12:10:27.304 2 DEBUG oslo_concurrency.lockutils [req-8b3d3961-c9b1-4b3a-a3fe-d10a7f35a414 req-4dded4de-cb3a-4d9f-9dad-af60c9318885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:27 np0005466012 nova_compute[192063]: 2025-10-02 12:10:27.305 2 DEBUG oslo_concurrency.lockutils [req-8b3d3961-c9b1-4b3a-a3fe-d10a7f35a414 req-4dded4de-cb3a-4d9f-9dad-af60c9318885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:10:27 np0005466012 nova_compute[192063]: 2025-10-02 12:10:27.305 2 DEBUG nova.compute.manager [req-8b3d3961-c9b1-4b3a-a3fe-d10a7f35a414 req-4dded4de-cb3a-4d9f-9dad-af60c9318885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] No waiting events found dispatching network-vif-plugged-a2b38e2a-6b92-4e68-ae24-ea094847d75b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:10:27 np0005466012 nova_compute[192063]: 2025-10-02 12:10:27.305 2 WARNING nova.compute.manager [req-8b3d3961-c9b1-4b3a-a3fe-d10a7f35a414 req-4dded4de-cb3a-4d9f-9dad-af60c9318885 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received unexpected event network-vif-plugged-a2b38e2a-6b92-4e68-ae24-ea094847d75b for instance with vm_state active and task_state None.
Oct  2 08:10:28 np0005466012 nova_compute[192063]: 2025-10-02 12:10:28.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:30 np0005466012 nova_compute[192063]: 2025-10-02 12:10:30.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:10:31Z|00205|binding|INFO|Releasing lport 1c321c19-d630-4a6f-8ba8-7bac90af9bae from this chassis (sb_readonly=0)
Oct  2 08:10:31 np0005466012 NetworkManager[51207]: <info>  [1759407031.0311] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Oct  2 08:10:31 np0005466012 NetworkManager[51207]: <info>  [1759407031.0328] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Oct  2 08:10:31 np0005466012 nova_compute[192063]: 2025-10-02 12:10:31.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:10:31Z|00206|binding|INFO|Releasing lport 1c321c19-d630-4a6f-8ba8-7bac90af9bae from this chassis (sb_readonly=0)
Oct  2 08:10:31 np0005466012 nova_compute[192063]: 2025-10-02 12:10:31.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:31 np0005466012 nova_compute[192063]: 2025-10-02 12:10:31.830 2 DEBUG nova.compute.manager [req-93168d5f-e976-4e8c-8c29-4802a5d015c8 req-17d7b23f-b633-4637-9509-187f5665980c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-changed-a2b38e2a-6b92-4e68-ae24-ea094847d75b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:10:31 np0005466012 nova_compute[192063]: 2025-10-02 12:10:31.831 2 DEBUG nova.compute.manager [req-93168d5f-e976-4e8c-8c29-4802a5d015c8 req-17d7b23f-b633-4637-9509-187f5665980c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Refreshing instance network info cache due to event network-changed-a2b38e2a-6b92-4e68-ae24-ea094847d75b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:10:31 np0005466012 nova_compute[192063]: 2025-10-02 12:10:31.831 2 DEBUG oslo_concurrency.lockutils [req-93168d5f-e976-4e8c-8c29-4802a5d015c8 req-17d7b23f-b633-4637-9509-187f5665980c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:10:31 np0005466012 nova_compute[192063]: 2025-10-02 12:10:31.831 2 DEBUG oslo_concurrency.lockutils [req-93168d5f-e976-4e8c-8c29-4802a5d015c8 req-17d7b23f-b633-4637-9509-187f5665980c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:10:31 np0005466012 nova_compute[192063]: 2025-10-02 12:10:31.831 2 DEBUG nova.network.neutron [req-93168d5f-e976-4e8c-8c29-4802a5d015c8 req-17d7b23f-b633-4637-9509-187f5665980c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Refreshing network info cache for port a2b38e2a-6b92-4e68-ae24-ea094847d75b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:10:31 np0005466012 nova_compute[192063]: 2025-10-02 12:10:31.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:33 np0005466012 podman[226995]: 2025-10-02 12:10:33.146507174 +0000 UTC m=+0.057227152 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:10:33 np0005466012 podman[226996]: 2025-10-02 12:10:33.173950312 +0000 UTC m=+0.084406983 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:10:33 np0005466012 nova_compute[192063]: 2025-10-02 12:10:33.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:35 np0005466012 nova_compute[192063]: 2025-10-02 12:10:35.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:36 np0005466012 podman[227048]: 2025-10-02 12:10:36.163656738 +0000 UTC m=+0.072697768 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Oct  2 08:10:36 np0005466012 nova_compute[192063]: 2025-10-02 12:10:36.722 2 DEBUG nova.network.neutron [req-93168d5f-e976-4e8c-8c29-4802a5d015c8 req-17d7b23f-b633-4637-9509-187f5665980c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updated VIF entry in instance network info cache for port a2b38e2a-6b92-4e68-ae24-ea094847d75b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:10:36 np0005466012 nova_compute[192063]: 2025-10-02 12:10:36.722 2 DEBUG nova.network.neutron [req-93168d5f-e976-4e8c-8c29-4802a5d015c8 req-17d7b23f-b633-4637-9509-187f5665980c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updating instance_info_cache with network_info: [{"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:10:36 np0005466012 nova_compute[192063]: 2025-10-02 12:10:36.778 2 DEBUG oslo_concurrency.lockutils [req-93168d5f-e976-4e8c-8c29-4802a5d015c8 req-17d7b23f-b633-4637-9509-187f5665980c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:10:38 np0005466012 nova_compute[192063]: 2025-10-02 12:10:38.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:39 np0005466012 ovn_controller[94284]: 2025-10-02T12:10:39Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e9:21:3e 10.100.0.5
Oct  2 08:10:39 np0005466012 ovn_controller[94284]: 2025-10-02T12:10:39Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e9:21:3e 10.100.0.5
Oct  2 08:10:40 np0005466012 podman[227089]: 2025-10-02 12:10:40.144923882 +0000 UTC m=+0.062582055 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Oct  2 08:10:40 np0005466012 nova_compute[192063]: 2025-10-02 12:10:40.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:40 np0005466012 nova_compute[192063]: 2025-10-02 12:10:40.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:43 np0005466012 nova_compute[192063]: 2025-10-02 12:10:43.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:43 np0005466012 nova_compute[192063]: 2025-10-02 12:10:43.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:44 np0005466012 nova_compute[192063]: 2025-10-02 12:10:44.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:45 np0005466012 nova_compute[192063]: 2025-10-02 12:10:45.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:47 np0005466012 podman[227110]: 2025-10-02 12:10:47.131078733 +0000 UTC m=+0.048826046 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:10:47 np0005466012 podman[227111]: 2025-10-02 12:10:47.164716278 +0000 UTC m=+0.078240897 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal)
Oct  2 08:10:48 np0005466012 nova_compute[192063]: 2025-10-02 12:10:48.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:49 np0005466012 nova_compute[192063]: 2025-10-02 12:10:49.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:50 np0005466012 podman[227151]: 2025-10-02 12:10:50.174551425 +0000 UTC m=+0.076813448 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:10:50 np0005466012 podman[227150]: 2025-10-02 12:10:50.197929984 +0000 UTC m=+0.099942610 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:10:50 np0005466012 nova_compute[192063]: 2025-10-02 12:10:50.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:51 np0005466012 nova_compute[192063]: 2025-10-02 12:10:51.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:53 np0005466012 nova_compute[192063]: 2025-10-02 12:10:53.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:55 np0005466012 nova_compute[192063]: 2025-10-02 12:10:55.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:58 np0005466012 nova_compute[192063]: 2025-10-02 12:10:58.006 2 DEBUG nova.compute.manager [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-changed-a2b38e2a-6b92-4e68-ae24-ea094847d75b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:10:58 np0005466012 nova_compute[192063]: 2025-10-02 12:10:58.006 2 DEBUG nova.compute.manager [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Refreshing instance network info cache due to event network-changed-a2b38e2a-6b92-4e68-ae24-ea094847d75b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:10:58 np0005466012 nova_compute[192063]: 2025-10-02 12:10:58.006 2 DEBUG oslo_concurrency.lockutils [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:10:58 np0005466012 nova_compute[192063]: 2025-10-02 12:10:58.007 2 DEBUG oslo_concurrency.lockutils [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:10:58 np0005466012 nova_compute[192063]: 2025-10-02 12:10:58.007 2 DEBUG nova.network.neutron [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Refreshing network info cache for port a2b38e2a-6b92-4e68-ae24-ea094847d75b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:10:58 np0005466012 nova_compute[192063]: 2025-10-02 12:10:58.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.245 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "0753ad57-d509-4a98-bba1-e9b29c087474" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.245 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.273 2 DEBUG nova.compute.manager [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.402 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.403 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.411 2 DEBUG nova.virt.hardware [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.412 2 INFO nova.compute.claims [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Claim successful on node compute-1.ctlplane.example.com
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.605 2 DEBUG nova.compute.provider_tree [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.621 2 DEBUG nova.scheduler.client.report [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.649 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.650 2 DEBUG nova.compute.manager [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.717 2 DEBUG nova.compute.manager [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.718 2 DEBUG nova.network.neutron [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.737 2 INFO nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.764 2 DEBUG nova.compute.manager [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.909 2 DEBUG nova.compute.manager [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.911 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.911 2 INFO nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Creating image(s)#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.912 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "/var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.912 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "/var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.913 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "/var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.930 2 DEBUG oslo_concurrency.processutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.985 2 DEBUG nova.policy [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.988 2 DEBUG oslo_concurrency.processutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.989 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.989 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:10:59 np0005466012 nova_compute[192063]: 2025-10-02 12:10:59.999 2 DEBUG oslo_concurrency.processutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.053 2 DEBUG oslo_concurrency.processutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.054 2 DEBUG oslo_concurrency.processutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.094 2 DEBUG oslo_concurrency.processutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.095 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.096 2 DEBUG oslo_concurrency.processutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.151 2 DEBUG oslo_concurrency.processutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.152 2 DEBUG nova.virt.disk.api [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Checking if we can resize image /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.152 2 DEBUG oslo_concurrency.processutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.212 2 DEBUG oslo_concurrency.processutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.213 2 DEBUG nova.virt.disk.api [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Cannot resize image /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.213 2 DEBUG nova.objects.instance [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lazy-loading 'migration_context' on Instance uuid 0753ad57-d509-4a98-bba1-e9b29c087474 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.227 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.227 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Ensure instance console log exists: /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.227 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.228 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.228 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:00 np0005466012 nova_compute[192063]: 2025-10-02 12:11:00.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:01 np0005466012 nova_compute[192063]: 2025-10-02 12:11:01.111 2 DEBUG nova.compute.manager [req-4fac8c90-a11b-4f09-bad4-2c574c277cb9 req-d0245613-00d8-4482-a99a-f9447d516d5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-changed-a2b38e2a-6b92-4e68-ae24-ea094847d75b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:01 np0005466012 nova_compute[192063]: 2025-10-02 12:11:01.112 2 DEBUG nova.compute.manager [req-4fac8c90-a11b-4f09-bad4-2c574c277cb9 req-d0245613-00d8-4482-a99a-f9447d516d5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Refreshing instance network info cache due to event network-changed-a2b38e2a-6b92-4e68-ae24-ea094847d75b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:01 np0005466012 nova_compute[192063]: 2025-10-02 12:11:01.112 2 DEBUG oslo_concurrency.lockutils [req-4fac8c90-a11b-4f09-bad4-2c574c277cb9 req-d0245613-00d8-4482-a99a-f9447d516d5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:01 np0005466012 nova_compute[192063]: 2025-10-02 12:11:01.182 2 DEBUG nova.network.neutron [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updated VIF entry in instance network info cache for port a2b38e2a-6b92-4e68-ae24-ea094847d75b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:01 np0005466012 nova_compute[192063]: 2025-10-02 12:11:01.183 2 DEBUG nova.network.neutron [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updating instance_info_cache with network_info: [{"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:01 np0005466012 nova_compute[192063]: 2025-10-02 12:11:01.205 2 DEBUG oslo_concurrency.lockutils [req-428a3fde-fab2-4c6c-bdd2-a5e942d7510c req-377896e7-6f12-4b10-8b59-87510a71099a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:01 np0005466012 nova_compute[192063]: 2025-10-02 12:11:01.206 2 DEBUG oslo_concurrency.lockutils [req-4fac8c90-a11b-4f09-bad4-2c574c277cb9 req-d0245613-00d8-4482-a99a-f9447d516d5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:01 np0005466012 nova_compute[192063]: 2025-10-02 12:11:01.206 2 DEBUG nova.network.neutron [req-4fac8c90-a11b-4f09-bad4-2c574c277cb9 req-d0245613-00d8-4482-a99a-f9447d516d5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Refreshing network info cache for port a2b38e2a-6b92-4e68-ae24-ea094847d75b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:01 np0005466012 nova_compute[192063]: 2025-10-02 12:11:01.486 2 DEBUG oslo_concurrency.lockutils [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "interface-02e1c250-b902-42fe-a5cf-af66aa02e2bc-5c564602-5be7-47b0-858a-52eed7fcfd09" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:01 np0005466012 nova_compute[192063]: 2025-10-02 12:11:01.486 2 DEBUG oslo_concurrency.lockutils [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-02e1c250-b902-42fe-a5cf-af66aa02e2bc-5c564602-5be7-47b0-858a-52eed7fcfd09" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:01 np0005466012 nova_compute[192063]: 2025-10-02 12:11:01.487 2 DEBUG nova.objects.instance [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'flavor' on Instance uuid 02e1c250-b902-42fe-a5cf-af66aa02e2bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:01 np0005466012 nova_compute[192063]: 2025-10-02 12:11:01.661 2 DEBUG nova.network.neutron [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Successfully created port: 7332d0b6-e5f0-41e2-aa18-69453b2d2b21 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:11:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:02.121 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:02.122 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:02.122 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:02 np0005466012 nova_compute[192063]: 2025-10-02 12:11:02.554 2 DEBUG nova.objects.instance [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'pci_requests' on Instance uuid 02e1c250-b902-42fe-a5cf-af66aa02e2bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:02 np0005466012 nova_compute[192063]: 2025-10-02 12:11:02.568 2 DEBUG nova.network.neutron [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:11:02 np0005466012 nova_compute[192063]: 2025-10-02 12:11:02.632 2 DEBUG nova.network.neutron [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Successfully updated port: 7332d0b6-e5f0-41e2-aa18-69453b2d2b21 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:11:02 np0005466012 nova_compute[192063]: 2025-10-02 12:11:02.646 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "refresh_cache-0753ad57-d509-4a98-bba1-e9b29c087474" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:02 np0005466012 nova_compute[192063]: 2025-10-02 12:11:02.647 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquired lock "refresh_cache-0753ad57-d509-4a98-bba1-e9b29c087474" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:02 np0005466012 nova_compute[192063]: 2025-10-02 12:11:02.647 2 DEBUG nova.network.neutron [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:11:03 np0005466012 nova_compute[192063]: 2025-10-02 12:11:03.027 2 DEBUG nova.network.neutron [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:11:03 np0005466012 nova_compute[192063]: 2025-10-02 12:11:03.142 2 DEBUG nova.network.neutron [req-4fac8c90-a11b-4f09-bad4-2c574c277cb9 req-d0245613-00d8-4482-a99a-f9447d516d5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updated VIF entry in instance network info cache for port a2b38e2a-6b92-4e68-ae24-ea094847d75b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:03 np0005466012 nova_compute[192063]: 2025-10-02 12:11:03.143 2 DEBUG nova.network.neutron [req-4fac8c90-a11b-4f09-bad4-2c574c277cb9 req-d0245613-00d8-4482-a99a-f9447d516d5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updating instance_info_cache with network_info: [{"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:03 np0005466012 nova_compute[192063]: 2025-10-02 12:11:03.176 2 DEBUG oslo_concurrency.lockutils [req-4fac8c90-a11b-4f09-bad4-2c574c277cb9 req-d0245613-00d8-4482-a99a-f9447d516d5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:03 np0005466012 nova_compute[192063]: 2025-10-02 12:11:03.208 2 DEBUG nova.policy [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:11:03 np0005466012 nova_compute[192063]: 2025-10-02 12:11:03.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:04 np0005466012 podman[227209]: 2025-10-02 12:11:04.159571138 +0000 UTC m=+0.072527343 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.165 2 DEBUG nova.network.neutron [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Updating instance_info_cache with network_info: [{"id": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "address": "fa:16:3e:1b:69:e3", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7332d0b6-e5", "ovs_interfaceid": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.184 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Releasing lock "refresh_cache-0753ad57-d509-4a98-bba1-e9b29c087474" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.185 2 DEBUG nova.compute.manager [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Instance network_info: |[{"id": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "address": "fa:16:3e:1b:69:e3", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7332d0b6-e5", "ovs_interfaceid": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.187 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Start _get_guest_xml network_info=[{"id": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "address": "fa:16:3e:1b:69:e3", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7332d0b6-e5", "ovs_interfaceid": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.192 2 WARNING nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.197 2 DEBUG nova.virt.libvirt.host [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.197 2 DEBUG nova.virt.libvirt.host [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:11:04 np0005466012 podman[227210]: 2025-10-02 12:11:04.200380097 +0000 UTC m=+0.117638557 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.210 2 DEBUG nova.virt.libvirt.host [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.211 2 DEBUG nova.virt.libvirt.host [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.212 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.212 2 DEBUG nova.virt.hardware [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.213 2 DEBUG nova.virt.hardware [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.213 2 DEBUG nova.virt.hardware [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.213 2 DEBUG nova.virt.hardware [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.214 2 DEBUG nova.virt.hardware [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.214 2 DEBUG nova.virt.hardware [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.214 2 DEBUG nova.virt.hardware [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.215 2 DEBUG nova.virt.hardware [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.215 2 DEBUG nova.virt.hardware [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.215 2 DEBUG nova.virt.hardware [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.216 2 DEBUG nova.virt.hardware [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.220 2 DEBUG nova.virt.libvirt.vif [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1120877738',display_name='tempest-ServerActionsTestOtherA-server-1120877738',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1120877738',id=56,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMxV3/3UVj3iLiv8GHkV/x6VYyYPVFG5yThfPAWdnRtPLxRt5nk8D+Dtcmc6m48b1gfoKmcnooDopojNsfnOakPU7WA24nbcaEk0vNw9hR38BD9zJ2a+hy7fQOi0lwh9QA==',key_name='tempest-keypair-1208414249',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20417475a6a149d5bc47976f4da9a4ae',ramdisk_id='',reservation_id='r-ooa2t6mh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-352727288',owner_user_name='tempest-ServerActionsTestOtherA-352727288-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:10:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c2b9eab3da414692b3942505e3441920',uuid=0753ad57-d509-4a98-bba1-e9b29c087474,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "address": "fa:16:3e:1b:69:e3", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7332d0b6-e5", "ovs_interfaceid": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.220 2 DEBUG nova.network.os_vif_util [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Converting VIF {"id": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "address": "fa:16:3e:1b:69:e3", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7332d0b6-e5", "ovs_interfaceid": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.221 2 DEBUG nova.network.os_vif_util [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:69:e3,bridge_name='br-int',has_traffic_filtering=True,id=7332d0b6-e5f0-41e2-aa18-69453b2d2b21,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7332d0b6-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.222 2 DEBUG nova.objects.instance [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lazy-loading 'pci_devices' on Instance uuid 0753ad57-d509-4a98-bba1-e9b29c087474 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.235 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  <uuid>0753ad57-d509-4a98-bba1-e9b29c087474</uuid>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  <name>instance-00000038</name>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerActionsTestOtherA-server-1120877738</nova:name>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:11:04</nova:creationTime>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:11:04 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:        <nova:user uuid="c2b9eab3da414692b3942505e3441920">tempest-ServerActionsTestOtherA-352727288-project-member</nova:user>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:        <nova:project uuid="20417475a6a149d5bc47976f4da9a4ae">tempest-ServerActionsTestOtherA-352727288</nova:project>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:        <nova:port uuid="7332d0b6-e5f0-41e2-aa18-69453b2d2b21">
Oct  2 08:11:04 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <entry name="serial">0753ad57-d509-4a98-bba1-e9b29c087474</entry>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <entry name="uuid">0753ad57-d509-4a98-bba1-e9b29c087474</entry>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk.config"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:1b:69:e3"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <target dev="tap7332d0b6-e5"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/console.log" append="off"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:11:04 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:11:04 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:11:04 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:11:04 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.237 2 DEBUG nova.compute.manager [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Preparing to wait for external event network-vif-plugged-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.237 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.238 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.238 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.239 2 DEBUG nova.virt.libvirt.vif [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1120877738',display_name='tempest-ServerActionsTestOtherA-server-1120877738',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1120877738',id=56,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMxV3/3UVj3iLiv8GHkV/x6VYyYPVFG5yThfPAWdnRtPLxRt5nk8D+Dtcmc6m48b1gfoKmcnooDopojNsfnOakPU7WA24nbcaEk0vNw9hR38BD9zJ2a+hy7fQOi0lwh9QA==',key_name='tempest-keypair-1208414249',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20417475a6a149d5bc47976f4da9a4ae',ramdisk_id='',reservation_id='r-ooa2t6mh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-352727288',owner_user_name='tempest-ServerActionsTestOtherA-352727288-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:10:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c2b9eab3da414692b3942505e3441920',uuid=0753ad57-d509-4a98-bba1-e9b29c087474,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "address": "fa:16:3e:1b:69:e3", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7332d0b6-e5", "ovs_interfaceid": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.240 2 DEBUG nova.network.os_vif_util [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Converting VIF {"id": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "address": "fa:16:3e:1b:69:e3", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7332d0b6-e5", "ovs_interfaceid": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.240 2 DEBUG nova.network.os_vif_util [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:69:e3,bridge_name='br-int',has_traffic_filtering=True,id=7332d0b6-e5f0-41e2-aa18-69453b2d2b21,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7332d0b6-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.241 2 DEBUG os_vif [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:69:e3,bridge_name='br-int',has_traffic_filtering=True,id=7332d0b6-e5f0-41e2-aa18-69453b2d2b21,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7332d0b6-e5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.242 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.243 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.247 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7332d0b6-e5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.248 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7332d0b6-e5, col_values=(('external_ids', {'iface-id': '7332d0b6-e5f0-41e2-aa18-69453b2d2b21', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1b:69:e3', 'vm-uuid': '0753ad57-d509-4a98-bba1-e9b29c087474'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:04 np0005466012 NetworkManager[51207]: <info>  [1759407064.2508] manager: (tap7332d0b6-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.259 2 INFO os_vif [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:69:e3,bridge_name='br-int',has_traffic_filtering=True,id=7332d0b6-e5f0-41e2-aa18-69453b2d2b21,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7332d0b6-e5')#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.312 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.312 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.312 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] No VIF found with MAC fa:16:3e:1b:69:e3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.313 2 INFO nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Using config drive#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.963 2 INFO nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Creating config drive at /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk.config#033[00m
Oct  2 08:11:04 np0005466012 nova_compute[192063]: 2025-10-02 12:11:04.968 2 DEBUG oslo_concurrency.processutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpipyb74ra execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.096 2 DEBUG oslo_concurrency.processutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpipyb74ra" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:05 np0005466012 kernel: tap7332d0b6-e5: entered promiscuous mode
Oct  2 08:11:05 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:05Z|00207|binding|INFO|Claiming lport 7332d0b6-e5f0-41e2-aa18-69453b2d2b21 for this chassis.
Oct  2 08:11:05 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:05Z|00208|binding|INFO|7332d0b6-e5f0-41e2-aa18-69453b2d2b21: Claiming fa:16:3e:1b:69:e3 10.100.0.12
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466012 NetworkManager[51207]: <info>  [1759407065.1583] manager: (tap7332d0b6-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.161 2 DEBUG nova.network.neutron [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Successfully updated port: 5c564602-5be7-47b0-858a-52eed7fcfd09 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.167 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:69:e3 10.100.0.12'], port_security=['fa:16:3e:1b:69:e3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20417475a6a149d5bc47976f4da9a4ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5535fb48-d673-47c4-b26e-f6f2718957b0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8a937e8-285b-47d1-b87a-47c75465be5a, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=7332d0b6-e5f0-41e2-aa18-69453b2d2b21) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.169 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 7332d0b6-e5f0-41e2-aa18-69453b2d2b21 in datapath 2bdfd186-139e-456a-92e9-4dc9c37a846a bound to our chassis#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.170 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2bdfd186-139e-456a-92e9-4dc9c37a846a#033[00m
Oct  2 08:11:05 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:05Z|00209|binding|INFO|Setting lport 7332d0b6-e5f0-41e2-aa18-69453b2d2b21 ovn-installed in OVS
Oct  2 08:11:05 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:05Z|00210|binding|INFO|Setting lport 7332d0b6-e5f0-41e2-aa18-69453b2d2b21 up in Southbound
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.175 2 DEBUG oslo_concurrency.lockutils [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.175 2 DEBUG oslo_concurrency.lockutils [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquired lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.176 2 DEBUG nova.network.neutron [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.185 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a3329d4d-e94b-4625-9ea7-de40d26a2c51]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.186 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2bdfd186-11 in ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.187 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2bdfd186-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.187 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9dba83b0-dfa4-4fa8-abc1-c763cb01db63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.188 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4a50e263-e3a8-44c9-8de5-63b1c898ac44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 systemd-udevd[227280]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:11:05 np0005466012 systemd-machined[152114]: New machine qemu-25-instance-00000038.
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.202 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[9358346f-5bba-4131-b620-df7cb8c632d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 NetworkManager[51207]: <info>  [1759407065.2082] device (tap7332d0b6-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:11:05 np0005466012 NetworkManager[51207]: <info>  [1759407065.2090] device (tap7332d0b6-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:11:05 np0005466012 systemd[1]: Started Virtual Machine qemu-25-instance-00000038.
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.216 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7759e0f2-1228-4c0b-9aef-721ec86399ae]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.242 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[df382a79-335e-493e-bcb7-24f4ac5b5d43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 NetworkManager[51207]: <info>  [1759407065.2466] manager: (tap2bdfd186-10): new Veth device (/org/freedesktop/NetworkManager/Devices/99)
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.248 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[653e7712-0d26-49f6-8cff-a632809a94ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.285 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[cb025831-3244-47cc-bb69-73d45e8b289a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.289 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[f14ac8db-9063-4a5b-923a-0dc94feae8ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466012 NetworkManager[51207]: <info>  [1759407065.3136] device (tap2bdfd186-10): carrier: link connected
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.319 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[e20a1e73-252e-4caf-9f7e-a0d6029b47e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.341 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0a50a5d3-672f-49e5-b54e-0b4c4c63794b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2bdfd186-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:b7:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505892, 'reachable_time': 17879, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227311, 'error': None, 'target': 'ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.357 2 DEBUG nova.compute.manager [req-7ca6ba04-3884-4a16-8987-fac23e063a38 req-5b724561-f39d-49fd-84fa-6ab58e6a3f52 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-changed-5c564602-5be7-47b0-858a-52eed7fcfd09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.358 2 DEBUG nova.compute.manager [req-7ca6ba04-3884-4a16-8987-fac23e063a38 req-5b724561-f39d-49fd-84fa-6ab58e6a3f52 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Refreshing instance network info cache due to event network-changed-5c564602-5be7-47b0-858a-52eed7fcfd09. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.358 2 DEBUG oslo_concurrency.lockutils [req-7ca6ba04-3884-4a16-8987-fac23e063a38 req-5b724561-f39d-49fd-84fa-6ab58e6a3f52 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.366 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9ccc3207-1ffc-4169-9995-b065c143d531]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:b789'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505892, 'tstamp': 505892}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227312, 'error': None, 'target': 'ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.383 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[86c88249-b564-49c3-bb84-0c58e6750838]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2bdfd186-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:b7:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505892, 'reachable_time': 17879, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227313, 'error': None, 'target': 'ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.403 2 WARNING nova.network.neutron [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] 7d845a33-56e0-4850-9f27-8a54095796f2 already exists in list: networks containing: ['7d845a33-56e0-4850-9f27-8a54095796f2']. ignoring it#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.420 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7f19c4f1-0eff-4c87-8e73-30e255ffac05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.490 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca75132-13a2-46e8-bdfc-a5dee79a814c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.491 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2bdfd186-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.492 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.492 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2bdfd186-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:05 np0005466012 NetworkManager[51207]: <info>  [1759407065.4946] manager: (tap2bdfd186-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Oct  2 08:11:05 np0005466012 kernel: tap2bdfd186-10: entered promiscuous mode
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.497 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2bdfd186-10, col_values=(('external_ids', {'iface-id': '1e2d82b4-a363-4c19-94d1-e62c1ba8e34a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:05 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:05Z|00211|binding|INFO|Releasing lport 1e2d82b4-a363-4c19-94d1-e62c1ba8e34a from this chassis (sb_readonly=0)
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.519 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2bdfd186-139e-456a-92e9-4dc9c37a846a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2bdfd186-139e-456a-92e9-4dc9c37a846a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.520 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3a03edbb-7f5b-4c08-9d88-675653a019a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.521 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-2bdfd186-139e-456a-92e9-4dc9c37a846a
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/2bdfd186-139e-456a-92e9-4dc9c37a846a.pid.haproxy
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 2bdfd186-139e-456a-92e9-4dc9c37a846a
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:11:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:05.522 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'env', 'PROCESS_TAG=haproxy-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2bdfd186-139e-456a-92e9-4dc9c37a846a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.585 2 DEBUG nova.compute.manager [req-496ec1fb-a15c-421a-96c2-75e0f10cfdce req-af0ca04c-8002-4a95-9c5d-4330cabb7895 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Received event network-vif-plugged-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.586 2 DEBUG oslo_concurrency.lockutils [req-496ec1fb-a15c-421a-96c2-75e0f10cfdce req-af0ca04c-8002-4a95-9c5d-4330cabb7895 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.586 2 DEBUG oslo_concurrency.lockutils [req-496ec1fb-a15c-421a-96c2-75e0f10cfdce req-af0ca04c-8002-4a95-9c5d-4330cabb7895 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.586 2 DEBUG oslo_concurrency.lockutils [req-496ec1fb-a15c-421a-96c2-75e0f10cfdce req-af0ca04c-8002-4a95-9c5d-4330cabb7895 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:05 np0005466012 nova_compute[192063]: 2025-10-02 12:11:05.586 2 DEBUG nova.compute.manager [req-496ec1fb-a15c-421a-96c2-75e0f10cfdce req-af0ca04c-8002-4a95-9c5d-4330cabb7895 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Processing event network-vif-plugged-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:11:05 np0005466012 podman[227351]: 2025-10-02 12:11:05.913369776 +0000 UTC m=+0.059916173 container create b8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:11:05 np0005466012 systemd[1]: Started libpod-conmon-b8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae.scope.
Oct  2 08:11:05 np0005466012 podman[227351]: 2025-10-02 12:11:05.875869457 +0000 UTC m=+0.022415844 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:11:05 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:11:05 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b5655323f2097bda224bea88ef8f023634f3733c5c2d2c668210e03f13adc7c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:11:06 np0005466012 podman[227351]: 2025-10-02 12:11:06.005748793 +0000 UTC m=+0.152295170 container init b8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:11:06 np0005466012 podman[227351]: 2025-10-02 12:11:06.012598268 +0000 UTC m=+0.159144635 container start b8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:11:06 np0005466012 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[227367]: [NOTICE]   (227371) : New worker (227373) forked
Oct  2 08:11:06 np0005466012 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[227367]: [NOTICE]   (227371) : Loading success.
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.058 2 DEBUG nova.compute.manager [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.059 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407066.0575502, 0753ad57-d509-4a98-bba1-e9b29c087474 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.059 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] VM Started (Lifecycle Event)#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.062 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.065 2 INFO nova.virt.libvirt.driver [-] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Instance spawned successfully.#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.066 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.082 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.092 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.093 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.094 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.095 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.096 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.097 2 DEBUG nova.virt.libvirt.driver [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.102 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.138 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.139 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407066.058564, 0753ad57-d509-4a98-bba1-e9b29c087474 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.139 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.179 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.185 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407066.0605435, 0753ad57-d509-4a98-bba1-e9b29c087474 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.185 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.190 2 INFO nova.compute.manager [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Took 6.28 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.190 2 DEBUG nova.compute.manager [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.214 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.219 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.245 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.458 2 INFO nova.compute.manager [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Took 7.09 seconds to build instance.#033[00m
Oct  2 08:11:06 np0005466012 nova_compute[192063]: 2025-10-02 12:11:06.555 2 DEBUG oslo_concurrency.lockutils [None req-74d8866f-e25c-4166-9f73-926e31a558b9 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:07 np0005466012 podman[227382]: 2025-10-02 12:11:07.139720027 +0000 UTC m=+0.057779017 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:11:07 np0005466012 nova_compute[192063]: 2025-10-02 12:11:07.536 2 DEBUG nova.compute.manager [req-229ab225-9d92-462d-8aa1-5c0cdd69fe98 req-7a2641d0-fa44-4f9f-9e57-b08954385584 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Received event network-changed-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:07 np0005466012 nova_compute[192063]: 2025-10-02 12:11:07.537 2 DEBUG nova.compute.manager [req-229ab225-9d92-462d-8aa1-5c0cdd69fe98 req-7a2641d0-fa44-4f9f-9e57-b08954385584 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Refreshing instance network info cache due to event network-changed-7332d0b6-e5f0-41e2-aa18-69453b2d2b21. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:07 np0005466012 nova_compute[192063]: 2025-10-02 12:11:07.538 2 DEBUG oslo_concurrency.lockutils [req-229ab225-9d92-462d-8aa1-5c0cdd69fe98 req-7a2641d0-fa44-4f9f-9e57-b08954385584 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-0753ad57-d509-4a98-bba1-e9b29c087474" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:07 np0005466012 nova_compute[192063]: 2025-10-02 12:11:07.539 2 DEBUG oslo_concurrency.lockutils [req-229ab225-9d92-462d-8aa1-5c0cdd69fe98 req-7a2641d0-fa44-4f9f-9e57-b08954385584 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-0753ad57-d509-4a98-bba1-e9b29c087474" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:07 np0005466012 nova_compute[192063]: 2025-10-02 12:11:07.539 2 DEBUG nova.network.neutron [req-229ab225-9d92-462d-8aa1-5c0cdd69fe98 req-7a2641d0-fa44-4f9f-9e57-b08954385584 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Refreshing network info cache for port 7332d0b6-e5f0-41e2-aa18-69453b2d2b21 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:07 np0005466012 nova_compute[192063]: 2025-10-02 12:11:07.694 2 DEBUG nova.compute.manager [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Received event network-vif-plugged-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:07 np0005466012 nova_compute[192063]: 2025-10-02 12:11:07.694 2 DEBUG oslo_concurrency.lockutils [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:07 np0005466012 nova_compute[192063]: 2025-10-02 12:11:07.694 2 DEBUG oslo_concurrency.lockutils [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:07 np0005466012 nova_compute[192063]: 2025-10-02 12:11:07.695 2 DEBUG oslo_concurrency.lockutils [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:07 np0005466012 nova_compute[192063]: 2025-10-02 12:11:07.695 2 DEBUG nova.compute.manager [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] No waiting events found dispatching network-vif-plugged-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:07 np0005466012 nova_compute[192063]: 2025-10-02 12:11:07.695 2 WARNING nova.compute.manager [req-24f74341-8e36-40dd-a896-21504c75bd7f req-6e4b9bd4-c06e-4ef9-a85d-b0792b297383 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Received unexpected event network-vif-plugged-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.386 2 DEBUG nova.network.neutron [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updating instance_info_cache with network_info: [{"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.413 2 DEBUG oslo_concurrency.lockutils [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Releasing lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.414 2 DEBUG oslo_concurrency.lockutils [req-7ca6ba04-3884-4a16-8987-fac23e063a38 req-5b724561-f39d-49fd-84fa-6ab58e6a3f52 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.414 2 DEBUG nova.network.neutron [req-7ca6ba04-3884-4a16-8987-fac23e063a38 req-5b724561-f39d-49fd-84fa-6ab58e6a3f52 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Refreshing network info cache for port 5c564602-5be7-47b0-858a-52eed7fcfd09 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.420 2 DEBUG nova.virt.libvirt.vif [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2087694336',display_name='tempest-tempest.common.compute-instance-2087694336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2087694336',id=52,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:10:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-lc6yv2ji',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:10:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=02e1c250-b902-42fe-a5cf-af66aa02e2bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.421 2 DEBUG nova.network.os_vif_util [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.421 2 DEBUG nova.network.os_vif_util [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.422 2 DEBUG os_vif [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.423 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.423 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.425 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c564602-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.426 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5c564602-5b, col_values=(('external_ids', {'iface-id': '5c564602-5be7-47b0-858a-52eed7fcfd09', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:3c:22', 'vm-uuid': '02e1c250-b902-42fe-a5cf-af66aa02e2bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005466012 NetworkManager[51207]: <info>  [1759407068.4284] manager: (tap5c564602-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.435 2 INFO os_vif [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b')#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.435 2 DEBUG nova.virt.libvirt.vif [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2087694336',display_name='tempest-tempest.common.compute-instance-2087694336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2087694336',id=52,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:10:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-lc6yv2ji',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:10:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=02e1c250-b902-42fe-a5cf-af66aa02e2bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.436 2 DEBUG nova.network.os_vif_util [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.436 2 DEBUG nova.network.os_vif_util [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.439 2 DEBUG nova.virt.libvirt.guest [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] attach device xml: <interface type="ethernet">
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:2d:3c:22"/>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  <target dev="tap5c564602-5b"/>
Oct  2 08:11:08 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:11:08 np0005466012 nova_compute[192063]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:11:08 np0005466012 kernel: tap5c564602-5b: entered promiscuous mode
Oct  2 08:11:08 np0005466012 NetworkManager[51207]: <info>  [1759407068.4574] manager: (tap5c564602-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/102)
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:08Z|00212|binding|INFO|Claiming lport 5c564602-5be7-47b0-858a-52eed7fcfd09 for this chassis.
Oct  2 08:11:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:08Z|00213|binding|INFO|5c564602-5be7-47b0-858a-52eed7fcfd09: Claiming fa:16:3e:2d:3c:22 10.100.0.7
Oct  2 08:11:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:08Z|00214|binding|INFO|Setting lport 5c564602-5be7-47b0-858a-52eed7fcfd09 ovn-installed in OVS
Oct  2 08:11:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:08Z|00215|binding|INFO|Setting lport 5c564602-5be7-47b0-858a-52eed7fcfd09 up in Southbound
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.470 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:3c:22 10.100.0.7'], port_security=['fa:16:3e:2d:3c:22 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-126745253', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-126745253', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e26b972b-3ab5-401c-9d8b-5161665ba680', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=5c564602-5be7-47b0-858a-52eed7fcfd09) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.473 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 5c564602-5be7-47b0-858a-52eed7fcfd09 in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 bound to our chassis#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.479 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d845a33-56e0-4850-9f27-8a54095796f2#033[00m
Oct  2 08:11:08 np0005466012 systemd-udevd[227409]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.496 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7379db52-8848-4de9-a099-434f02a33825]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005466012 NetworkManager[51207]: <info>  [1759407068.5055] device (tap5c564602-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:11:08 np0005466012 NetworkManager[51207]: <info>  [1759407068.5062] device (tap5c564602-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.543 2 DEBUG nova.virt.libvirt.driver [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.544 2 DEBUG nova.virt.libvirt.driver [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.544 2 DEBUG nova.virt.libvirt.driver [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:e9:21:3e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.544 2 DEBUG nova.virt.libvirt.driver [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] No VIF found with MAC fa:16:3e:2d:3c:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.549 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[d55a1064-f37f-48a5-82e0-b3799b4ff286]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.552 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[798814b2-0fac-44ad-b3df-f678c2f85ea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.577 2 DEBUG nova.virt.libvirt.guest [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  <nova:name>tempest-tempest.common.compute-instance-2087694336</nova:name>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:11:08</nova:creationTime>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:11:08 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:    <nova:port uuid="a2b38e2a-6b92-4e68-ae24-ea094847d75b">
Oct  2 08:11:08 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:    <nova:port uuid="5c564602-5be7-47b0-858a-52eed7fcfd09">
Oct  2 08:11:08 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:11:08 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:11:08 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:11:08 np0005466012 nova_compute[192063]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.580 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[5ca04c7d-12ac-41e4-a014-eba8bdd453c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.597 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3072659b-8b6f-4087-96ff-958e111298d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501832, 'reachable_time': 23587, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227416, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.612 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fe1551b4-7f08-4953-b353-a051394a08c7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501843, 'tstamp': 501843}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227417, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501847, 'tstamp': 501847}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227417, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.614 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.616 2 DEBUG oslo_concurrency.lockutils [None req-508f8f84-6b45-4c81-a413-0b59637d84bc fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-02e1c250-b902-42fe-a5cf-af66aa02e2bc-5c564602-5be7-47b0-858a-52eed7fcfd09" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.617 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d845a33-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.617 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:08 np0005466012 nova_compute[192063]: 2025-10-02 12:11:08.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.618 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d845a33-50, col_values=(('external_ids', {'iface-id': '1c321c19-d630-4a6f-8ba8-7bac90af9bae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:08.618 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.223 2 DEBUG nova.network.neutron [req-229ab225-9d92-462d-8aa1-5c0cdd69fe98 req-7a2641d0-fa44-4f9f-9e57-b08954385584 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Updated VIF entry in instance network info cache for port 7332d0b6-e5f0-41e2-aa18-69453b2d2b21. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.224 2 DEBUG nova.network.neutron [req-229ab225-9d92-462d-8aa1-5c0cdd69fe98 req-7a2641d0-fa44-4f9f-9e57-b08954385584 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Updating instance_info_cache with network_info: [{"id": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "address": "fa:16:3e:1b:69:e3", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7332d0b6-e5", "ovs_interfaceid": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.242 2 DEBUG oslo_concurrency.lockutils [req-229ab225-9d92-462d-8aa1-5c0cdd69fe98 req-7a2641d0-fa44-4f9f-9e57-b08954385584 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-0753ad57-d509-4a98-bba1-e9b29c087474" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.456 2 DEBUG oslo_concurrency.lockutils [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "interface-02e1c250-b902-42fe-a5cf-af66aa02e2bc-5c564602-5be7-47b0-858a-52eed7fcfd09" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.457 2 DEBUG oslo_concurrency.lockutils [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-02e1c250-b902-42fe-a5cf-af66aa02e2bc-5c564602-5be7-47b0-858a-52eed7fcfd09" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.480 2 DEBUG nova.objects.instance [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'flavor' on Instance uuid 02e1c250-b902-42fe-a5cf-af66aa02e2bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.532 2 DEBUG nova.virt.libvirt.vif [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2087694336',display_name='tempest-tempest.common.compute-instance-2087694336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2087694336',id=52,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:10:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-lc6yv2ji',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:10:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=02e1c250-b902-42fe-a5cf-af66aa02e2bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.533 2 DEBUG nova.network.os_vif_util [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.533 2 DEBUG nova.network.os_vif_util [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.536 2 DEBUG nova.virt.libvirt.guest [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2d:3c:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5c564602-5b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.537 2 DEBUG nova.virt.libvirt.guest [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2d:3c:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5c564602-5b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.539 2 DEBUG nova.virt.libvirt.driver [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Attempting to detach device tap5c564602-5b from instance 02e1c250-b902-42fe-a5cf-af66aa02e2bc from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.539 2 DEBUG nova.virt.libvirt.guest [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:2d:3c:22"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <target dev="tap5c564602-5b"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:11:09 np0005466012 nova_compute[192063]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.557 2 DEBUG nova.virt.libvirt.guest [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2d:3c:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5c564602-5b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.562 2 DEBUG nova.virt.libvirt.guest [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2d:3c:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5c564602-5b"/></interface>not found in domain: <domain type='kvm' id='24'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <name>instance-00000034</name>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <uuid>02e1c250-b902-42fe-a5cf-af66aa02e2bc</uuid>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:name>tempest-tempest.common.compute-instance-2087694336</nova:name>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:11:08</nova:creationTime>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:port uuid="a2b38e2a-6b92-4e68-ae24-ea094847d75b">
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:port uuid="5c564602-5be7-47b0-858a-52eed7fcfd09">
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:11:09 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <memory unit='KiB'>131072</memory>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <resource>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <partition>/machine</partition>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </resource>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <sysinfo type='smbios'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <entry name='serial'>02e1c250-b902-42fe-a5cf-af66aa02e2bc</entry>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <entry name='uuid'>02e1c250-b902-42fe-a5cf-af66aa02e2bc</entry>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <boot dev='hd'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <smbios mode='sysinfo'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <vmcoreinfo state='on'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <feature policy='require' name='x2apic'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <feature policy='require' name='vme'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <clock offset='utc'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <timer name='hpet' present='no'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <on_reboot>restart</on_reboot>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <on_crash>destroy</on_crash>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <disk type='file' device='disk'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk' index='2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <backingStore type='file' index='3'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:        <format type='raw'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:        <backingStore/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      </backingStore>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target dev='vda' bus='virtio'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='virtio-disk0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <disk type='file' device='cdrom'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.config' index='1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <backingStore/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target dev='sda' bus='sata'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <readonly/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='sata0-0-0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pcie.0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='1' port='0x10'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='2' port='0x11'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='3' port='0x12'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.3'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='4' port='0x13'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.4'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='5' port='0x14'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.5'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='6' port='0x15'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.6'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='7' port='0x16'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.7'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='8' port='0x17'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.8'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='9' port='0x18'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.9'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='10' port='0x19'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.10'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='11' port='0x1a'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.11'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='12' port='0x1b'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.12'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='13' port='0x1c'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.13'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='14' port='0x1d'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.14'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='15' port='0x1e'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.15'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='16' port='0x1f'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.16'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='17' port='0x20'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.17'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='18' port='0x21'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.18'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='19' port='0x22'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.19'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='20' port='0x23'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.20'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='21' port='0x24'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.21'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='22' port='0x25'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.22'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='23' port='0x26'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.23'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='24' port='0x27'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.24'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='25' port='0x28'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.25'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-pci-bridge'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.26'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='usb'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='sata' index='0'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='ide'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:e9:21:3e'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target dev='tapa2b38e2a-6b'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='net0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:2d:3c:22'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target dev='tap5c564602-5b'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='net1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <serial type='pty'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <source path='/dev/pts/0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/console.log' append='off'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target type='isa-serial' port='0'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:        <model name='isa-serial'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      </target>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <console type='pty' tty='/dev/pts/0'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <source path='/dev/pts/0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/console.log' append='off'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target type='serial' port='0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </console>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <input type='tablet' bus='usb'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='input0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <input type='mouse' bus='ps2'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='input1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <input type='keyboard' bus='ps2'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='input2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <listen type='address' address='::0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <audio id='1' type='none'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='video0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <watchdog model='itco' action='reset'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='watchdog0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </watchdog>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <memballoon model='virtio'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <stats period='10'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='balloon0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <rng model='virtio'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='rng0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <label>system_u:system_r:svirt_t:s0:c197,c381</label>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c197,c381</imagelabel>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <label>+107:+107</label>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:11:09 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:11:09 np0005466012 nova_compute[192063]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.563 2 INFO nova.virt.libvirt.driver [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully detached device tap5c564602-5b from instance 02e1c250-b902-42fe-a5cf-af66aa02e2bc from the persistent domain config.
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.563 2 DEBUG nova.virt.libvirt.driver [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] (1/8): Attempting to detach device tap5c564602-5b with device alias net1 from instance 02e1c250-b902-42fe-a5cf-af66aa02e2bc from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.564 2 DEBUG nova.virt.libvirt.guest [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:2d:3c:22"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <target dev="tap5c564602-5b"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:11:09 np0005466012 nova_compute[192063]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct  2 08:11:09 np0005466012 kernel: tap5c564602-5b (unregistering): left promiscuous mode
Oct  2 08:11:09 np0005466012 NetworkManager[51207]: <info>  [1759407069.6097] device (tap5c564602-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:11:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:09Z|00216|binding|INFO|Releasing lport 5c564602-5be7-47b0-858a-52eed7fcfd09 from this chassis (sb_readonly=0)
Oct  2 08:11:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:09Z|00217|binding|INFO|Setting lport 5c564602-5be7-47b0-858a-52eed7fcfd09 down in Southbound
Oct  2 08:11:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:09Z|00218|binding|INFO|Removing iface tap5c564602-5b ovn-installed in OVS
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.619 2 DEBUG nova.virt.libvirt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Received event <DeviceRemovedEvent: 1759407069.617783, 02e1c250-b902-42fe-a5cf-af66aa02e2bc => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.624 2 DEBUG nova.virt.libvirt.driver [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Start waiting for the detach event from libvirt for device tap5c564602-5b with device alias net1 for instance 02e1c250-b902-42fe-a5cf-af66aa02e2bc _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.624 2 DEBUG nova.virt.libvirt.guest [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2d:3c:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5c564602-5b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.625 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:3c:22 10.100.0.7'], port_security=['fa:16:3e:2d:3c:22 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-126745253', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-126745253', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e26b972b-3ab5-401c-9d8b-5161665ba680', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=5c564602-5be7-47b0-858a-52eed7fcfd09) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.627 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 5c564602-5be7-47b0-858a-52eed7fcfd09 in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 unbound from our chassis
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.628 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d845a33-56e0-4850-9f27-8a54095796f2
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.633 2 DEBUG nova.virt.libvirt.guest [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2d:3c:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5c564602-5b"/></interface>not found in domain: <domain type='kvm' id='24'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <name>instance-00000034</name>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <uuid>02e1c250-b902-42fe-a5cf-af66aa02e2bc</uuid>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:name>tempest-tempest.common.compute-instance-2087694336</nova:name>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:11:08</nova:creationTime>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:port uuid="a2b38e2a-6b92-4e68-ae24-ea094847d75b">
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:port uuid="5c564602-5be7-47b0-858a-52eed7fcfd09">
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:11:09 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <memory unit='KiB'>131072</memory>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <resource>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <partition>/machine</partition>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </resource>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <sysinfo type='smbios'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <entry name='serial'>02e1c250-b902-42fe-a5cf-af66aa02e2bc</entry>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <entry name='uuid'>02e1c250-b902-42fe-a5cf-af66aa02e2bc</entry>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <boot dev='hd'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <smbios mode='sysinfo'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <vmcoreinfo state='on'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <feature policy='require' name='x2apic'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <feature policy='require' name='vme'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <clock offset='utc'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <timer name='hpet' present='no'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <on_reboot>restart</on_reboot>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <on_crash>destroy</on_crash>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <disk type='file' device='disk'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk' index='2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <backingStore type='file' index='3'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:        <format type='raw'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:        <backingStore/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      </backingStore>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target dev='vda' bus='virtio'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='virtio-disk0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <disk type='file' device='cdrom'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.config' index='1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <backingStore/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target dev='sda' bus='sata'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <readonly/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='sata0-0-0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pcie.0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='1' port='0x10'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='2' port='0x11'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='3' port='0x12'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.3'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='4' port='0x13'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.4'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='5' port='0x14'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.5'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='6' port='0x15'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.6'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='7' port='0x16'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.7'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='8' port='0x17'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.8'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='9' port='0x18'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.9'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='10' port='0x19'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.10'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='11' port='0x1a'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.11'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='12' port='0x1b'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.12'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='13' port='0x1c'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.13'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='14' port='0x1d'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.14'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='15' port='0x1e'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.15'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='16' port='0x1f'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.16'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='17' port='0x20'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.17'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='18' port='0x21'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.18'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='19' port='0x22'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.19'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='20' port='0x23'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.20'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='21' port='0x24'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.21'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='22' port='0x25'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.22'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='23' port='0x26'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.23'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='24' port='0x27'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.24'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target chassis='25' port='0x28'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.25'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model name='pcie-pci-bridge'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='pci.26'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='usb'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <controller type='sata' index='0'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='ide'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:e9:21:3e'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target dev='tapa2b38e2a-6b'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='net0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <serial type='pty'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <source path='/dev/pts/0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/console.log' append='off'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target type='isa-serial' port='0'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:        <model name='isa-serial'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      </target>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <console type='pty' tty='/dev/pts/0'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <source path='/dev/pts/0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/console.log' append='off'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <target type='serial' port='0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </console>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <input type='tablet' bus='usb'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='input0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <input type='mouse' bus='ps2'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='input1'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <input type='keyboard' bus='ps2'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='input2'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <listen type='address' address='::0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <audio id='1' type='none'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='video0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <watchdog model='itco' action='reset'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='watchdog0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </watchdog>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <memballoon model='virtio'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <stats period='10'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='balloon0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <rng model='virtio'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <alias name='rng0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <label>system_u:system_r:svirt_t:s0:c197,c381</label>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c197,c381</imagelabel>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <label>+107:+107</label>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:11:09 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:11:09 np0005466012 nova_compute[192063]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.634 2 INFO nova.virt.libvirt.driver [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully detached device tap5c564602-5b from instance 02e1c250-b902-42fe-a5cf-af66aa02e2bc from the live domain config.
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.634 2 DEBUG nova.virt.libvirt.vif [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2087694336',display_name='tempest-tempest.common.compute-instance-2087694336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2087694336',id=52,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:10:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-lc6yv2ji',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:10:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=02e1c250-b902-42fe-a5cf-af66aa02e2bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.635 2 DEBUG nova.network.os_vif_util [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.635 2 DEBUG nova.network.os_vif_util [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.635 2 DEBUG os_vif [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.637 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c564602-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.642 2 INFO os_vif [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:3c:22,bridge_name='br-int',has_traffic_filtering=True,id=5c564602-5be7-47b0-858a-52eed7fcfd09,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5c564602-5b')#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.643 2 DEBUG nova.virt.libvirt.guest [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:name>tempest-tempest.common.compute-instance-2087694336</nova:name>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:11:09</nova:creationTime>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:user uuid="fbc7616089cb4f78832692487019c83d">tempest-AttachInterfacesTestJSON-812274278-project-member</nova:user>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:project uuid="ef4e3be787374d90a6a236c7f76bd940">tempest-AttachInterfacesTestJSON-812274278</nova:project>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    <nova:port uuid="a2b38e2a-6b92-4e68-ae24-ea094847d75b">
Oct  2 08:11:09 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:11:09 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:11:09 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:11:09 np0005466012 nova_compute[192063]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.646 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f038f92d-68a6-49ca-bde7-324676191e7f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.680 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[07ec589f-9c81-411f-9728-d53dd868ef03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.683 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[f852b3b7-e10b-4347-9eff-70864beecd10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.708 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[867d10f9-64f6-4684-af2d-cbc1ab922d0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.726 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d6e1b4-4e5f-41e0-92cb-30bfda4199fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d845a33-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:90:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501832, 'reachable_time': 23587, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227427, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.745 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[951e4ac8-4397-4110-989b-a185cc28bdda]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501843, 'tstamp': 501843}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227428, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7d845a33-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501847, 'tstamp': 501847}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227428, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.747 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.785 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d845a33-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.786 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.786 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d845a33-50, col_values=(('external_ids', {'iface-id': '1c321c19-d630-4a6f-8ba8-7bac90af9bae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:09.786 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.816 2 DEBUG nova.compute.manager [req-faeb39c6-c5fa-4c14-b18e-855a1b1eee50 req-f8f57e7d-2e20-4166-8abd-d95db53b1934 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.816 2 DEBUG oslo_concurrency.lockutils [req-faeb39c6-c5fa-4c14-b18e-855a1b1eee50 req-f8f57e7d-2e20-4166-8abd-d95db53b1934 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.817 2 DEBUG oslo_concurrency.lockutils [req-faeb39c6-c5fa-4c14-b18e-855a1b1eee50 req-f8f57e7d-2e20-4166-8abd-d95db53b1934 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.817 2 DEBUG oslo_concurrency.lockutils [req-faeb39c6-c5fa-4c14-b18e-855a1b1eee50 req-f8f57e7d-2e20-4166-8abd-d95db53b1934 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.818 2 DEBUG nova.compute.manager [req-faeb39c6-c5fa-4c14-b18e-855a1b1eee50 req-f8f57e7d-2e20-4166-8abd-d95db53b1934 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] No waiting events found dispatching network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.818 2 WARNING nova.compute.manager [req-faeb39c6-c5fa-4c14-b18e-855a1b1eee50 req-f8f57e7d-2e20-4166-8abd-d95db53b1934 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received unexpected event network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.818 2 DEBUG nova.compute.manager [req-faeb39c6-c5fa-4c14-b18e-855a1b1eee50 req-f8f57e7d-2e20-4166-8abd-d95db53b1934 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.818 2 DEBUG oslo_concurrency.lockutils [req-faeb39c6-c5fa-4c14-b18e-855a1b1eee50 req-f8f57e7d-2e20-4166-8abd-d95db53b1934 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.818 2 DEBUG oslo_concurrency.lockutils [req-faeb39c6-c5fa-4c14-b18e-855a1b1eee50 req-f8f57e7d-2e20-4166-8abd-d95db53b1934 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.819 2 DEBUG oslo_concurrency.lockutils [req-faeb39c6-c5fa-4c14-b18e-855a1b1eee50 req-f8f57e7d-2e20-4166-8abd-d95db53b1934 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.819 2 DEBUG nova.compute.manager [req-faeb39c6-c5fa-4c14-b18e-855a1b1eee50 req-f8f57e7d-2e20-4166-8abd-d95db53b1934 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] No waiting events found dispatching network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:09 np0005466012 nova_compute[192063]: 2025-10-02 12:11:09.819 2 WARNING nova.compute.manager [req-faeb39c6-c5fa-4c14-b18e-855a1b1eee50 req-f8f57e7d-2e20-4166-8abd-d95db53b1934 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received unexpected event network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:11:10 np0005466012 nova_compute[192063]: 2025-10-02 12:11:10.264 2 DEBUG nova.network.neutron [req-7ca6ba04-3884-4a16-8987-fac23e063a38 req-5b724561-f39d-49fd-84fa-6ab58e6a3f52 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updated VIF entry in instance network info cache for port 5c564602-5be7-47b0-858a-52eed7fcfd09. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:10 np0005466012 nova_compute[192063]: 2025-10-02 12:11:10.264 2 DEBUG nova.network.neutron [req-7ca6ba04-3884-4a16-8987-fac23e063a38 req-5b724561-f39d-49fd-84fa-6ab58e6a3f52 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updating instance_info_cache with network_info: [{"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5c564602-5be7-47b0-858a-52eed7fcfd09", "address": "fa:16:3e:2d:3c:22", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c564602-5b", "ovs_interfaceid": "5c564602-5be7-47b0-858a-52eed7fcfd09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:10 np0005466012 nova_compute[192063]: 2025-10-02 12:11:10.284 2 DEBUG oslo_concurrency.lockutils [req-7ca6ba04-3884-4a16-8987-fac23e063a38 req-5b724561-f39d-49fd-84fa-6ab58e6a3f52 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:10 np0005466012 nova_compute[192063]: 2025-10-02 12:11:10.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:10 np0005466012 nova_compute[192063]: 2025-10-02 12:11:10.435 2 DEBUG nova.compute.manager [req-c3ac9b92-56c0-4c85-822c-d68ab9fa130f req-c386ca91-dcba-4a66-a00c-0e4e377a5ccf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Received event network-changed-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:10 np0005466012 nova_compute[192063]: 2025-10-02 12:11:10.436 2 DEBUG nova.compute.manager [req-c3ac9b92-56c0-4c85-822c-d68ab9fa130f req-c386ca91-dcba-4a66-a00c-0e4e377a5ccf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Refreshing instance network info cache due to event network-changed-7332d0b6-e5f0-41e2-aa18-69453b2d2b21. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:10 np0005466012 nova_compute[192063]: 2025-10-02 12:11:10.436 2 DEBUG oslo_concurrency.lockutils [req-c3ac9b92-56c0-4c85-822c-d68ab9fa130f req-c386ca91-dcba-4a66-a00c-0e4e377a5ccf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-0753ad57-d509-4a98-bba1-e9b29c087474" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:10 np0005466012 nova_compute[192063]: 2025-10-02 12:11:10.436 2 DEBUG oslo_concurrency.lockutils [req-c3ac9b92-56c0-4c85-822c-d68ab9fa130f req-c386ca91-dcba-4a66-a00c-0e4e377a5ccf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-0753ad57-d509-4a98-bba1-e9b29c087474" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:10 np0005466012 nova_compute[192063]: 2025-10-02 12:11:10.436 2 DEBUG nova.network.neutron [req-c3ac9b92-56c0-4c85-822c-d68ab9fa130f req-c386ca91-dcba-4a66-a00c-0e4e377a5ccf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Refreshing network info cache for port 7332d0b6-e5f0-41e2-aa18-69453b2d2b21 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:11 np0005466012 podman[227429]: 2025-10-02 12:11:11.149747817 +0000 UTC m=+0.065934896 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.251 2 DEBUG oslo_concurrency.lockutils [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.252 2 DEBUG oslo_concurrency.lockutils [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquired lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.252 2 DEBUG nova.network.neutron [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.975 2 DEBUG nova.compute.manager [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-vif-unplugged-5c564602-5be7-47b0-858a-52eed7fcfd09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.976 2 DEBUG oslo_concurrency.lockutils [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.976 2 DEBUG oslo_concurrency.lockutils [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.977 2 DEBUG oslo_concurrency.lockutils [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.977 2 DEBUG nova.compute.manager [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] No waiting events found dispatching network-vif-unplugged-5c564602-5be7-47b0-858a-52eed7fcfd09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.977 2 WARNING nova.compute.manager [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received unexpected event network-vif-unplugged-5c564602-5be7-47b0-858a-52eed7fcfd09 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.977 2 DEBUG nova.compute.manager [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.977 2 DEBUG oslo_concurrency.lockutils [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.978 2 DEBUG oslo_concurrency.lockutils [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.978 2 DEBUG oslo_concurrency.lockutils [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.978 2 DEBUG nova.compute.manager [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] No waiting events found dispatching network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:11 np0005466012 nova_compute[192063]: 2025-10-02 12:11:11.978 2 WARNING nova.compute.manager [req-7d87bffb-6351-4980-b327-be6b2b78c601 req-158911f7-b938-41e4-b5d5-65b304a919fa 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received unexpected event network-vif-plugged-5c564602-5be7-47b0-858a-52eed7fcfd09 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:11:12 np0005466012 nova_compute[192063]: 2025-10-02 12:11:12.596 2 DEBUG nova.network.neutron [req-c3ac9b92-56c0-4c85-822c-d68ab9fa130f req-c386ca91-dcba-4a66-a00c-0e4e377a5ccf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Updated VIF entry in instance network info cache for port 7332d0b6-e5f0-41e2-aa18-69453b2d2b21. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:12 np0005466012 nova_compute[192063]: 2025-10-02 12:11:12.597 2 DEBUG nova.network.neutron [req-c3ac9b92-56c0-4c85-822c-d68ab9fa130f req-c386ca91-dcba-4a66-a00c-0e4e377a5ccf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Updating instance_info_cache with network_info: [{"id": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "address": "fa:16:3e:1b:69:e3", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7332d0b6-e5", "ovs_interfaceid": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:12 np0005466012 nova_compute[192063]: 2025-10-02 12:11:12.601 2 INFO nova.network.neutron [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Port 5c564602-5be7-47b0-858a-52eed7fcfd09 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  2 08:11:12 np0005466012 nova_compute[192063]: 2025-10-02 12:11:12.601 2 DEBUG nova.network.neutron [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updating instance_info_cache with network_info: [{"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:12 np0005466012 nova_compute[192063]: 2025-10-02 12:11:12.625 2 DEBUG oslo_concurrency.lockutils [req-c3ac9b92-56c0-4c85-822c-d68ab9fa130f req-c386ca91-dcba-4a66-a00c-0e4e377a5ccf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-0753ad57-d509-4a98-bba1-e9b29c087474" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:12 np0005466012 nova_compute[192063]: 2025-10-02 12:11:12.626 2 DEBUG oslo_concurrency.lockutils [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Releasing lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:12 np0005466012 nova_compute[192063]: 2025-10-02 12:11:12.648 2 DEBUG oslo_concurrency.lockutils [None req-dd9a9f2f-5fc4-4064-b1cd-cf0bed678d80 fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "interface-02e1c250-b902-42fe-a5cf-af66aa02e2bc-5c564602-5be7-47b0-858a-52eed7fcfd09" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:14 np0005466012 nova_compute[192063]: 2025-10-02 12:11:14.210 2 DEBUG nova.compute.manager [req-94ed9c02-e963-4234-80ba-c6e0e4e680cc req-cf61ab29-7a63-4b93-9866-ae2845b6e085 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-changed-a2b38e2a-6b92-4e68-ae24-ea094847d75b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:14 np0005466012 nova_compute[192063]: 2025-10-02 12:11:14.210 2 DEBUG nova.compute.manager [req-94ed9c02-e963-4234-80ba-c6e0e4e680cc req-cf61ab29-7a63-4b93-9866-ae2845b6e085 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Refreshing instance network info cache due to event network-changed-a2b38e2a-6b92-4e68-ae24-ea094847d75b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:14 np0005466012 nova_compute[192063]: 2025-10-02 12:11:14.211 2 DEBUG oslo_concurrency.lockutils [req-94ed9c02-e963-4234-80ba-c6e0e4e680cc req-cf61ab29-7a63-4b93-9866-ae2845b6e085 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:14 np0005466012 nova_compute[192063]: 2025-10-02 12:11:14.211 2 DEBUG oslo_concurrency.lockutils [req-94ed9c02-e963-4234-80ba-c6e0e4e680cc req-cf61ab29-7a63-4b93-9866-ae2845b6e085 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:14 np0005466012 nova_compute[192063]: 2025-10-02 12:11:14.211 2 DEBUG nova.network.neutron [req-94ed9c02-e963-4234-80ba-c6e0e4e680cc req-cf61ab29-7a63-4b93-9866-ae2845b6e085 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Refreshing network info cache for port a2b38e2a-6b92-4e68-ae24-ea094847d75b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:11:14 np0005466012 nova_compute[192063]: 2025-10-02 12:11:14.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:14 np0005466012 nova_compute[192063]: 2025-10-02 12:11:14.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:14 np0005466012 nova_compute[192063]: 2025-10-02 12:11:14.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:15 np0005466012 nova_compute[192063]: 2025-10-02 12:11:15.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:15 np0005466012 nova_compute[192063]: 2025-10-02 12:11:15.323 2 DEBUG nova.network.neutron [req-94ed9c02-e963-4234-80ba-c6e0e4e680cc req-cf61ab29-7a63-4b93-9866-ae2845b6e085 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updated VIF entry in instance network info cache for port a2b38e2a-6b92-4e68-ae24-ea094847d75b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:11:15 np0005466012 nova_compute[192063]: 2025-10-02 12:11:15.323 2 DEBUG nova.network.neutron [req-94ed9c02-e963-4234-80ba-c6e0e4e680cc req-cf61ab29-7a63-4b93-9866-ae2845b6e085 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updating instance_info_cache with network_info: [{"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:15 np0005466012 nova_compute[192063]: 2025-10-02 12:11:15.342 2 DEBUG oslo_concurrency.lockutils [req-94ed9c02-e963-4234-80ba-c6e0e4e680cc req-cf61ab29-7a63-4b93-9866-ae2845b6e085 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:15 np0005466012 nova_compute[192063]: 2025-10-02 12:11:15.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.922 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000038', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '20417475a6a149d5bc47976f4da9a4ae', 'user_id': 'c2b9eab3da414692b3942505e3441920', 'hostId': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.925 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'name': 'tempest-tempest.common.compute-instance-2087694336', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000034', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ef4e3be787374d90a6a236c7f76bd940', 'user_id': 'fbc7616089cb4f78832692487019c83d', 'hostId': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.937 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.937 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.947 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.947 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cfaed3d-ce2c-4c0b-976b-d134dad80614', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-vda', 'timestamp': '2025-10-02T12:11:16.925966', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5dde3c0-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.602325857, 'message_signature': 'c1b1689daed4b45b43110610051f018827700b03c109b13f2a246bf9ffdc1e30'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 
'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-sda', 'timestamp': '2025-10-02T12:11:16.925966', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5ddef96-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.602325857, 'message_signature': 'de3fe3b2c22742036b453b0e8fc0bb6725d26a1a0697b6e50173493dbc4e5212'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-vda', 'timestamp': '2025-10-02T12:11:16.925966', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5df6970-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.614578588, 'message_signature': '47aed505079d03a088cab116ca47c8aa644a59670f707b308d11f4a9f89bc102'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-sda', 'timestamp': '2025-10-02T12:11:16.925966', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5df747e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.614578588, 'message_signature': '1794afc290e9348de1cb353b8ceb013408e8c50abaf09ff70c9dd77c518ba8f8'}]}, 'timestamp': '2025-10-02 12:11:16.948220', '_unique_id': '016c7af91a494674823d7c81391f6a4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.949 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.950 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.952 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 0753ad57-d509-4a98-bba1-e9b29c087474 / tap7332d0b6-e5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.953 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.955 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 02e1c250-b902-42fe-a5cf-af66aa02e2bc / tapa2b38e2a-6b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.955 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/network.outgoing.packets volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09114ff3-66af-486f-a86d-b373ed11a0ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': 'instance-00000038-0753ad57-d509-4a98-bba1-e9b29c087474-tap7332d0b6-e5', 'timestamp': '2025-10-02T12:11:16.950633', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'tap7332d0b6-e5', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:69:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7332d0b6-e5'}, 'message_id': 'e5e04156-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.627023582, 'message_signature': 'a6d833c74f1f98553b2108b859047e64886fd34b0af07b92c1cfbb70cf64bc46'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 27, 
'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000034-02e1c250-b902-42fe-a5cf-af66aa02e2bc-tapa2b38e2a-6b', 'timestamp': '2025-10-02T12:11:16.950633', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'tapa2b38e2a-6b', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:21:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2b38e2a-6b'}, 'message_id': 'e5e08f76-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.629781867, 'message_signature': '985cfea1550e1a1b5de0b175c31f87b51242e16dcddb2a2934d069285e5e81ad'}]}, 'timestamp': '2025-10-02 12:11:16.955463', '_unique_id': '8c6ce8beb527489cbfdc9ddd9e8c8537'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.956 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.957 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.957 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.957 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75083a37-044e-43dd-a67b-09b24c7efdea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': 'instance-00000038-0753ad57-d509-4a98-bba1-e9b29c087474-tap7332d0b6-e5', 'timestamp': '2025-10-02T12:11:16.957191', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'tap7332d0b6-e5', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:69:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7332d0b6-e5'}, 'message_id': 'e5e0dd96-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.627023582, 'message_signature': 'a90486fdc830fab0adcdd3c1be6dff0ca666223ae02571f8d7a8318774741eaa'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000034-02e1c250-b902-42fe-a5cf-af66aa02e2bc-tapa2b38e2a-6b', 'timestamp': '2025-10-02T12:11:16.957191', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'tapa2b38e2a-6b', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:21:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2b38e2a-6b'}, 'message_id': 'e5e0e750-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.629781867, 'message_signature': 'f5d4b7e87c2828f9ed91a036c45ebcb3554d437b4ee3ed4e3dc8d555016ac226'}]}, 'timestamp': '2025-10-02 12:11:16.957725', '_unique_id': '9458574a2cab4a039d64105c3fe9359b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.958 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.959 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.959 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1120877738>, <NovaLikeServer: tempest-tempest.common.compute-instance-2087694336>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1120877738>, <NovaLikeServer: tempest-tempest.common.compute-instance-2087694336>]
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.959 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.977 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.978 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.993 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.write.latency volume: 10842178050 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.994 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3892b2fa-46fa-4d4e-b32e-e64297f2a5e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-vda', 'timestamp': '2025-10-02T12:11:16.959447', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5e4000c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.636018305, 'message_signature': 'e01365157f33c3378c40154d9b450d7019c0f846049f805403bdf2a4db2948a7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 
'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-sda', 'timestamp': '2025-10-02T12:11:16.959447', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5e40aca-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.636018305, 'message_signature': '31cb94749005eff4fe8e87d72bf19f34e94e61be98f9950c0739643c6a0fd798'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10842178050, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-vda', 'timestamp': '2025-10-02T12:11:16.959447', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5e67940-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.654609705, 'message_signature': '7fd1bd4c22cb06d7e29f15b069d11154ec3f5dc1d82a6babfda89aa02a9d6ba8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-sda', 'timestamp': '2025-10-02T12:11:16.959447', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5e6841c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.654609705, 'message_signature': '1172e0ccf7c9cf3c9cd9124ea6c874467a3f248a640849bf3d4cf832d3c7df30'}]}, 'timestamp': '2025-10-02 12:11:16.994483', '_unique_id': 'ab3ea25888174b0fb185110be3319513'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.996 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.996 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.996 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.996 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.write.bytes volume: 72990720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.997 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85932bc0-30b8-4f19-a1f7-45d33a7e8a1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-vda', 'timestamp': '2025-10-02T12:11:16.996447', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5e6daca-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.636018305, 'message_signature': '23b85b659cf30e23940b244e2b432b352468c0f3bd1c3f2ed5caac75f3df6cb7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 
'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-sda', 'timestamp': '2025-10-02T12:11:16.996447', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5e6e57e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.636018305, 'message_signature': '520f42e02c0912f3d0167db89b67c7093046a7b0a3c7e8ff43e6361f891fe0a5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72990720, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-vda', 'timestamp': '2025-10-02T12:11:16.996447', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5e6efba-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.654609705, 'message_signature': 'bac51c18c52b9f2f005c999fd1aaeadff718a45e76fcee94fdf9eea102ee9b2f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-sda', 'timestamp': '2025-10-02T12:11:16.996447', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5e6f8d4-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.654609705, 'message_signature': 'dfd3c3af027da30b336cbfdf000c2b904ea2ac0f6bd59b4c0eefe3e1b5e0a580'}]}, 'timestamp': '2025-10-02 12:11:16.997447', '_unique_id': '52623335bb1a41eba200119dba5faac6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.998 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1120877738>, <NovaLikeServer: tempest-tempest.common.compute-instance-2087694336>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1120877738>, <NovaLikeServer: tempest-tempest.common.compute-instance-2087694336>]
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.999 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.999 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.999 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.999 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.write.requests volume: 315 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:16.999 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4e1eaf7-d399-4e66-aef3-58761589de42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-vda', 'timestamp': '2025-10-02T12:11:16.999095', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5e74258-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.636018305, 'message_signature': '695364b5e3f20808b9f0071fa50b7271b4dc92a26d083083be6e12047f32aeb8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': 
None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-sda', 'timestamp': '2025-10-02T12:11:16.999095', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5e74bcc-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.636018305, 'message_signature': 'eff193bef89aa39e500bcaea90f704df71ce6ada5bb51d5cc10dd2e5d5e7db47'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 315, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-vda', 'timestamp': '2025-10-02T12:11:16.999095', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5e75572-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.654609705, 'message_signature': '9b9771b414f0624e64dd0239729b4200a728d5b1e53e83ec3fc770128951e913'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-sda', 'timestamp': '2025-10-02T12:11:16.999095', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5e75eaa-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.654609705, 'message_signature': '9a2452ed1046e18a1dfdfd50dccc06721afd74c0ada412f45825392dcf6f75b8'}]}, 'timestamp': '2025-10-02 12:11:17.000058', '_unique_id': '7bfeb6ff6a05469c9a1e6b802a54d99c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.000 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.001 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.001 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.read.latency volume: 416471094 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.001 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.read.latency volume: 2973530 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.001 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.read.latency volume: 2014721662 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.read.latency volume: 96641476 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '515a5dc9-c802-4392-800e-01e6446da936', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 416471094, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-vda', 'timestamp': '2025-10-02T12:11:17.001340', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5e79a14-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.636018305, 'message_signature': '0632828cc612468dd5cbdedf85c4bab2d9266394388e6531c8652a609db6b678'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2973530, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': 
None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-sda', 'timestamp': '2025-10-02T12:11:17.001340', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5e7a32e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.636018305, 'message_signature': '5eafbe529b9a87840fe3409563e36f0c7df41b353d87a2db612bb804821bdc7b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2014721662, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-vda', 'timestamp': '2025-10-02T12:11:17.001340', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5e7ac84-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.654609705, 'message_signature': 'd9aa0f5c92defbfd0ec8241b560285df7fb55f324e4516d0dc94d0634db4f7e3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 96641476, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-sda', 'timestamp': '2025-10-02T12:11:17.001340', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5e7b562-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.654609705, 'message_signature': '4aa1d9280df458edc27cb82c8a011deb6ab627bd8b4ad80cfa25862ae2bd8f1a'}]}, 'timestamp': '2025-10-02 12:11:17.002282', '_unique_id': '5ac1a0fc519a4aafba8d153897db0e6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.002 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.003 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.017 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/cpu volume: 10690000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.033 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/cpu volume: 13630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a91bf98f-1c2f-4095-9851-85731e1501eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10690000000, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'timestamp': '2025-10-02T12:11:17.003542', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e5ea1faa-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.693983615, 'message_signature': '1373cd156629bd7ceff1db22dfd5ab6bcb4d301a87b8e47e343b95bc2d1296a6'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13630000000, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 
'02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'timestamp': '2025-10-02T12:11:17.003542', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e5ec8da8-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.709916054, 'message_signature': '7ee6de360cbf1df22afd61d643f7c017fa1829433556151e39061cdf541215d2'}]}, 'timestamp': '2025-10-02 12:11:17.034087', '_unique_id': 'c0e871eba4d44ba08161565e27acd60f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.035 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.036 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.036 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05cd7989-c46f-446f-b593-b0602f3ab28a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': 'instance-00000038-0753ad57-d509-4a98-bba1-e9b29c087474-tap7332d0b6-e5', 'timestamp': '2025-10-02T12:11:17.036003', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'tap7332d0b6-e5', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:69:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7332d0b6-e5'}, 'message_id': 'e5ece44c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.627023582, 'message_signature': 'c927dd3250c8ec60bd32368280649f94ea84b6922d6a4caefebc486b2012b95d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000034-02e1c250-b902-42fe-a5cf-af66aa02e2bc-tapa2b38e2a-6b', 'timestamp': '2025-10-02T12:11:17.036003', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'tapa2b38e2a-6b', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:21:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2b38e2a-6b'}, 'message_id': 'e5ecedf2-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.629781867, 'message_signature': '86af3580ef213412e706214ca3952a6cf0b089be349580be2c7a3554a8042624'}]}, 'timestamp': '2025-10-02 12:11:17.036529', '_unique_id': '78dbc5dab8fb4e42a31883d36bb65605'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.037 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.038 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b227508-d487-424a-8de5-fda41eda78f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': 'instance-00000038-0753ad57-d509-4a98-bba1-e9b29c087474-tap7332d0b6-e5', 'timestamp': '2025-10-02T12:11:17.037941', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'tap7332d0b6-e5', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:69:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7332d0b6-e5'}, 'message_id': 'e5ed2fec-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.627023582, 'message_signature': 'b5b0d7b76377fe122bd820a5106c403e0e490cb7c22baeb4a128b27503620b39'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000034-02e1c250-b902-42fe-a5cf-af66aa02e2bc-tapa2b38e2a-6b', 'timestamp': '2025-10-02T12:11:17.037941', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'tapa2b38e2a-6b', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:21:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2b38e2a-6b'}, 'message_id': 'e5ed3a6e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.629781867, 'message_signature': '565b03c732cdf57cfcb460f787fde0e3d3761a1ea379f6b0d95af81a977b28b2'}]}, 'timestamp': '2025-10-02 12:11:17.038481', '_unique_id': 'be2aa14d0f154a00b86add2a189fbc70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.039 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/network.incoming.bytes volume: 4385 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a413fbd-0a12-4933-9a6e-7fc2325dcacc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': 'instance-00000038-0753ad57-d509-4a98-bba1-e9b29c087474-tap7332d0b6-e5', 'timestamp': '2025-10-02T12:11:17.039772', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'tap7332d0b6-e5', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:69:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7332d0b6-e5'}, 'message_id': 'e5ed775e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.627023582, 'message_signature': 'edfe936882f1e17cb7dfc9f2eef400b68b0628933014a014be70ce1c950350d8'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4385, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000034-02e1c250-b902-42fe-a5cf-af66aa02e2bc-tapa2b38e2a-6b', 'timestamp': '2025-10-02T12:11:17.039772', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'tapa2b38e2a-6b', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:21:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2b38e2a-6b'}, 'message_id': 'e5ed805a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.629781867, 'message_signature': '4c82aa5c9f18ff124d13b735580cc336d1c58a07ea7da1a50bb43da1b95e4c6c'}]}, 'timestamp': '2025-10-02 12:11:17.040255', '_unique_id': '723e75622f044b65b00517c3001cd1ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.040 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.041 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.041 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.041 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1120877738>, <NovaLikeServer: tempest-tempest.common.compute-instance-2087694336>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1120877738>, <NovaLikeServer: tempest-tempest.common.compute-instance-2087694336>]
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.041 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.041 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.041 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1120877738>, <NovaLikeServer: tempest-tempest.common.compute-instance-2087694336>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1120877738>, <NovaLikeServer: tempest-tempest.common.compute-instance-2087694336>]
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.042 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.042 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.042 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/network.outgoing.bytes volume: 3324 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '905b877e-7d24-4f84-babb-b827cac9689b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': 'instance-00000038-0753ad57-d509-4a98-bba1-e9b29c087474-tap7332d0b6-e5', 'timestamp': '2025-10-02T12:11:17.042194', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'tap7332d0b6-e5', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:69:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7332d0b6-e5'}, 'message_id': 'e5edd596-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.627023582, 'message_signature': '1577e00c43b5b10e0f392b50c12d7fd2ef600f8b58fa7ea2d7cc4b00045e276d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3324, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000034-02e1c250-b902-42fe-a5cf-af66aa02e2bc-tapa2b38e2a-6b', 'timestamp': '2025-10-02T12:11:17.042194', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'tapa2b38e2a-6b', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:21:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2b38e2a-6b'}, 'message_id': 'e5eddf1e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.629781867, 'message_signature': '5401f180c90431ccfeb5575485288670c23ff7de91a71e9b10922a242618201b'}]}, 'timestamp': '2025-10-02 12:11:17.042680', '_unique_id': 'c1bcf41e51404a12aedff662fad3617e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.043 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3ac473e-55e0-48b4-82cc-4c163a9a4ca0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': 'instance-00000038-0753ad57-d509-4a98-bba1-e9b29c087474-tap7332d0b6-e5', 'timestamp': '2025-10-02T12:11:17.043922', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'tap7332d0b6-e5', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:69:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7332d0b6-e5'}, 'message_id': 'e5ee18e4-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.627023582, 'message_signature': '04ed240432079e2f4de417d985033d4b4108d68b03120a58453f6b3c0100c4dc'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000034-02e1c250-b902-42fe-a5cf-af66aa02e2bc-tapa2b38e2a-6b', 'timestamp': '2025-10-02T12:11:17.043922', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'tapa2b38e2a-6b', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:21:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2b38e2a-6b'}, 'message_id': 'e5ee2244-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.629781867, 'message_signature': '5fa3638a5940809950a811aecbadd2e761eeb7c691bfa886aa4fd3f3b6c0ada2'}]}, 'timestamp': '2025-10-02 12:11:17.044391', '_unique_id': 'bcb863beeca94cba87bae9568f01e514'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.044 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.045 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.read.requests volume: 770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.045 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.046 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.046 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f7c55b2-8417-4572-a6f8-565c76eac0b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 770, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-vda', 'timestamp': '2025-10-02T12:11:17.045677', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5ee5ef8-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.636018305, 'message_signature': 'ad5655d85eead2fdd5fa882b25266014f41a61edef1ffd967235ecdd18210f1f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': 
None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-sda', 'timestamp': '2025-10-02T12:11:17.045677', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5ee692a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.636018305, 'message_signature': 'a5e34eaf906f547d6d403d63f8929ea2f5e5bc0722de7a8a1ce65ebac47f7702'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-vda', 'timestamp': '2025-10-02T12:11:17.045677', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5ee71cc-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.654609705, 'message_signature': '2b5f318d7462152c1928c1c3312763e50ddce2aba7fa2f989beef229e927e0b4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-sda', 'timestamp': '2025-10-02T12:11:17.045677', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5ee7b40-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.654609705, 'message_signature': '4ae9133b5322ee9d72789db5b232f81258c417fe0a1350021ec5b4001f89718a'}]}, 'timestamp': '2025-10-02 12:11:17.046672', '_unique_id': '225e5225877f4266ada5de0303d3c878'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.047 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.048 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.048 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.048 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '875bec46-bddd-41cf-ad19-736ea4c5bbcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-vda', 'timestamp': '2025-10-02T12:11:17.047970', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5eeb786-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.602325857, 'message_signature': '4fee2f957c40153dd8b0ed94a8de467148a8999365581268f131825b326fe6a4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': 
'0753ad57-d509-4a98-bba1-e9b29c087474-sda', 'timestamp': '2025-10-02T12:11:17.047970', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5eec0b4-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.602325857, 'message_signature': '27f9d369a8a952c9c8ce787aab032c580411c00352abbf6d5cac7aba75fc6103'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-vda', 'timestamp': '2025-10-02T12:11:17.047970', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5eec8d4-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.614578588, 'message_signature': '558309e36c1798108fc950a8f5e62ed2ab87db8393e8edb59d3a6c550b21598e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-sda', 'timestamp': '2025-10-02T12:11:17.047970', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5eed27a-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.614578588, 'message_signature': '9e99874b2354585aa34d89926c64224da90b7a07a00cde1891982a49759b5eef'}]}, 'timestamp': '2025-10-02 12:11:17.048899', '_unique_id': '6e19ad6f306244f8a35d2f8a349cc2e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.049 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.050 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.050 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.050 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19616877-52e1-4787-8c32-93a0d092572b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': 'instance-00000038-0753ad57-d509-4a98-bba1-e9b29c087474-tap7332d0b6-e5', 'timestamp': '2025-10-02T12:11:17.050212', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'tap7332d0b6-e5', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:69:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7332d0b6-e5'}, 'message_id': 'e5ef0ec0-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.627023582, 'message_signature': 'd177f79b72c0bd34cfc5fb075564e92037916bbb2d0dcedf52d0dd12a481a0e6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000034-02e1c250-b902-42fe-a5cf-af66aa02e2bc-tapa2b38e2a-6b', 'timestamp': '2025-10-02T12:11:17.050212', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'tapa2b38e2a-6b', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:21:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2b38e2a-6b'}, 'message_id': 'e5ef1802-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.629781867, 'message_signature': '87af620894d1112627bd48d62ab82501fbf88ac64e826f7d53d14924a5dc0225'}]}, 'timestamp': '2025-10-02 12:11:17.050682', '_unique_id': '59f9dd935ef74e49bf7d34a56f9e49b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.051 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.052 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.052 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.read.bytes volume: 23816192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.052 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.052 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.read.bytes volume: 30304768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.052 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e16b838f-0619-4396-9b2c-fde5c2788130', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23816192, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-vda', 'timestamp': '2025-10-02T12:11:17.052133', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5ef59c0-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.636018305, 'message_signature': '9275ae1a7db647f74ef329f06093d9d5c4951d5dce2baf4e4ff480b7e7e3eefc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 
'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-sda', 'timestamp': '2025-10-02T12:11:17.052133', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5ef633e-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.636018305, 'message_signature': '7527eceba42a57321d259fb3af3a48c2a3dfba4268530e6439a5902620fa5325'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30304768, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-vda', 'timestamp': '2025-10-02T12:11:17.052133', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5ef6c1c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.654609705, 'message_signature': 'b43c734bd4f7e25880e3c704fe790004180e9cdf57fa07fc8ddf0c2d5e9a706c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-sda', 'timestamp': '2025-10-02T12:11:17.052133', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5ef748c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.654609705, 'message_signature': '5cc0b1aac39285cb7d025e005f9756b39813be97bf9fa4aef5085964a671ec76'}]}, 'timestamp': '2025-10-02 12:11:17.053084', '_unique_id': '53805f2a39f54f0d9a46c0304ebd9d90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.053 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.054 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.054 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.054 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/network.incoming.packets volume: 29 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a75e01d-c6da-4eee-a660-eaec1289fde0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': 'instance-00000038-0753ad57-d509-4a98-bba1-e9b29c087474-tap7332d0b6-e5', 'timestamp': '2025-10-02T12:11:17.054371', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'tap7332d0b6-e5', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:69:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7332d0b6-e5'}, 'message_id': 'e5efb104-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.627023582, 'message_signature': 'ac3eeb9d8fde1d8e44640dca87c2c57b602780bd2a781f771a8042770bbe1f4d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 29, 
'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000034-02e1c250-b902-42fe-a5cf-af66aa02e2bc-tapa2b38e2a-6b', 'timestamp': '2025-10-02T12:11:17.054371', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'tapa2b38e2a-6b', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:21:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2b38e2a-6b'}, 'message_id': 'e5efbdde-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.629781867, 'message_signature': 'cb0e1ecd237842dce05c645e5b308dca2e5ad1d8fcd56c8a89bac29bc6cdb0b2'}]}, 'timestamp': '2025-10-02 12:11:17.054957', '_unique_id': '7eea186241db49db8c893d64d445e142'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.055 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.056 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.056 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/memory.usage volume: 40.421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.056 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/memory.usage volume: 42.7890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cd24ca6-41de-4110-befb-889d48fe8eba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.421875, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'timestamp': '2025-10-02T12:11:17.056356', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e5effec0-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.693983615, 'message_signature': '5c1020395f7f023af65ecaaa451065f62b7d222cd5b332516f74eacb19cb8792'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.7890625, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 
'02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'timestamp': '2025-10-02T12:11:17.056356', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e5f00898-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.709916054, 'message_signature': '0283f0c7486d75251d424e95fabd9887c99e4826ee5ba1d0e3f76d7957774996'}]}, 'timestamp': '2025-10-02 12:11:17.056852', '_unique_id': 'aa989cb151694b36b14cdb671f10ed20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.057 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.058 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.058 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.058 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.058 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.058 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f8c38bc-ed9f-4340-ae99-5bcfbf3f5599', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-vda', 'timestamp': '2025-10-02T12:11:17.058110', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5f04358-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.602325857, 'message_signature': '30def5843873db82083384341350cf8468a1cb08eca2bd09d20225edd366d41a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 
'resource_id': '0753ad57-d509-4a98-bba1-e9b29c087474-sda', 'timestamp': '2025-10-02T12:11:17.058110', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'instance-00000038', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5f04c2c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.602325857, 'message_signature': 'bc0c43dd91b1e098a89f074d13fd4727c3d27fd00a626c51ed578e76ff02ee07'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-vda', 'timestamp': '2025-10-02T12:11:17.058110', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e5f054b0-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.614578588, 'message_signature': 'a207c3db798e5c0a2a06e7a5d0b8328e76252801e9fce1ec654877fff6aaff7b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc-sda', 'timestamp': '2025-10-02T12:11:17.058110', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'instance-00000034', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e5f05e4c-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.614578588, 'message_signature': '695039e3133abef6f8e82d73922993cbec3c522fe2a65ce6b6f3833d6e8b0cb4'}]}, 'timestamp': '2025-10-02 12:11:17.059027', '_unique_id': '682df26394634dde898db0516fd49f3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.059 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.061 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.062 12 DEBUG ceilometer.compute.pollsters [-] 0753ad57-d509-4a98-bba1-e9b29c087474/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.062 12 DEBUG ceilometer.compute.pollsters [-] 02e1c250-b902-42fe-a5cf-af66aa02e2bc/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2371395-2cec-4488-989d-1088eb772bf6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c2b9eab3da414692b3942505e3441920', 'user_name': None, 'project_id': '20417475a6a149d5bc47976f4da9a4ae', 'project_name': None, 'resource_id': 'instance-00000038-0753ad57-d509-4a98-bba1-e9b29c087474-tap7332d0b6-e5', 'timestamp': '2025-10-02T12:11:17.062181', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1120877738', 'name': 'tap7332d0b6-e5', 'instance_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'instance_type': 'm1.nano', 'host': '3d0822a6d98a0cadf762bfcd70e7b56c3db875ad3a6c369cfc7c3a5c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:69:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7332d0b6-e5'}, 'message_id': 'e5f0ec86-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.627023582, 'message_signature': 'be1ef1137748cbd0fe6ea5d024fc18e8cd79856885b3c0bfc1aad06d92c86e15'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'fbc7616089cb4f78832692487019c83d', 'user_name': None, 'project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'project_name': None, 'resource_id': 'instance-00000034-02e1c250-b902-42fe-a5cf-af66aa02e2bc-tapa2b38e2a-6b', 'timestamp': '2025-10-02T12:11:17.062181', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-2087694336', 'name': 'tapa2b38e2a-6b', 'instance_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'instance_type': 'm1.nano', 'host': '06d9c0747aa13331a0864756210d1dfede7933e5186586e8c4ecbe35', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:21:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2b38e2a-6b'}, 'message_id': 'e5f0fea6-9f88-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5070.629781867, 'message_signature': 'bbd6b57a60bc2aee341d0813351e31b5c6281cd7612e0bc2d874d968c1e2bc5c'}]}, 'timestamp': '2025-10-02 12:11:17.063194', '_unique_id': '36d11bd3abf148cc84acfe3a1f4f7cc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:11:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:11:17.064 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:11:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:17.415 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:17 np0005466012 nova_compute[192063]: 2025-10-02 12:11:17.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:17.416 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:11:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:17.417 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:17 np0005466012 nova_compute[192063]: 2025-10-02 12:11:17.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:17 np0005466012 nova_compute[192063]: 2025-10-02 12:11:17.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:17 np0005466012 nova_compute[192063]: 2025-10-02 12:11:17.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:17 np0005466012 nova_compute[192063]: 2025-10-02 12:11:17.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:17 np0005466012 nova_compute[192063]: 2025-10-02 12:11:17.861 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:17 np0005466012 nova_compute[192063]: 2025-10-02 12:11:17.861 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:17 np0005466012 nova_compute[192063]: 2025-10-02 12:11:17.861 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:17 np0005466012 nova_compute[192063]: 2025-10-02 12:11:17.862 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:11:17 np0005466012 nova_compute[192063]: 2025-10-02 12:11:17.944 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:17 np0005466012 podman[227469]: 2025-10-02 12:11:17.977935682 +0000 UTC m=+0.068140904 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct  2 08:11:17 np0005466012 podman[227470]: 2025-10-02 12:11:17.977944703 +0000 UTC m=+0.067452617 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, version=9.6, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64)
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.008 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.009 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.066 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.073 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.136 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.137 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.194 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:18 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:18Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1b:69:e3 10.100.0.12
Oct  2 08:11:18 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:18Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1b:69:e3 10.100.0.12
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.361 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.363 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5369MB free_disk=73.37273025512695GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.363 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.363 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.434 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 02e1c250-b902-42fe-a5cf-af66aa02e2bc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.434 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 0753ad57-d509-4a98-bba1-e9b29c087474 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.435 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.435 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.459 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.485 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.486 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.501 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.539 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.615 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.632 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.667 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:11:18 np0005466012 nova_compute[192063]: 2025-10-02 12:11:18.667 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:19 np0005466012 nova_compute[192063]: 2025-10-02 12:11:19.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:20 np0005466012 nova_compute[192063]: 2025-10-02 12:11:20.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:20 np0005466012 nova_compute[192063]: 2025-10-02 12:11:20.666 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:20 np0005466012 nova_compute[192063]: 2025-10-02 12:11:20.666 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:11:21 np0005466012 podman[227519]: 2025-10-02 12:11:21.127649035 +0000 UTC m=+0.050688226 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 08:11:21 np0005466012 podman[227520]: 2025-10-02 12:11:21.138393674 +0000 UTC m=+0.053969843 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:11:24 np0005466012 nova_compute[192063]: 2025-10-02 12:11:24.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:24 np0005466012 nova_compute[192063]: 2025-10-02 12:11:24.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:11:24 np0005466012 nova_compute[192063]: 2025-10-02 12:11:24.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:11:24 np0005466012 nova_compute[192063]: 2025-10-02 12:11:24.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:11:25 np0005466012 nova_compute[192063]: 2025-10-02 12:11:25.097 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:25 np0005466012 nova_compute[192063]: 2025-10-02 12:11:25.098 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:25 np0005466012 nova_compute[192063]: 2025-10-02 12:11:25.098 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:11:25 np0005466012 nova_compute[192063]: 2025-10-02 12:11:25.098 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 02e1c250-b902-42fe-a5cf-af66aa02e2bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:25 np0005466012 nova_compute[192063]: 2025-10-02 12:11:25.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:26 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:26Z|00219|binding|INFO|Releasing lport 1c321c19-d630-4a6f-8ba8-7bac90af9bae from this chassis (sb_readonly=0)
Oct  2 08:11:26 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:26Z|00220|binding|INFO|Releasing lport 1e2d82b4-a363-4c19-94d1-e62c1ba8e34a from this chassis (sb_readonly=0)
Oct  2 08:11:26 np0005466012 nova_compute[192063]: 2025-10-02 12:11:26.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:27 np0005466012 nova_compute[192063]: 2025-10-02 12:11:27.073 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updating instance_info_cache with network_info: [{"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:27 np0005466012 nova_compute[192063]: 2025-10-02 12:11:27.094 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-02e1c250-b902-42fe-a5cf-af66aa02e2bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:11:27 np0005466012 nova_compute[192063]: 2025-10-02 12:11:27.095 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:11:29 np0005466012 nova_compute[192063]: 2025-10-02 12:11:29.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.355 2 DEBUG oslo_concurrency.lockutils [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.355 2 DEBUG oslo_concurrency.lockutils [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.356 2 DEBUG oslo_concurrency.lockutils [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.356 2 DEBUG oslo_concurrency.lockutils [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.356 2 DEBUG oslo_concurrency.lockutils [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.371 2 INFO nova.compute.manager [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Terminating instance#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.383 2 DEBUG nova.compute.manager [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:11:30 np0005466012 kernel: tapa2b38e2a-6b (unregistering): left promiscuous mode
Oct  2 08:11:30 np0005466012 NetworkManager[51207]: <info>  [1759407090.4153] device (tapa2b38e2a-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:11:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:30Z|00221|binding|INFO|Releasing lport a2b38e2a-6b92-4e68-ae24-ea094847d75b from this chassis (sb_readonly=0)
Oct  2 08:11:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:30Z|00222|binding|INFO|Setting lport a2b38e2a-6b92-4e68-ae24-ea094847d75b down in Southbound
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:30Z|00223|binding|INFO|Removing iface tapa2b38e2a-6b ovn-installed in OVS
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:30.431 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:21:3e 10.100.0.5'], port_security=['fa:16:3e:e9:21:3e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '02e1c250-b902-42fe-a5cf-af66aa02e2bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d845a33-56e0-4850-9f27-8a54095796f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef4e3be787374d90a6a236c7f76bd940', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3b3d6cff-45b6-4476-af05-0164bc00fd3b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4583e9be-3cfa-4470-9e2e-4e943d469605, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=a2b38e2a-6b92-4e68-ae24-ea094847d75b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:11:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:30.433 103246 INFO neutron.agent.ovn.metadata.agent [-] Port a2b38e2a-6b92-4e68-ae24-ea094847d75b in datapath 7d845a33-56e0-4850-9f27-8a54095796f2 unbound from our chassis#033[00m
Oct  2 08:11:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:30.434 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d845a33-56e0-4850-9f27-8a54095796f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:11:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:30.435 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[abdba7a3-94bb-43b7-8f9e-90bf26da85ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:30.435 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 namespace which is not needed anymore#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:30 np0005466012 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000034.scope: Deactivated successfully.
Oct  2 08:11:30 np0005466012 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000034.scope: Consumed 16.968s CPU time.
Oct  2 08:11:30 np0005466012 systemd-machined[152114]: Machine qemu-24-instance-00000034 terminated.
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.656 2 INFO nova.virt.libvirt.driver [-] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Instance destroyed successfully.#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.657 2 DEBUG nova.objects.instance [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lazy-loading 'resources' on Instance uuid 02e1c250-b902-42fe-a5cf-af66aa02e2bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.668 2 DEBUG nova.virt.libvirt.vif [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2087694336',display_name='tempest-tempest.common.compute-instance-2087694336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2087694336',id=52,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOYv65bqxpRWLwHd1cwzG/4qFq5fwAENkgbBIDBOqnc0JgdzkSWQfx96bY6oBwDuHqykokwPzxefRuwxgQXggptdfQb5jD77e031VNj4krJvTD/OQ1Uz/d20gy+DMsXFg==',key_name='tempest-keypair-922901776',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:10:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef4e3be787374d90a6a236c7f76bd940',ramdisk_id='',reservation_id='r-lc6yv2ji',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-812274278',owner_user_name='tempest-AttachInterfacesTestJSON-812274278-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:10:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fbc7616089cb4f78832692487019c83d',uuid=02e1c250-b902-42fe-a5cf-af66aa02e2bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.668 2 DEBUG nova.network.os_vif_util [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converting VIF {"id": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "address": "fa:16:3e:e9:21:3e", "network": {"id": "7d845a33-56e0-4850-9f27-8a54095796f2", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-581762823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef4e3be787374d90a6a236c7f76bd940", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2b38e2a-6b", "ovs_interfaceid": "a2b38e2a-6b92-4e68-ae24-ea094847d75b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.669 2 DEBUG nova.network.os_vif_util [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:21:3e,bridge_name='br-int',has_traffic_filtering=True,id=a2b38e2a-6b92-4e68-ae24-ea094847d75b,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2b38e2a-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.669 2 DEBUG os_vif [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:21:3e,bridge_name='br-int',has_traffic_filtering=True,id=a2b38e2a-6b92-4e68-ae24-ea094847d75b,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2b38e2a-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.671 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2b38e2a-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.676 2 INFO os_vif [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:21:3e,bridge_name='br-int',has_traffic_filtering=True,id=a2b38e2a-6b92-4e68-ae24-ea094847d75b,network=Network(7d845a33-56e0-4850-9f27-8a54095796f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2b38e2a-6b')#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.676 2 INFO nova.virt.libvirt.driver [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Deleting instance files /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc_del#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.677 2 INFO nova.virt.libvirt.driver [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Deletion of /var/lib/nova/instances/02e1c250-b902-42fe-a5cf-af66aa02e2bc_del complete#033[00m
Oct  2 08:11:30 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226979]: [NOTICE]   (226983) : haproxy version is 2.8.14-c23fe91
Oct  2 08:11:30 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226979]: [NOTICE]   (226983) : path to executable is /usr/sbin/haproxy
Oct  2 08:11:30 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226979]: [WARNING]  (226983) : Exiting Master process...
Oct  2 08:11:30 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226979]: [WARNING]  (226983) : Exiting Master process...
Oct  2 08:11:30 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226979]: [ALERT]    (226983) : Current worker (226985) exited with code 143 (Terminated)
Oct  2 08:11:30 np0005466012 neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2[226979]: [WARNING]  (226983) : All workers exited. Exiting... (0)
Oct  2 08:11:30 np0005466012 systemd[1]: libpod-e714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f.scope: Deactivated successfully.
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.751 2 INFO nova.compute.manager [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.751 2 DEBUG oslo.service.loopingcall [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.751 2 DEBUG nova.compute.manager [-] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:11:30 np0005466012 nova_compute[192063]: 2025-10-02 12:11:30.751 2 DEBUG nova.network.neutron [-] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:11:30 np0005466012 podman[227588]: 2025-10-02 12:11:30.753504259 +0000 UTC m=+0.234903163 container died e714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:11:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f-userdata-shm.mount: Deactivated successfully.
Oct  2 08:11:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay-f0f52dfe8d7f8aecede4c357c2240aca35bccc3ab5c814290bdc5b8e87df8ebc-merged.mount: Deactivated successfully.
Oct  2 08:11:30 np0005466012 podman[227588]: 2025-10-02 12:11:30.979514523 +0000 UTC m=+0.460913407 container cleanup e714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:11:30 np0005466012 systemd[1]: libpod-conmon-e714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f.scope: Deactivated successfully.
Oct  2 08:11:31 np0005466012 podman[227637]: 2025-10-02 12:11:31.137993299 +0000 UTC m=+0.131925632 container remove e714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:11:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:31.146 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9cf50425-f439-4642-a811-be55f8402034]: (4, ('Thu Oct  2 12:11:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 (e714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f)\ne714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f\nThu Oct  2 12:11:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 (e714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f)\ne714d50044769fe5bd92292831d2e3970405ab794afb7f44ff76d94d8762309f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:31.148 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cd330b96-bf02-42a9-a34d-ba18af2837c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:31.149 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d845a33-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:31 np0005466012 kernel: tap7d845a33-50: left promiscuous mode
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:31.168 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8d3132f0-b744-4c5a-989e-7d745e8fd497]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.184 2 DEBUG nova.compute.manager [req-6d6aeee3-b289-46b2-b1b7-6b5aa992fe3f req-c7fedd37-5edf-4e3c-a859-72b3cd3a4ef6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-vif-unplugged-a2b38e2a-6b92-4e68-ae24-ea094847d75b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.184 2 DEBUG oslo_concurrency.lockutils [req-6d6aeee3-b289-46b2-b1b7-6b5aa992fe3f req-c7fedd37-5edf-4e3c-a859-72b3cd3a4ef6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.184 2 DEBUG oslo_concurrency.lockutils [req-6d6aeee3-b289-46b2-b1b7-6b5aa992fe3f req-c7fedd37-5edf-4e3c-a859-72b3cd3a4ef6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.184 2 DEBUG oslo_concurrency.lockutils [req-6d6aeee3-b289-46b2-b1b7-6b5aa992fe3f req-c7fedd37-5edf-4e3c-a859-72b3cd3a4ef6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.185 2 DEBUG nova.compute.manager [req-6d6aeee3-b289-46b2-b1b7-6b5aa992fe3f req-c7fedd37-5edf-4e3c-a859-72b3cd3a4ef6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] No waiting events found dispatching network-vif-unplugged-a2b38e2a-6b92-4e68-ae24-ea094847d75b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.185 2 DEBUG nova.compute.manager [req-6d6aeee3-b289-46b2-b1b7-6b5aa992fe3f req-c7fedd37-5edf-4e3c-a859-72b3cd3a4ef6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-vif-unplugged-a2b38e2a-6b92-4e68-ae24-ea094847d75b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:11:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:31.196 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[604b1db7-fcb5-4ce8-8e23-e6f4f85ed6a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:31.197 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[59898ca2-d67b-4029-a514-6bb188c5e464]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:31.213 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7cb76d0f-fff4-48a0-8874-10fea632c2db]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501825, 'reachable_time': 34303, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227652, 'error': None, 'target': 'ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:31 np0005466012 systemd[1]: run-netns-ovnmeta\x2d7d845a33\x2d56e0\x2d4850\x2d9f27\x2d8a54095796f2.mount: Deactivated successfully.
Oct  2 08:11:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:31.217 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7d845a33-56e0-4850-9f27-8a54095796f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:11:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:11:31.217 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[ea5fd305-1c2d-40be-9408-0aa7ed33a148]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.676 2 DEBUG nova.network.neutron [-] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.703 2 INFO nova.compute.manager [-] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Took 0.95 seconds to deallocate network for instance.#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.785 2 DEBUG oslo_concurrency.lockutils [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.785 2 DEBUG oslo_concurrency.lockutils [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.880 2 DEBUG nova.compute.provider_tree [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.900 2 DEBUG nova.scheduler.client.report [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.925 2 DEBUG oslo_concurrency.lockutils [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:31 np0005466012 nova_compute[192063]: 2025-10-02 12:11:31.962 2 INFO nova.scheduler.client.report [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Deleted allocations for instance 02e1c250-b902-42fe-a5cf-af66aa02e2bc#033[00m
Oct  2 08:11:32 np0005466012 nova_compute[192063]: 2025-10-02 12:11:32.036 2 DEBUG oslo_concurrency.lockutils [None req-2a87ab53-a74e-4b73-b6fd-ab2cc214165a fbc7616089cb4f78832692487019c83d ef4e3be787374d90a6a236c7f76bd940 - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:33 np0005466012 nova_compute[192063]: 2025-10-02 12:11:33.277 2 DEBUG nova.compute.manager [req-9be1a4eb-c8dc-4f4e-9797-1da2ff18afe0 req-f6bb6e75-f0cb-4a57-8075-c541ea72c59e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-vif-plugged-a2b38e2a-6b92-4e68-ae24-ea094847d75b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:33 np0005466012 nova_compute[192063]: 2025-10-02 12:11:33.277 2 DEBUG oslo_concurrency.lockutils [req-9be1a4eb-c8dc-4f4e-9797-1da2ff18afe0 req-f6bb6e75-f0cb-4a57-8075-c541ea72c59e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:33 np0005466012 nova_compute[192063]: 2025-10-02 12:11:33.277 2 DEBUG oslo_concurrency.lockutils [req-9be1a4eb-c8dc-4f4e-9797-1da2ff18afe0 req-f6bb6e75-f0cb-4a57-8075-c541ea72c59e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:33 np0005466012 nova_compute[192063]: 2025-10-02 12:11:33.278 2 DEBUG oslo_concurrency.lockutils [req-9be1a4eb-c8dc-4f4e-9797-1da2ff18afe0 req-f6bb6e75-f0cb-4a57-8075-c541ea72c59e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "02e1c250-b902-42fe-a5cf-af66aa02e2bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:33 np0005466012 nova_compute[192063]: 2025-10-02 12:11:33.278 2 DEBUG nova.compute.manager [req-9be1a4eb-c8dc-4f4e-9797-1da2ff18afe0 req-f6bb6e75-f0cb-4a57-8075-c541ea72c59e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] No waiting events found dispatching network-vif-plugged-a2b38e2a-6b92-4e68-ae24-ea094847d75b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:11:33 np0005466012 nova_compute[192063]: 2025-10-02 12:11:33.278 2 WARNING nova.compute.manager [req-9be1a4eb-c8dc-4f4e-9797-1da2ff18afe0 req-f6bb6e75-f0cb-4a57-8075-c541ea72c59e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received unexpected event network-vif-plugged-a2b38e2a-6b92-4e68-ae24-ea094847d75b for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:11:33 np0005466012 nova_compute[192063]: 2025-10-02 12:11:33.279 2 DEBUG nova.compute.manager [req-9be1a4eb-c8dc-4f4e-9797-1da2ff18afe0 req-f6bb6e75-f0cb-4a57-8075-c541ea72c59e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Received event network-vif-deleted-a2b38e2a-6b92-4e68-ae24-ea094847d75b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:34 np0005466012 nova_compute[192063]: 2025-10-02 12:11:34.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:35 np0005466012 podman[227653]: 2025-10-02 12:11:35.137747161 +0000 UTC m=+0.049380830 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:11:35 np0005466012 podman[227654]: 2025-10-02 12:11:35.16258347 +0000 UTC m=+0.074217599 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:11:35 np0005466012 nova_compute[192063]: 2025-10-02 12:11:35.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:35 np0005466012 nova_compute[192063]: 2025-10-02 12:11:35.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:37 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:37Z|00224|binding|INFO|Releasing lport 1e2d82b4-a363-4c19-94d1-e62c1ba8e34a from this chassis (sb_readonly=0)
Oct  2 08:11:37 np0005466012 nova_compute[192063]: 2025-10-02 12:11:37.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:37 np0005466012 nova_compute[192063]: 2025-10-02 12:11:37.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:38 np0005466012 podman[227700]: 2025-10-02 12:11:38.129629275 +0000 UTC m=+0.051172619 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:11:40 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:40Z|00225|binding|INFO|Releasing lport 1e2d82b4-a363-4c19-94d1-e62c1ba8e34a from this chassis (sb_readonly=0)
Oct  2 08:11:40 np0005466012 nova_compute[192063]: 2025-10-02 12:11:40.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:40 np0005466012 nova_compute[192063]: 2025-10-02 12:11:40.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:40 np0005466012 nova_compute[192063]: 2025-10-02 12:11:40.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:42 np0005466012 podman[227719]: 2025-10-02 12:11:42.152658678 +0000 UTC m=+0.070399992 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:11:43 np0005466012 nova_compute[192063]: 2025-10-02 12:11:43.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:45 np0005466012 nova_compute[192063]: 2025-10-02 12:11:45.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:45 np0005466012 nova_compute[192063]: 2025-10-02 12:11:45.654 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407090.653114, 02e1c250-b902-42fe-a5cf-af66aa02e2bc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:11:45 np0005466012 nova_compute[192063]: 2025-10-02 12:11:45.654 2 INFO nova.compute.manager [-] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:11:45 np0005466012 nova_compute[192063]: 2025-10-02 12:11:45.675 2 DEBUG nova.compute.manager [None req-3a092a41-7efa-42db-9d7e-4543bc6baae8 - - - - - -] [instance: 02e1c250-b902-42fe-a5cf-af66aa02e2bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:11:45 np0005466012 nova_compute[192063]: 2025-10-02 12:11:45.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:46 np0005466012 nova_compute[192063]: 2025-10-02 12:11:46.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:46 np0005466012 nova_compute[192063]: 2025-10-02 12:11:46.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:48 np0005466012 podman[227740]: 2025-10-02 12:11:48.137739163 +0000 UTC m=+0.056230912 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:11:48 np0005466012 podman[227741]: 2025-10-02 12:11:48.14688732 +0000 UTC m=+0.063622766 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Oct  2 08:11:49 np0005466012 ovn_controller[94284]: 2025-10-02T12:11:49Z|00226|binding|INFO|Releasing lport 1e2d82b4-a363-4c19-94d1-e62c1ba8e34a from this chassis (sb_readonly=0)
Oct  2 08:11:49 np0005466012 nova_compute[192063]: 2025-10-02 12:11:49.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:50 np0005466012 nova_compute[192063]: 2025-10-02 12:11:50.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:50 np0005466012 nova_compute[192063]: 2025-10-02 12:11:50.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:50 np0005466012 nova_compute[192063]: 2025-10-02 12:11:50.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:52 np0005466012 podman[227779]: 2025-10-02 12:11:52.156902848 +0000 UTC m=+0.053644484 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:11:52 np0005466012 podman[227778]: 2025-10-02 12:11:52.158366722 +0000 UTC m=+0.064320867 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:11:55 np0005466012 nova_compute[192063]: 2025-10-02 12:11:55.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:55 np0005466012 nova_compute[192063]: 2025-10-02 12:11:55.762 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:55 np0005466012 nova_compute[192063]: 2025-10-02 12:11:55.763 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:55 np0005466012 nova_compute[192063]: 2025-10-02 12:11:55.776 2 DEBUG nova.compute.manager [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:11:55 np0005466012 nova_compute[192063]: 2025-10-02 12:11:55.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:11:55 np0005466012 nova_compute[192063]: 2025-10-02 12:11:55.882 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:55 np0005466012 nova_compute[192063]: 2025-10-02 12:11:55.883 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:55 np0005466012 nova_compute[192063]: 2025-10-02 12:11:55.888 2 DEBUG nova.virt.hardware [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:11:55 np0005466012 nova_compute[192063]: 2025-10-02 12:11:55.889 2 INFO nova.compute.claims [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.025 2 DEBUG nova.compute.provider_tree [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.039 2 DEBUG nova.scheduler.client.report [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.059 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.060 2 DEBUG nova.compute.manager [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.128 2 DEBUG nova.compute.manager [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.128 2 DEBUG nova.network.neutron [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.155 2 INFO nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.175 2 DEBUG nova.compute.manager [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.428 2 DEBUG nova.compute.manager [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.429 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.430 2 INFO nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Creating image(s)#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.430 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "/var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.431 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "/var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.431 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "/var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.449 2 DEBUG oslo_concurrency.processutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.507 2 DEBUG oslo_concurrency.processutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.508 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.509 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.524 2 DEBUG oslo_concurrency.processutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.587 2 DEBUG oslo_concurrency.processutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.589 2 DEBUG oslo_concurrency.processutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.648 2 DEBUG oslo_concurrency.processutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.649 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.650 2 DEBUG oslo_concurrency.processutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.717 2 DEBUG oslo_concurrency.processutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.718 2 DEBUG nova.virt.disk.api [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Checking if we can resize image /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.719 2 DEBUG oslo_concurrency.processutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.786 2 DEBUG oslo_concurrency.processutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.788 2 DEBUG nova.virt.disk.api [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Cannot resize image /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.788 2 DEBUG nova.objects.instance [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lazy-loading 'migration_context' on Instance uuid 07cc67f4-afea-47a6-95cb-909d3ce6eb63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.811 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.812 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Ensure instance console log exists: /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.813 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.814 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.814 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:11:56 np0005466012 nova_compute[192063]: 2025-10-02 12:11:56.886 2 DEBUG nova.policy [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dcdfc3c0f94e42cb931d27f2e3b5b12d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dcf78460093d411988a54040ea4c265a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:11:58 np0005466012 nova_compute[192063]: 2025-10-02 12:11:58.583 2 DEBUG nova.network.neutron [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Successfully created port: fdd42839-b63e-4adb-b391-33f1c697beb4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:11:59 np0005466012 nova_compute[192063]: 2025-10-02 12:11:59.862 2 DEBUG nova.network.neutron [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Successfully updated port: fdd42839-b63e-4adb-b391-33f1c697beb4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:11:59 np0005466012 nova_compute[192063]: 2025-10-02 12:11:59.881 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "refresh_cache-07cc67f4-afea-47a6-95cb-909d3ce6eb63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:11:59 np0005466012 nova_compute[192063]: 2025-10-02 12:11:59.882 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquired lock "refresh_cache-07cc67f4-afea-47a6-95cb-909d3ce6eb63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:11:59 np0005466012 nova_compute[192063]: 2025-10-02 12:11:59.882 2 DEBUG nova.network.neutron [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:11:59 np0005466012 nova_compute[192063]: 2025-10-02 12:11:59.954 2 DEBUG nova.compute.manager [req-478ee663-3d3e-4c53-923f-56dd71245ca9 req-7b283f28-3de8-49d1-ba09-f37d270c2f75 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Received event network-changed-fdd42839-b63e-4adb-b391-33f1c697beb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:11:59 np0005466012 nova_compute[192063]: 2025-10-02 12:11:59.955 2 DEBUG nova.compute.manager [req-478ee663-3d3e-4c53-923f-56dd71245ca9 req-7b283f28-3de8-49d1-ba09-f37d270c2f75 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Refreshing instance network info cache due to event network-changed-fdd42839-b63e-4adb-b391-33f1c697beb4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:11:59 np0005466012 nova_compute[192063]: 2025-10-02 12:11:59.955 2 DEBUG oslo_concurrency.lockutils [req-478ee663-3d3e-4c53-923f-56dd71245ca9 req-7b283f28-3de8-49d1-ba09-f37d270c2f75 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-07cc67f4-afea-47a6-95cb-909d3ce6eb63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:12:00 np0005466012 nova_compute[192063]: 2025-10-02 12:12:00.011 2 DEBUG nova.network.neutron [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:12:00 np0005466012 nova_compute[192063]: 2025-10-02 12:12:00.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:00 np0005466012 nova_compute[192063]: 2025-10-02 12:12:00.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.668 2 DEBUG nova.network.neutron [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Updating instance_info_cache with network_info: [{"id": "fdd42839-b63e-4adb-b391-33f1c697beb4", "address": "fa:16:3e:29:f6:14", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdd42839-b6", "ovs_interfaceid": "fdd42839-b63e-4adb-b391-33f1c697beb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.685 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Releasing lock "refresh_cache-07cc67f4-afea-47a6-95cb-909d3ce6eb63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.685 2 DEBUG nova.compute.manager [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Instance network_info: |[{"id": "fdd42839-b63e-4adb-b391-33f1c697beb4", "address": "fa:16:3e:29:f6:14", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdd42839-b6", "ovs_interfaceid": "fdd42839-b63e-4adb-b391-33f1c697beb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.685 2 DEBUG oslo_concurrency.lockutils [req-478ee663-3d3e-4c53-923f-56dd71245ca9 req-7b283f28-3de8-49d1-ba09-f37d270c2f75 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-07cc67f4-afea-47a6-95cb-909d3ce6eb63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.686 2 DEBUG nova.network.neutron [req-478ee663-3d3e-4c53-923f-56dd71245ca9 req-7b283f28-3de8-49d1-ba09-f37d270c2f75 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Refreshing network info cache for port fdd42839-b63e-4adb-b391-33f1c697beb4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.688 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Start _get_guest_xml network_info=[{"id": "fdd42839-b63e-4adb-b391-33f1c697beb4", "address": "fa:16:3e:29:f6:14", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdd42839-b6", "ovs_interfaceid": "fdd42839-b63e-4adb-b391-33f1c697beb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.692 2 WARNING nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.697 2 DEBUG nova.virt.libvirt.host [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.698 2 DEBUG nova.virt.libvirt.host [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.702 2 DEBUG nova.virt.libvirt.host [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.703 2 DEBUG nova.virt.libvirt.host [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.704 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.704 2 DEBUG nova.virt.hardware [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.705 2 DEBUG nova.virt.hardware [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.705 2 DEBUG nova.virt.hardware [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.705 2 DEBUG nova.virt.hardware [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.705 2 DEBUG nova.virt.hardware [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.705 2 DEBUG nova.virt.hardware [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.706 2 DEBUG nova.virt.hardware [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.706 2 DEBUG nova.virt.hardware [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.706 2 DEBUG nova.virt.hardware [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.706 2 DEBUG nova.virt.hardware [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.707 2 DEBUG nova.virt.hardware [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.709 2 DEBUG nova.virt.libvirt.vif [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:11:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1966454935',display_name='tempest-ImagesTestJSON-server-1966454935',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1966454935',id=59,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dcf78460093d411988a54040ea4c265a',ramdisk_id='',reservation_id='r-yz8ex65z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-437970487',owner_user_name='tempest-ImagesTestJSON-437970487-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:11:56Z,user_data=None,user_id='dcdfc3c0f94e42cb931d27f2e3b5b12d',uuid=07cc67f4-afea-47a6-95cb-909d3ce6eb63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fdd42839-b63e-4adb-b391-33f1c697beb4", "address": "fa:16:3e:29:f6:14", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdd42839-b6", "ovs_interfaceid": "fdd42839-b63e-4adb-b391-33f1c697beb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.710 2 DEBUG nova.network.os_vif_util [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Converting VIF {"id": "fdd42839-b63e-4adb-b391-33f1c697beb4", "address": "fa:16:3e:29:f6:14", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdd42839-b6", "ovs_interfaceid": "fdd42839-b63e-4adb-b391-33f1c697beb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.710 2 DEBUG nova.network.os_vif_util [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:f6:14,bridge_name='br-int',has_traffic_filtering=True,id=fdd42839-b63e-4adb-b391-33f1c697beb4,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdd42839-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.711 2 DEBUG nova.objects.instance [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lazy-loading 'pci_devices' on Instance uuid 07cc67f4-afea-47a6-95cb-909d3ce6eb63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.723 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  <uuid>07cc67f4-afea-47a6-95cb-909d3ce6eb63</uuid>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  <name>instance-0000003b</name>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <nova:name>tempest-ImagesTestJSON-server-1966454935</nova:name>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:12:01</nova:creationTime>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:12:01 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:        <nova:user uuid="dcdfc3c0f94e42cb931d27f2e3b5b12d">tempest-ImagesTestJSON-437970487-project-member</nova:user>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:        <nova:project uuid="dcf78460093d411988a54040ea4c265a">tempest-ImagesTestJSON-437970487</nova:project>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:        <nova:port uuid="fdd42839-b63e-4adb-b391-33f1c697beb4">
Oct  2 08:12:01 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <entry name="serial">07cc67f4-afea-47a6-95cb-909d3ce6eb63</entry>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <entry name="uuid">07cc67f4-afea-47a6-95cb-909d3ce6eb63</entry>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk.config"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:29:f6:14"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <target dev="tapfdd42839-b6"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/console.log" append="off"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:12:01 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:12:01 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:12:01 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:12:01 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.724 2 DEBUG nova.compute.manager [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Preparing to wait for external event network-vif-plugged-fdd42839-b63e-4adb-b391-33f1c697beb4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.725 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.725 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.725 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.725 2 DEBUG nova.virt.libvirt.vif [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:11:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1966454935',display_name='tempest-ImagesTestJSON-server-1966454935',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1966454935',id=59,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dcf78460093d411988a54040ea4c265a',ramdisk_id='',reservation_id='r-yz8ex65z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-437970487',owner_user_name='tempest-ImagesTestJSON-437970487-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:11:56Z,user_data=None,user_id='dcdfc3c0f94e42cb931d27f2e3b5b12d',uuid=07cc67f4-afea-47a6-95cb-909d3ce6eb63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fdd42839-b63e-4adb-b391-33f1c697beb4", "address": "fa:16:3e:29:f6:14", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdd42839-b6", "ovs_interfaceid": "fdd42839-b63e-4adb-b391-33f1c697beb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.726 2 DEBUG nova.network.os_vif_util [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Converting VIF {"id": "fdd42839-b63e-4adb-b391-33f1c697beb4", "address": "fa:16:3e:29:f6:14", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdd42839-b6", "ovs_interfaceid": "fdd42839-b63e-4adb-b391-33f1c697beb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.726 2 DEBUG nova.network.os_vif_util [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:f6:14,bridge_name='br-int',has_traffic_filtering=True,id=fdd42839-b63e-4adb-b391-33f1c697beb4,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdd42839-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.726 2 DEBUG os_vif [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:f6:14,bridge_name='br-int',has_traffic_filtering=True,id=fdd42839-b63e-4adb-b391-33f1c697beb4,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdd42839-b6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.727 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.727 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.729 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfdd42839-b6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.730 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfdd42839-b6, col_values=(('external_ids', {'iface-id': 'fdd42839-b63e-4adb-b391-33f1c697beb4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:29:f6:14', 'vm-uuid': '07cc67f4-afea-47a6-95cb-909d3ce6eb63'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:01 np0005466012 NetworkManager[51207]: <info>  [1759407121.7318] manager: (tapfdd42839-b6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.740 2 INFO os_vif [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:f6:14,bridge_name='br-int',has_traffic_filtering=True,id=fdd42839-b63e-4adb-b391-33f1c697beb4,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdd42839-b6')#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.800 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.800 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.800 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] No VIF found with MAC fa:16:3e:29:f6:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:12:01 np0005466012 nova_compute[192063]: 2025-10-02 12:12:01.800 2 INFO nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Using config drive#033[00m
Oct  2 08:12:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:02.122 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:02.123 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:02.123 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:02 np0005466012 nova_compute[192063]: 2025-10-02 12:12:02.851 2 INFO nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Creating config drive at /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk.config#033[00m
Oct  2 08:12:02 np0005466012 nova_compute[192063]: 2025-10-02 12:12:02.859 2 DEBUG oslo_concurrency.processutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx4uj9hte execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:02 np0005466012 nova_compute[192063]: 2025-10-02 12:12:02.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.002 2 DEBUG oslo_concurrency.processutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx4uj9hte" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:03 np0005466012 kernel: tapfdd42839-b6: entered promiscuous mode
Oct  2 08:12:03 np0005466012 NetworkManager[51207]: <info>  [1759407123.0767] manager: (tapfdd42839-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Oct  2 08:12:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:03Z|00227|binding|INFO|Claiming lport fdd42839-b63e-4adb-b391-33f1c697beb4 for this chassis.
Oct  2 08:12:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:03Z|00228|binding|INFO|fdd42839-b63e-4adb-b391-33f1c697beb4: Claiming fa:16:3e:29:f6:14 10.100.0.12
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:03 np0005466012 systemd-udevd[227857]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:12:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:03Z|00229|binding|INFO|Setting lport fdd42839-b63e-4adb-b391-33f1c697beb4 ovn-installed in OVS
Oct  2 08:12:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:03Z|00230|binding|INFO|Setting lport fdd42839-b63e-4adb-b391-33f1c697beb4 up in Southbound
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.110 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:f6:14 10.100.0.12'], port_security=['fa:16:3e:29:f6:14 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '07cc67f4-afea-47a6-95cb-909d3ce6eb63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dcf78460093d411988a54040ea4c265a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'aacce687-8b76-4e90-b19c-0dd006394188', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24ae9888-31f5-4083-b5ee-e7ed6a1eee13, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=fdd42839-b63e-4adb-b391-33f1c697beb4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.113 103246 INFO neutron.agent.ovn.metadata.agent [-] Port fdd42839-b63e-4adb-b391-33f1c697beb4 in datapath 4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 bound to our chassis#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.115 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4f195445-fd43-4b92-89dd-a1b2fe9ea8c2#033[00m
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:03 np0005466012 NetworkManager[51207]: <info>  [1759407123.1248] device (tapfdd42839-b6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:12:03 np0005466012 NetworkManager[51207]: <info>  [1759407123.1260] device (tapfdd42839-b6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:12:03 np0005466012 systemd-machined[152114]: New machine qemu-26-instance-0000003b.
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.130 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[413e14bf-5594-49eb-b85b-5a8451b92727]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.131 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4f195445-f1 in ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.133 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4f195445-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.133 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a428edb4-1751-432b-b5f4-6548efb6a203]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.134 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8d811acc-2db8-4f5a-94b4-0bc253f7250d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 systemd[1]: Started Virtual Machine qemu-26-instance-0000003b.
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.146 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[a4a9ce6a-662c-47e8-8ae0-aa7f41f6f706]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.172 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8466c0c3-4a30-47e3-9571-17ac8f6bf9c7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.206 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb35838-cae1-4ffa-a1f2-f3dd68cf41d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 NetworkManager[51207]: <info>  [1759407123.2134] manager: (tap4f195445-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/105)
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.214 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[374457bb-86e1-4164-bb40-27285c3f1206]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 systemd-udevd[227861]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.250 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[de31e202-461f-425f-bf6b-275e3717e4b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.253 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[7d49cea5-4a1b-445c-a75d-bf25544ff1ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 NetworkManager[51207]: <info>  [1759407123.2808] device (tap4f195445-f0): carrier: link connected
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.285 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[51e1ff95-0571-4541-88cb-75f3a4410498]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.305 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[16f1a9b3-290a-449c-ad1d-71707668d9e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f195445-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:93:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511689, 'reachable_time': 23314, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227891, 'error': None, 'target': 'ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.320 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4826fb95-0008-419b-8662-9e20c36b95d8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:9303'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511689, 'tstamp': 511689}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227892, 'error': None, 'target': 'ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.339 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aefb0298-c481-477b-aa59-e29354274687]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f195445-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:93:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511689, 'reachable_time': 23314, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227893, 'error': None, 'target': 'ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.354 2 DEBUG nova.compute.manager [req-efc6dd92-9038-4772-be0a-a8692490d772 req-63169a5a-4792-47ff-ab70-e5130b198777 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Received event network-vif-plugged-fdd42839-b63e-4adb-b391-33f1c697beb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.354 2 DEBUG oslo_concurrency.lockutils [req-efc6dd92-9038-4772-be0a-a8692490d772 req-63169a5a-4792-47ff-ab70-e5130b198777 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.355 2 DEBUG oslo_concurrency.lockutils [req-efc6dd92-9038-4772-be0a-a8692490d772 req-63169a5a-4792-47ff-ab70-e5130b198777 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.355 2 DEBUG oslo_concurrency.lockutils [req-efc6dd92-9038-4772-be0a-a8692490d772 req-63169a5a-4792-47ff-ab70-e5130b198777 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.355 2 DEBUG nova.compute.manager [req-efc6dd92-9038-4772-be0a-a8692490d772 req-63169a5a-4792-47ff-ab70-e5130b198777 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Processing event network-vif-plugged-fdd42839-b63e-4adb-b391-33f1c697beb4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.368 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[818e7faa-b5d2-4e0a-8723-cc19231f176b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.429 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1cf3ca82-fb8c-4b4a-92b8-1b0684f2684b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.431 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f195445-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.431 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.432 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f195445-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:03 np0005466012 NetworkManager[51207]: <info>  [1759407123.4345] manager: (tap4f195445-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Oct  2 08:12:03 np0005466012 kernel: tap4f195445-f0: entered promiscuous mode
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.437 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4f195445-f0, col_values=(('external_ids', {'iface-id': 'd65a1bd0-87e2-4bbf-9945-dacace78444f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:03Z|00231|binding|INFO|Releasing lport d65a1bd0-87e2-4bbf-9945-dacace78444f from this chassis (sb_readonly=0)
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.439 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4f195445-fd43-4b92-89dd-a1b2fe9ea8c2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4f195445-fd43-4b92-89dd-a1b2fe9ea8c2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.440 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6a4b0132-1a4a-4358-98d5-655599b8e36c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.441 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/4f195445-fd43-4b92-89dd-a1b2fe9ea8c2.pid.haproxy
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 4f195445-fd43-4b92-89dd-a1b2fe9ea8c2
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:12:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:03.441 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'env', 'PROCESS_TAG=haproxy-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4f195445-fd43-4b92-89dd-a1b2fe9ea8c2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.732 2 DEBUG nova.network.neutron [req-478ee663-3d3e-4c53-923f-56dd71245ca9 req-7b283f28-3de8-49d1-ba09-f37d270c2f75 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Updated VIF entry in instance network info cache for port fdd42839-b63e-4adb-b391-33f1c697beb4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.732 2 DEBUG nova.network.neutron [req-478ee663-3d3e-4c53-923f-56dd71245ca9 req-7b283f28-3de8-49d1-ba09-f37d270c2f75 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Updating instance_info_cache with network_info: [{"id": "fdd42839-b63e-4adb-b391-33f1c697beb4", "address": "fa:16:3e:29:f6:14", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdd42839-b6", "ovs_interfaceid": "fdd42839-b63e-4adb-b391-33f1c697beb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:03 np0005466012 nova_compute[192063]: 2025-10-02 12:12:03.752 2 DEBUG oslo_concurrency.lockutils [req-478ee663-3d3e-4c53-923f-56dd71245ca9 req-7b283f28-3de8-49d1-ba09-f37d270c2f75 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-07cc67f4-afea-47a6-95cb-909d3ce6eb63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:12:03 np0005466012 podman[227926]: 2025-10-02 12:12:03.762740358 +0000 UTC m=+0.022743780 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:12:03 np0005466012 podman[227926]: 2025-10-02 12:12:03.963938335 +0000 UTC m=+0.223941757 container create 4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:12:04 np0005466012 systemd[1]: Started libpod-conmon-4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738.scope.
Oct  2 08:12:04 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:12:04 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa3656f750c783179c2146a528d3645da2ee0f6769da25096abaaf25ab64a14c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:12:04 np0005466012 podman[227926]: 2025-10-02 12:12:04.059175856 +0000 UTC m=+0.319179298 container init 4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:12:04 np0005466012 podman[227926]: 2025-10-02 12:12:04.064797487 +0000 UTC m=+0.324800909 container start 4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:12:04 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[227947]: [NOTICE]   (227951) : New worker (227953) forked
Oct  2 08:12:04 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[227947]: [NOTICE]   (227951) : Loading success.
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.253 2 DEBUG nova.compute.manager [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.253 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407124.2527258, 07cc67f4-afea-47a6-95cb-909d3ce6eb63 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.254 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] VM Started (Lifecycle Event)#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.256 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.260 2 INFO nova.virt.libvirt.driver [-] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Instance spawned successfully.#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.260 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.286 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.288 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.294 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.295 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.295 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.295 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.296 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.296 2 DEBUG nova.virt.libvirt.driver [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.324 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.324 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407124.252831, 07cc67f4-afea-47a6-95cb-909d3ce6eb63 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.324 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.371 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.378 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407124.2556155, 07cc67f4-afea-47a6-95cb-909d3ce6eb63 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.378 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.406 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.409 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.423 2 INFO nova.compute.manager [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Took 7.99 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.424 2 DEBUG nova.compute.manager [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.436 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.550 2 INFO nova.compute.manager [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Took 8.71 seconds to build instance.#033[00m
Oct  2 08:12:04 np0005466012 nova_compute[192063]: 2025-10-02 12:12:04.572 2 DEBUG oslo_concurrency.lockutils [None req-1214d146-08b9-4ed0-92c2-9515daccd897 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:05 np0005466012 nova_compute[192063]: 2025-10-02 12:12:05.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:05 np0005466012 nova_compute[192063]: 2025-10-02 12:12:05.460 2 DEBUG nova.compute.manager [req-d234c798-ed16-49a4-995b-706490750da0 req-244f57dc-74f3-4927-82bd-83bc38929419 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Received event network-vif-plugged-fdd42839-b63e-4adb-b391-33f1c697beb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:05 np0005466012 nova_compute[192063]: 2025-10-02 12:12:05.461 2 DEBUG oslo_concurrency.lockutils [req-d234c798-ed16-49a4-995b-706490750da0 req-244f57dc-74f3-4927-82bd-83bc38929419 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:05 np0005466012 nova_compute[192063]: 2025-10-02 12:12:05.461 2 DEBUG oslo_concurrency.lockutils [req-d234c798-ed16-49a4-995b-706490750da0 req-244f57dc-74f3-4927-82bd-83bc38929419 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:05 np0005466012 nova_compute[192063]: 2025-10-02 12:12:05.461 2 DEBUG oslo_concurrency.lockutils [req-d234c798-ed16-49a4-995b-706490750da0 req-244f57dc-74f3-4927-82bd-83bc38929419 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:05 np0005466012 nova_compute[192063]: 2025-10-02 12:12:05.462 2 DEBUG nova.compute.manager [req-d234c798-ed16-49a4-995b-706490750da0 req-244f57dc-74f3-4927-82bd-83bc38929419 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] No waiting events found dispatching network-vif-plugged-fdd42839-b63e-4adb-b391-33f1c697beb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:12:05 np0005466012 nova_compute[192063]: 2025-10-02 12:12:05.462 2 WARNING nova.compute.manager [req-d234c798-ed16-49a4-995b-706490750da0 req-244f57dc-74f3-4927-82bd-83bc38929419 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Received unexpected event network-vif-plugged-fdd42839-b63e-4adb-b391-33f1c697beb4 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:12:06 np0005466012 podman[227969]: 2025-10-02 12:12:06.212414016 +0000 UTC m=+0.114749153 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:12:06 np0005466012 podman[227970]: 2025-10-02 12:12:06.229278206 +0000 UTC m=+0.138161441 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:12:06 np0005466012 nova_compute[192063]: 2025-10-02 12:12:06.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:07 np0005466012 nova_compute[192063]: 2025-10-02 12:12:07.070 2 INFO nova.compute.manager [None req-518e171a-78f1-4bd6-9daf-4ba92d62d1f8 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Pausing#033[00m
Oct  2 08:12:07 np0005466012 nova_compute[192063]: 2025-10-02 12:12:07.071 2 DEBUG nova.objects.instance [None req-518e171a-78f1-4bd6-9daf-4ba92d62d1f8 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lazy-loading 'flavor' on Instance uuid 07cc67f4-afea-47a6-95cb-909d3ce6eb63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:07 np0005466012 nova_compute[192063]: 2025-10-02 12:12:07.107 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407127.106973, 07cc67f4-afea-47a6-95cb-909d3ce6eb63 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:07 np0005466012 nova_compute[192063]: 2025-10-02 12:12:07.107 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:12:07 np0005466012 nova_compute[192063]: 2025-10-02 12:12:07.109 2 DEBUG nova.compute.manager [None req-518e171a-78f1-4bd6-9daf-4ba92d62d1f8 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:07 np0005466012 nova_compute[192063]: 2025-10-02 12:12:07.126 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:07 np0005466012 nova_compute[192063]: 2025-10-02 12:12:07.129 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:12:08 np0005466012 nova_compute[192063]: 2025-10-02 12:12:08.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:09 np0005466012 podman[228018]: 2025-10-02 12:12:09.160488824 +0000 UTC m=+0.061468551 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:12:10 np0005466012 nova_compute[192063]: 2025-10-02 12:12:10.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:10 np0005466012 nova_compute[192063]: 2025-10-02 12:12:10.529 2 DEBUG nova.compute.manager [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:10 np0005466012 nova_compute[192063]: 2025-10-02 12:12:10.617 2 INFO nova.compute.manager [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] instance snapshotting#033[00m
Oct  2 08:12:10 np0005466012 nova_compute[192063]: 2025-10-02 12:12:10.618 2 WARNING nova.compute.manager [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] trying to snapshot a non-running instance: (state: 3 expected: 1)#033[00m
Oct  2 08:12:10 np0005466012 nova_compute[192063]: 2025-10-02 12:12:10.939 2 INFO nova.virt.libvirt.driver [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Beginning live snapshot process#033[00m
Oct  2 08:12:11 np0005466012 virtqemud[191783]: invalid argument: disk vda does not have an active block job
Oct  2 08:12:11 np0005466012 nova_compute[192063]: 2025-10-02 12:12:11.145 2 DEBUG oslo_concurrency.processutils [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:11 np0005466012 nova_compute[192063]: 2025-10-02 12:12:11.225 2 DEBUG oslo_concurrency.processutils [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk --force-share --output=json -f qcow2" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:11 np0005466012 nova_compute[192063]: 2025-10-02 12:12:11.227 2 DEBUG oslo_concurrency.processutils [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:11 np0005466012 nova_compute[192063]: 2025-10-02 12:12:11.303 2 DEBUG oslo_concurrency.processutils [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk --force-share --output=json -f qcow2" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:11 np0005466012 nova_compute[192063]: 2025-10-02 12:12:11.317 2 DEBUG oslo_concurrency.processutils [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:11 np0005466012 nova_compute[192063]: 2025-10-02 12:12:11.376 2 DEBUG oslo_concurrency.processutils [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:11 np0005466012 nova_compute[192063]: 2025-10-02 12:12:11.377 2 DEBUG oslo_concurrency.processutils [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpf2_r54r7/3a7b5717a3a84f379fc7cfb0314af4c9.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:11 np0005466012 nova_compute[192063]: 2025-10-02 12:12:11.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:11 np0005466012 nova_compute[192063]: 2025-10-02 12:12:11.973 2 DEBUG oslo_concurrency.processutils [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpf2_r54r7/3a7b5717a3a84f379fc7cfb0314af4c9.delta 1073741824" returned: 0 in 0.596s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:11 np0005466012 nova_compute[192063]: 2025-10-02 12:12:11.974 2 INFO nova.virt.libvirt.driver [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Oct  2 08:12:12 np0005466012 nova_compute[192063]: 2025-10-02 12:12:12.083 2 DEBUG nova.virt.libvirt.guest [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] COPY block job progress, current cursor: 0 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:12:12 np0005466012 nova_compute[192063]: 2025-10-02 12:12:12.589 2 DEBUG nova.virt.libvirt.guest [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:12:12 np0005466012 nova_compute[192063]: 2025-10-02 12:12:12.592 2 INFO nova.virt.libvirt.driver [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Oct  2 08:12:12 np0005466012 podman[228056]: 2025-10-02 12:12:12.784670229 +0000 UTC m=+0.067265817 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 08:12:12 np0005466012 nova_compute[192063]: 2025-10-02 12:12:12.928 2 DEBUG nova.privsep.utils [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:12:12 np0005466012 nova_compute[192063]: 2025-10-02 12:12:12.929 2 DEBUG oslo_concurrency.processutils [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpf2_r54r7/3a7b5717a3a84f379fc7cfb0314af4c9.delta /var/lib/nova/instances/snapshots/tmpf2_r54r7/3a7b5717a3a84f379fc7cfb0314af4c9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:13 np0005466012 nova_compute[192063]: 2025-10-02 12:12:13.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:14 np0005466012 nova_compute[192063]: 2025-10-02 12:12:14.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:14 np0005466012 nova_compute[192063]: 2025-10-02 12:12:14.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:14 np0005466012 nova_compute[192063]: 2025-10-02 12:12:14.841 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:15 np0005466012 nova_compute[192063]: 2025-10-02 12:12:15.014 2 DEBUG oslo_concurrency.processutils [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpf2_r54r7/3a7b5717a3a84f379fc7cfb0314af4c9.delta /var/lib/nova/instances/snapshots/tmpf2_r54r7/3a7b5717a3a84f379fc7cfb0314af4c9" returned: 0 in 2.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:15 np0005466012 nova_compute[192063]: 2025-10-02 12:12:15.016 2 INFO nova.virt.libvirt.driver [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:12:15 np0005466012 nova_compute[192063]: 2025-10-02 12:12:15.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:16 np0005466012 nova_compute[192063]: 2025-10-02 12:12:16.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:17 np0005466012 nova_compute[192063]: 2025-10-02 12:12:17.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:17 np0005466012 nova_compute[192063]: 2025-10-02 12:12:17.665 2 INFO nova.virt.libvirt.driver [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Snapshot image upload complete#033[00m
Oct  2 08:12:17 np0005466012 nova_compute[192063]: 2025-10-02 12:12:17.665 2 INFO nova.compute.manager [None req-a9542671-3480-4eb1-b789-ddd3bb8a40da dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Took 7.04 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:12:17 np0005466012 nova_compute[192063]: 2025-10-02 12:12:17.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:17 np0005466012 nova_compute[192063]: 2025-10-02 12:12:17.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:17 np0005466012 nova_compute[192063]: 2025-10-02 12:12:17.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:18 np0005466012 nova_compute[192063]: 2025-10-02 12:12:18.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:18 np0005466012 nova_compute[192063]: 2025-10-02 12:12:18.854 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:18 np0005466012 nova_compute[192063]: 2025-10-02 12:12:18.855 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:18 np0005466012 nova_compute[192063]: 2025-10-02 12:12:18.855 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:18 np0005466012 nova_compute[192063]: 2025-10-02 12:12:18.856 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:12:18 np0005466012 nova_compute[192063]: 2025-10-02 12:12:18.924 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:18 np0005466012 nova_compute[192063]: 2025-10-02 12:12:18.977 2 DEBUG oslo_concurrency.lockutils [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:18 np0005466012 nova_compute[192063]: 2025-10-02 12:12:18.978 2 DEBUG oslo_concurrency.lockutils [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:18 np0005466012 nova_compute[192063]: 2025-10-02 12:12:18.978 2 DEBUG oslo_concurrency.lockutils [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:18 np0005466012 nova_compute[192063]: 2025-10-02 12:12:18.979 2 DEBUG oslo_concurrency.lockutils [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:18 np0005466012 nova_compute[192063]: 2025-10-02 12:12:18.979 2 DEBUG oslo_concurrency.lockutils [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:18 np0005466012 nova_compute[192063]: 2025-10-02 12:12:18.990 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:18 np0005466012 nova_compute[192063]: 2025-10-02 12:12:18.991 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.011 2 INFO nova.compute.manager [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Terminating instance#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.029 2 DEBUG nova.compute.manager [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:12:19 np0005466012 kernel: tapfdd42839-b6 (unregistering): left promiscuous mode
Oct  2 08:12:19 np0005466012 NetworkManager[51207]: <info>  [1759407139.0490] device (tapfdd42839-b6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.053 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:19Z|00232|binding|INFO|Releasing lport fdd42839-b63e-4adb-b391-33f1c697beb4 from this chassis (sb_readonly=0)
Oct  2 08:12:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:19Z|00233|binding|INFO|Setting lport fdd42839-b63e-4adb-b391-33f1c697beb4 down in Southbound
Oct  2 08:12:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:19Z|00234|binding|INFO|Removing iface tapfdd42839-b6 ovn-installed in OVS
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.063 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.076 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:f6:14 10.100.0.12'], port_security=['fa:16:3e:29:f6:14 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '07cc67f4-afea-47a6-95cb-909d3ce6eb63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dcf78460093d411988a54040ea4c265a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aacce687-8b76-4e90-b19c-0dd006394188', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24ae9888-31f5-4083-b5ee-e7ed6a1eee13, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=fdd42839-b63e-4adb-b391-33f1c697beb4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.080 103246 INFO neutron.agent.ovn.metadata.agent [-] Port fdd42839-b63e-4adb-b391-33f1c697beb4 in datapath 4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 unbound from our chassis#033[00m
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.081 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.082 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c525ef-807d-417e-9cd9-9ff5978eea03]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.083 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 namespace which is not needed anymore#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:19 np0005466012 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003b.scope: Deactivated successfully.
Oct  2 08:12:19 np0005466012 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003b.scope: Consumed 3.945s CPU time.
Oct  2 08:12:19 np0005466012 systemd-machined[152114]: Machine qemu-26-instance-0000003b terminated.
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.137 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.138 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:19 np0005466012 podman[228092]: 2025-10-02 12:12:19.147660729 +0000 UTC m=+0.069676249 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Oct  2 08:12:19 np0005466012 podman[228087]: 2025-10-02 12:12:19.162602851 +0000 UTC m=+0.081132466 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.202 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:19 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[227947]: [NOTICE]   (227951) : haproxy version is 2.8.14-c23fe91
Oct  2 08:12:19 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[227947]: [NOTICE]   (227951) : path to executable is /usr/sbin/haproxy
Oct  2 08:12:19 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[227947]: [WARNING]  (227951) : Exiting Master process...
Oct  2 08:12:19 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[227947]: [ALERT]    (227951) : Current worker (227953) exited with code 143 (Terminated)
Oct  2 08:12:19 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[227947]: [WARNING]  (227951) : All workers exited. Exiting... (0)
Oct  2 08:12:19 np0005466012 systemd[1]: libpod-4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738.scope: Deactivated successfully.
Oct  2 08:12:19 np0005466012 conmon[227947]: conmon 4ed60f1b9a2c23c5f9bb <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738.scope/container/memory.events
Oct  2 08:12:19 np0005466012 podman[228152]: 2025-10-02 12:12:19.245626033 +0000 UTC m=+0.064475112 container died 4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.297 2 INFO nova.virt.libvirt.driver [-] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Instance destroyed successfully.#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.298 2 DEBUG nova.objects.instance [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lazy-loading 'resources' on Instance uuid 07cc67f4-afea-47a6-95cb-909d3ce6eb63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.315 2 DEBUG nova.virt.libvirt.vif [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:11:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1966454935',display_name='tempest-ImagesTestJSON-server-1966454935',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1966454935',id=59,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:12:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='dcf78460093d411988a54040ea4c265a',ramdisk_id='',reservation_id='r-yz8ex65z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram
='0',owner_project_name='tempest-ImagesTestJSON-437970487',owner_user_name='tempest-ImagesTestJSON-437970487-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:12:17Z,user_data=None,user_id='dcdfc3c0f94e42cb931d27f2e3b5b12d',uuid=07cc67f4-afea-47a6-95cb-909d3ce6eb63,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "fdd42839-b63e-4adb-b391-33f1c697beb4", "address": "fa:16:3e:29:f6:14", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdd42839-b6", "ovs_interfaceid": "fdd42839-b63e-4adb-b391-33f1c697beb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.315 2 DEBUG nova.network.os_vif_util [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Converting VIF {"id": "fdd42839-b63e-4adb-b391-33f1c697beb4", "address": "fa:16:3e:29:f6:14", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdd42839-b6", "ovs_interfaceid": "fdd42839-b63e-4adb-b391-33f1c697beb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.316 2 DEBUG nova.network.os_vif_util [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:f6:14,bridge_name='br-int',has_traffic_filtering=True,id=fdd42839-b63e-4adb-b391-33f1c697beb4,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdd42839-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.317 2 DEBUG os_vif [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:f6:14,bridge_name='br-int',has_traffic_filtering=True,id=fdd42839-b63e-4adb-b391-33f1c697beb4,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdd42839-b6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.320 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfdd42839-b6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.326 2 INFO os_vif [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:f6:14,bridge_name='br-int',has_traffic_filtering=True,id=fdd42839-b63e-4adb-b391-33f1c697beb4,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdd42839-b6')#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.326 2 INFO nova.virt.libvirt.driver [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Deleting instance files /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63_del#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.327 2 INFO nova.virt.libvirt.driver [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Deletion of /var/lib/nova/instances/07cc67f4-afea-47a6-95cb-909d3ce6eb63_del complete#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.413 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.414 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5445MB free_disk=73.39884948730469GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.414 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.414 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.459 2 INFO nova.compute.manager [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.460 2 DEBUG oslo.service.loopingcall [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.460 2 DEBUG nova.compute.manager [-] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.460 2 DEBUG nova.network.neutron [-] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.497 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 0753ad57-d509-4a98-bba1-e9b29c087474 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.498 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 07cc67f4-afea-47a6-95cb-909d3ce6eb63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.498 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.498 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:12:19 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738-userdata-shm.mount: Deactivated successfully.
Oct  2 08:12:19 np0005466012 systemd[1]: var-lib-containers-storage-overlay-fa3656f750c783179c2146a528d3645da2ee0f6769da25096abaaf25ab64a14c-merged.mount: Deactivated successfully.
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.571 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:12:19 np0005466012 podman[228152]: 2025-10-02 12:12:19.590484468 +0000 UTC m=+0.409333547 container cleanup 4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.592 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.616 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.616 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:19 np0005466012 systemd[1]: libpod-conmon-4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738.scope: Deactivated successfully.
Oct  2 08:12:19 np0005466012 podman[228201]: 2025-10-02 12:12:19.864673373 +0000 UTC m=+0.229775243 container remove 4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.871 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8133d5f1-f735-4fff-a862-1c9cf2efc5e6]: (4, ('Thu Oct  2 12:12:19 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 (4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738)\n4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738\nThu Oct  2 12:12:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 (4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738)\n4ed60f1b9a2c23c5f9bbfc993451fc4f447ee378b8e07630098e3189dae0b738\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.872 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9e173a2b-b4cf-426b-8a29-6cb52aa32ed1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.873 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f195445-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:19 np0005466012 kernel: tap4f195445-f0: left promiscuous mode
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.880 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b215bf25-9653-409e-828a-b98cffcab643]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:19 np0005466012 nova_compute[192063]: 2025-10-02 12:12:19.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.907 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e49379-a5ed-460e-b343-ada1cc67c85f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.909 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3b2c8499-d52e-457e-8a54-cf988f28930a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.924 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca45523-88f9-4b7b-ac00-167ab41a48fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511681, 'reachable_time': 24999, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228216, 'error': None, 'target': 'ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.928 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:12:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:19.928 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a8d3f7-8d59-4ae8-9886-4710e35e5d2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:19 np0005466012 systemd[1]: run-netns-ovnmeta\x2d4f195445\x2dfd43\x2d4b92\x2d89dd\x2da1b2fe9ea8c2.mount: Deactivated successfully.
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:20.388 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:20.390 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.417 2 DEBUG nova.network.neutron [-] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.536 2 DEBUG nova.compute.manager [req-e77437a0-14bf-4a64-ae82-b506bf75f066 req-aee4f20b-b432-4fb0-b8c3-b3e8482baa0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Received event network-vif-deleted-fdd42839-b63e-4adb-b391-33f1c697beb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.536 2 INFO nova.compute.manager [req-e77437a0-14bf-4a64-ae82-b506bf75f066 req-aee4f20b-b432-4fb0-b8c3-b3e8482baa0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Neutron deleted interface fdd42839-b63e-4adb-b391-33f1c697beb4; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.536 2 DEBUG nova.network.neutron [req-e77437a0-14bf-4a64-ae82-b506bf75f066 req-aee4f20b-b432-4fb0-b8c3-b3e8482baa0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.566 2 DEBUG nova.compute.manager [req-e77437a0-14bf-4a64-ae82-b506bf75f066 req-aee4f20b-b432-4fb0-b8c3-b3e8482baa0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Detach interface failed, port_id=fdd42839-b63e-4adb-b391-33f1c697beb4, reason: Instance 07cc67f4-afea-47a6-95cb-909d3ce6eb63 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.567 2 INFO nova.compute.manager [-] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Took 1.11 seconds to deallocate network for instance.#033[00m
Oct  2 08:12:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:20Z|00235|binding|INFO|Releasing lport 1e2d82b4-a363-4c19-94d1-e62c1ba8e34a from this chassis (sb_readonly=0)
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.616 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.657 2 DEBUG oslo_concurrency.lockutils [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.657 2 DEBUG oslo_concurrency.lockutils [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.731 2 DEBUG nova.compute.provider_tree [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.745 2 DEBUG nova.scheduler.client.report [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.768 2 DEBUG oslo_concurrency.lockutils [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.790 2 INFO nova.scheduler.client.report [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Deleted allocations for instance 07cc67f4-afea-47a6-95cb-909d3ce6eb63#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:12:20 np0005466012 nova_compute[192063]: 2025-10-02 12:12:20.871 2 DEBUG oslo_concurrency.lockutils [None req-c8aebb71-2325-4b59-bc69-1d449295a875 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "07cc67f4-afea-47a6-95cb-909d3ce6eb63" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.893s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:23 np0005466012 podman[228217]: 2025-10-02 12:12:23.150940553 +0000 UTC m=+0.061630475 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:12:23 np0005466012 podman[228218]: 2025-10-02 12:12:23.199278137 +0000 UTC m=+0.095167081 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:12:24 np0005466012 nova_compute[192063]: 2025-10-02 12:12:24.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:24 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:24Z|00236|binding|INFO|Releasing lport 1e2d82b4-a363-4c19-94d1-e62c1ba8e34a from this chassis (sb_readonly=0)
Oct  2 08:12:24 np0005466012 nova_compute[192063]: 2025-10-02 12:12:24.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:25 np0005466012 nova_compute[192063]: 2025-10-02 12:12:25.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:25 np0005466012 nova_compute[192063]: 2025-10-02 12:12:25.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:12:25 np0005466012 nova_compute[192063]: 2025-10-02 12:12:25.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:12:26 np0005466012 nova_compute[192063]: 2025-10-02 12:12:26.127 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-0753ad57-d509-4a98-bba1-e9b29c087474" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:12:26 np0005466012 nova_compute[192063]: 2025-10-02 12:12:26.128 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-0753ad57-d509-4a98-bba1-e9b29c087474" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:12:26 np0005466012 nova_compute[192063]: 2025-10-02 12:12:26.128 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:12:28 np0005466012 nova_compute[192063]: 2025-10-02 12:12:28.639 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Updating instance_info_cache with network_info: [{"id": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "address": "fa:16:3e:1b:69:e3", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7332d0b6-e5", "ovs_interfaceid": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:28 np0005466012 nova_compute[192063]: 2025-10-02 12:12:28.662 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-0753ad57-d509-4a98-bba1-e9b29c087474" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:12:28 np0005466012 nova_compute[192063]: 2025-10-02 12:12:28.663 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:12:29 np0005466012 nova_compute[192063]: 2025-10-02 12:12:29.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:30.392 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:30 np0005466012 nova_compute[192063]: 2025-10-02 12:12:30.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:33 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:33Z|00237|binding|INFO|Releasing lport 1e2d82b4-a363-4c19-94d1-e62c1ba8e34a from this chassis (sb_readonly=0)
Oct  2 08:12:33 np0005466012 nova_compute[192063]: 2025-10-02 12:12:33.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:34 np0005466012 nova_compute[192063]: 2025-10-02 12:12:34.297 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407139.2955968, 07cc67f4-afea-47a6-95cb-909d3ce6eb63 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:34 np0005466012 nova_compute[192063]: 2025-10-02 12:12:34.298 2 INFO nova.compute.manager [-] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:12:34 np0005466012 nova_compute[192063]: 2025-10-02 12:12:34.321 2 DEBUG nova.compute.manager [None req-f2d66484-ee6f-4c04-b185-c46e01fb0d9e - - - - - -] [instance: 07cc67f4-afea-47a6-95cb-909d3ce6eb63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:34 np0005466012 nova_compute[192063]: 2025-10-02 12:12:34.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:35 np0005466012 nova_compute[192063]: 2025-10-02 12:12:35.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:37 np0005466012 podman[228260]: 2025-10-02 12:12:37.135437792 +0000 UTC m=+0.056786688 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:12:37 np0005466012 podman[228261]: 2025-10-02 12:12:37.180537528 +0000 UTC m=+0.098771420 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Oct  2 08:12:39 np0005466012 nova_compute[192063]: 2025-10-02 12:12:39.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:40 np0005466012 podman[228314]: 2025-10-02 12:12:40.151652493 +0000 UTC m=+0.067446622 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:12:40 np0005466012 nova_compute[192063]: 2025-10-02 12:12:40.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:42 np0005466012 nova_compute[192063]: 2025-10-02 12:12:42.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:42 np0005466012 nova_compute[192063]: 2025-10-02 12:12:42.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:43 np0005466012 podman[228334]: 2025-10-02 12:12:43.158893662 +0000 UTC m=+0.070046580 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.237 2 DEBUG oslo_concurrency.lockutils [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "0753ad57-d509-4a98-bba1-e9b29c087474" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.237 2 DEBUG oslo_concurrency.lockutils [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.237 2 DEBUG oslo_concurrency.lockutils [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.237 2 DEBUG oslo_concurrency.lockutils [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.238 2 DEBUG oslo_concurrency.lockutils [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.248 2 INFO nova.compute.manager [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Terminating instance#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.261 2 DEBUG nova.compute.manager [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:12:43 np0005466012 kernel: tap7332d0b6-e5 (unregistering): left promiscuous mode
Oct  2 08:12:43 np0005466012 NetworkManager[51207]: <info>  [1759407163.3082] device (tap7332d0b6-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:12:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:43Z|00238|binding|INFO|Releasing lport 7332d0b6-e5f0-41e2-aa18-69453b2d2b21 from this chassis (sb_readonly=0)
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:43Z|00239|binding|INFO|Setting lport 7332d0b6-e5f0-41e2-aa18-69453b2d2b21 down in Southbound
Oct  2 08:12:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:12:43Z|00240|binding|INFO|Removing iface tap7332d0b6-e5 ovn-installed in OVS
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.331 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:69:e3 10.100.0.12'], port_security=['fa:16:3e:1b:69:e3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0753ad57-d509-4a98-bba1-e9b29c087474', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20417475a6a149d5bc47976f4da9a4ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5535fb48-d673-47c4-b26e-f6f2718957b0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.238'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8a937e8-285b-47d1-b87a-47c75465be5a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=7332d0b6-e5f0-41e2-aa18-69453b2d2b21) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.332 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 7332d0b6-e5f0-41e2-aa18-69453b2d2b21 in datapath 2bdfd186-139e-456a-92e9-4dc9c37a846a unbound from our chassis#033[00m
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.333 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2bdfd186-139e-456a-92e9-4dc9c37a846a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.334 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ecec2d11-0a1a-453a-8115-42dbd5b9b834]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.335 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a namespace which is not needed anymore#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:43 np0005466012 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000038.scope: Deactivated successfully.
Oct  2 08:12:43 np0005466012 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000038.scope: Consumed 16.763s CPU time.
Oct  2 08:12:43 np0005466012 systemd-machined[152114]: Machine qemu-25-instance-00000038 terminated.
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:43 np0005466012 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[227367]: [NOTICE]   (227371) : haproxy version is 2.8.14-c23fe91
Oct  2 08:12:43 np0005466012 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[227367]: [NOTICE]   (227371) : path to executable is /usr/sbin/haproxy
Oct  2 08:12:43 np0005466012 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[227367]: [WARNING]  (227371) : Exiting Master process...
Oct  2 08:12:43 np0005466012 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[227367]: [ALERT]    (227371) : Current worker (227373) exited with code 143 (Terminated)
Oct  2 08:12:43 np0005466012 neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a[227367]: [WARNING]  (227371) : All workers exited. Exiting... (0)
Oct  2 08:12:43 np0005466012 systemd[1]: libpod-b8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae.scope: Deactivated successfully.
Oct  2 08:12:43 np0005466012 podman[228378]: 2025-10-02 12:12:43.531349721 +0000 UTC m=+0.100811591 container died b8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.534 2 INFO nova.virt.libvirt.driver [-] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Instance destroyed successfully.#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.534 2 DEBUG nova.objects.instance [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lazy-loading 'resources' on Instance uuid 0753ad57-d509-4a98-bba1-e9b29c087474 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.550 2 DEBUG nova.virt.libvirt.vif [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1120877738',display_name='tempest-ServerActionsTestOtherA-server-1120877738',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1120877738',id=56,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMxV3/3UVj3iLiv8GHkV/x6VYyYPVFG5yThfPAWdnRtPLxRt5nk8D+Dtcmc6m48b1gfoKmcnooDopojNsfnOakPU7WA24nbcaEk0vNw9hR38BD9zJ2a+hy7fQOi0lwh9QA==',key_name='tempest-keypair-1208414249',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:11:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='20417475a6a149d5bc47976f4da9a4ae',ramdisk_id='',reservation_id='r-ooa2t6mh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-352727288',owner_user_name='tempest-ServerActionsTestOtherA-352727288-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:11:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c2b9eab3da414692b3942505e3441920',uuid=0753ad57-d509-4a98-bba1-e9b29c087474,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "address": "fa:16:3e:1b:69:e3", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7332d0b6-e5", "ovs_interfaceid": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.551 2 DEBUG nova.network.os_vif_util [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Converting VIF {"id": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "address": "fa:16:3e:1b:69:e3", "network": {"id": "2bdfd186-139e-456a-92e9-4dc9c37a846a", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-953736127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20417475a6a149d5bc47976f4da9a4ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7332d0b6-e5", "ovs_interfaceid": "7332d0b6-e5f0-41e2-aa18-69453b2d2b21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.551 2 DEBUG nova.network.os_vif_util [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1b:69:e3,bridge_name='br-int',has_traffic_filtering=True,id=7332d0b6-e5f0-41e2-aa18-69453b2d2b21,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7332d0b6-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.552 2 DEBUG os_vif [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:69:e3,bridge_name='br-int',has_traffic_filtering=True,id=7332d0b6-e5f0-41e2-aa18-69453b2d2b21,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7332d0b6-e5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.553 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7332d0b6-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.559 2 INFO os_vif [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:69:e3,bridge_name='br-int',has_traffic_filtering=True,id=7332d0b6-e5f0-41e2-aa18-69453b2d2b21,network=Network(2bdfd186-139e-456a-92e9-4dc9c37a846a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7332d0b6-e5')#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.561 2 INFO nova.virt.libvirt.driver [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Deleting instance files /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474_del#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.561 2 INFO nova.virt.libvirt.driver [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Deletion of /var/lib/nova/instances/0753ad57-d509-4a98-bba1-e9b29c087474_del complete#033[00m
Oct  2 08:12:43 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae-userdata-shm.mount: Deactivated successfully.
Oct  2 08:12:43 np0005466012 systemd[1]: var-lib-containers-storage-overlay-7b5655323f2097bda224bea88ef8f023634f3733c5c2d2c668210e03f13adc7c-merged.mount: Deactivated successfully.
Oct  2 08:12:43 np0005466012 podman[228378]: 2025-10-02 12:12:43.626964884 +0000 UTC m=+0.196426754 container cleanup b8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:12:43 np0005466012 systemd[1]: libpod-conmon-b8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae.scope: Deactivated successfully.
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.642 2 INFO nova.compute.manager [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.643 2 DEBUG oslo.service.loopingcall [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.644 2 DEBUG nova.compute.manager [-] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.644 2 DEBUG nova.network.neutron [-] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.691 2 DEBUG nova.compute.manager [req-b76b934a-9868-44e8-b9af-f93c7b246759 req-45b667a5-1e63-4b51-880b-328e58754b74 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Received event network-vif-unplugged-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.691 2 DEBUG oslo_concurrency.lockutils [req-b76b934a-9868-44e8-b9af-f93c7b246759 req-45b667a5-1e63-4b51-880b-328e58754b74 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.692 2 DEBUG oslo_concurrency.lockutils [req-b76b934a-9868-44e8-b9af-f93c7b246759 req-45b667a5-1e63-4b51-880b-328e58754b74 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.693 2 DEBUG oslo_concurrency.lockutils [req-b76b934a-9868-44e8-b9af-f93c7b246759 req-45b667a5-1e63-4b51-880b-328e58754b74 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.693 2 DEBUG nova.compute.manager [req-b76b934a-9868-44e8-b9af-f93c7b246759 req-45b667a5-1e63-4b51-880b-328e58754b74 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] No waiting events found dispatching network-vif-unplugged-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.694 2 DEBUG nova.compute.manager [req-b76b934a-9868-44e8-b9af-f93c7b246759 req-45b667a5-1e63-4b51-880b-328e58754b74 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Received event network-vif-unplugged-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:12:43 np0005466012 podman[228427]: 2025-10-02 12:12:43.821418997 +0000 UTC m=+0.174066917 container remove b8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.829 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[83cdf489-937f-4922-a022-63ff72fb81a8]: (4, ('Thu Oct  2 12:12:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a (b8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae)\nb8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae\nThu Oct  2 12:12:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a (b8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae)\nb8f844cbdb5411e64439762f6c7472fcf4b1c6356d49af63c4475f1880568dae\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.830 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ca1889e5-5bba-4c49-91a7-3ac433a6089b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.831 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2bdfd186-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:43 np0005466012 kernel: tap2bdfd186-10: left promiscuous mode
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.837 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5898f514-a751-43e2-a6a7-917efbbce488]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:43 np0005466012 nova_compute[192063]: 2025-10-02 12:12:43.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.868 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0a7b1250-4d31-4125-8f20-406be7b46956]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.869 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e2482156-034c-4856-ba8b-3af2ca27dfd5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.888 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0ac295b6-adc7-4336-8b2c-02bb92cd0435]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505884, 'reachable_time': 19204, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228442, 'error': None, 'target': 'ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.890 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2bdfd186-139e-456a-92e9-4dc9c37a846a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:12:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:12:43.890 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[ae3640ab-d312-4286-a3c7-9ab3d2ce538b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:12:43 np0005466012 systemd[1]: run-netns-ovnmeta\x2d2bdfd186\x2d139e\x2d456a\x2d92e9\x2d4dc9c37a846a.mount: Deactivated successfully.
Oct  2 08:12:44 np0005466012 nova_compute[192063]: 2025-10-02 12:12:44.421 2 DEBUG nova.network.neutron [-] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:12:44 np0005466012 nova_compute[192063]: 2025-10-02 12:12:44.453 2 INFO nova.compute.manager [-] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Took 0.81 seconds to deallocate network for instance.#033[00m
Oct  2 08:12:44 np0005466012 nova_compute[192063]: 2025-10-02 12:12:44.516 2 DEBUG oslo_concurrency.lockutils [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:44 np0005466012 nova_compute[192063]: 2025-10-02 12:12:44.516 2 DEBUG oslo_concurrency.lockutils [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:44 np0005466012 nova_compute[192063]: 2025-10-02 12:12:44.566 2 DEBUG nova.compute.provider_tree [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:12:44 np0005466012 nova_compute[192063]: 2025-10-02 12:12:44.578 2 DEBUG nova.scheduler.client.report [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:12:44 np0005466012 nova_compute[192063]: 2025-10-02 12:12:44.596 2 DEBUG oslo_concurrency.lockutils [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:44 np0005466012 nova_compute[192063]: 2025-10-02 12:12:44.626 2 INFO nova.scheduler.client.report [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Deleted allocations for instance 0753ad57-d509-4a98-bba1-e9b29c087474#033[00m
Oct  2 08:12:44 np0005466012 nova_compute[192063]: 2025-10-02 12:12:44.725 2 DEBUG oslo_concurrency.lockutils [None req-405ac5a0-f024-4493-9f6e-c60d73b36279 c2b9eab3da414692b3942505e3441920 20417475a6a149d5bc47976f4da9a4ae - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.488s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:45 np0005466012 nova_compute[192063]: 2025-10-02 12:12:45.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:45 np0005466012 nova_compute[192063]: 2025-10-02 12:12:45.807 2 DEBUG nova.compute.manager [req-fe06eac5-e6ee-4a9e-a2e4-03685c53ba29 req-8c0e56e8-5453-42fd-bb08-9876e71ab278 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Received event network-vif-plugged-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:45 np0005466012 nova_compute[192063]: 2025-10-02 12:12:45.807 2 DEBUG oslo_concurrency.lockutils [req-fe06eac5-e6ee-4a9e-a2e4-03685c53ba29 req-8c0e56e8-5453-42fd-bb08-9876e71ab278 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:45 np0005466012 nova_compute[192063]: 2025-10-02 12:12:45.808 2 DEBUG oslo_concurrency.lockutils [req-fe06eac5-e6ee-4a9e-a2e4-03685c53ba29 req-8c0e56e8-5453-42fd-bb08-9876e71ab278 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:45 np0005466012 nova_compute[192063]: 2025-10-02 12:12:45.808 2 DEBUG oslo_concurrency.lockutils [req-fe06eac5-e6ee-4a9e-a2e4-03685c53ba29 req-8c0e56e8-5453-42fd-bb08-9876e71ab278 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "0753ad57-d509-4a98-bba1-e9b29c087474-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:45 np0005466012 nova_compute[192063]: 2025-10-02 12:12:45.808 2 DEBUG nova.compute.manager [req-fe06eac5-e6ee-4a9e-a2e4-03685c53ba29 req-8c0e56e8-5453-42fd-bb08-9876e71ab278 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] No waiting events found dispatching network-vif-plugged-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:12:45 np0005466012 nova_compute[192063]: 2025-10-02 12:12:45.809 2 WARNING nova.compute.manager [req-fe06eac5-e6ee-4a9e-a2e4-03685c53ba29 req-8c0e56e8-5453-42fd-bb08-9876e71ab278 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Received unexpected event network-vif-plugged-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:12:45 np0005466012 nova_compute[192063]: 2025-10-02 12:12:45.809 2 DEBUG nova.compute.manager [req-fe06eac5-e6ee-4a9e-a2e4-03685c53ba29 req-8c0e56e8-5453-42fd-bb08-9876e71ab278 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Received event network-vif-deleted-7332d0b6-e5f0-41e2-aa18-69453b2d2b21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:12:46 np0005466012 nova_compute[192063]: 2025-10-02 12:12:46.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:47 np0005466012 nova_compute[192063]: 2025-10-02 12:12:47.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:48 np0005466012 nova_compute[192063]: 2025-10-02 12:12:48.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:48 np0005466012 nova_compute[192063]: 2025-10-02 12:12:48.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:48 np0005466012 nova_compute[192063]: 2025-10-02 12:12:48.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:50 np0005466012 podman[228450]: 2025-10-02 12:12:50.167622911 +0000 UTC m=+0.073183646 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Oct  2 08:12:50 np0005466012 podman[228449]: 2025-10-02 12:12:50.189575304 +0000 UTC m=+0.093709965 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:12:50 np0005466012 nova_compute[192063]: 2025-10-02 12:12:50.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:53 np0005466012 nova_compute[192063]: 2025-10-02 12:12:53.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:54 np0005466012 podman[228493]: 2025-10-02 12:12:54.162486732 +0000 UTC m=+0.071537906 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:12:54 np0005466012 podman[228492]: 2025-10-02 12:12:54.178947959 +0000 UTC m=+0.094972284 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.223 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Acquiring lock "4f0d09cf-f201-416c-806d-cc3b077ad6c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.223 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "4f0d09cf-f201-416c-806d-cc3b077ad6c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.251 2 DEBUG nova.compute.manager [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.357 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.358 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.365 2 DEBUG nova.virt.hardware [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.365 2 INFO nova.compute.claims [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.642 2 DEBUG nova.compute.provider_tree [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.675 2 DEBUG nova.scheduler.client.report [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.733 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.734 2 DEBUG nova.compute.manager [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.805 2 DEBUG nova.compute.manager [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.828 2 INFO nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:12:55 np0005466012 nova_compute[192063]: 2025-10-02 12:12:55.851 2 DEBUG nova.compute.manager [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.021 2 DEBUG nova.compute.manager [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.024 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.024 2 INFO nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Creating image(s)#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.025 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Acquiring lock "/var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.026 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "/var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.027 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "/var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.055 2 DEBUG oslo_concurrency.processutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.149 2 DEBUG oslo_concurrency.processutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.151 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.153 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.178 2 DEBUG oslo_concurrency.processutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.257 2 DEBUG oslo_concurrency.processutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.258 2 DEBUG oslo_concurrency.processutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.471 2 DEBUG oslo_concurrency.processutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk 1073741824" returned: 0 in 0.213s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.473 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.473 2 DEBUG oslo_concurrency.processutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.534 2 DEBUG oslo_concurrency.processutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.535 2 DEBUG nova.virt.disk.api [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Checking if we can resize image /var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.535 2 DEBUG oslo_concurrency.processutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.592 2 DEBUG oslo_concurrency.processutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.593 2 DEBUG nova.virt.disk.api [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Cannot resize image /var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.594 2 DEBUG nova.objects.instance [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lazy-loading 'migration_context' on Instance uuid 4f0d09cf-f201-416c-806d-cc3b077ad6c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.611 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.612 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Ensure instance console log exists: /var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.612 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.612 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.613 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.614 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.622 2 WARNING nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.627 2 DEBUG nova.virt.libvirt.host [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.628 2 DEBUG nova.virt.libvirt.host [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.631 2 DEBUG nova.virt.libvirt.host [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.632 2 DEBUG nova.virt.libvirt.host [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.634 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.634 2 DEBUG nova.virt.hardware [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.635 2 DEBUG nova.virt.hardware [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.635 2 DEBUG nova.virt.hardware [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.635 2 DEBUG nova.virt.hardware [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.635 2 DEBUG nova.virt.hardware [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.635 2 DEBUG nova.virt.hardware [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.636 2 DEBUG nova.virt.hardware [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.636 2 DEBUG nova.virt.hardware [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.636 2 DEBUG nova.virt.hardware [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.636 2 DEBUG nova.virt.hardware [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.637 2 DEBUG nova.virt.hardware [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.640 2 DEBUG nova.objects.instance [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4f0d09cf-f201-416c-806d-cc3b077ad6c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.665 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  <uuid>4f0d09cf-f201-416c-806d-cc3b077ad6c0</uuid>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  <name>instance-00000040</name>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServersAaction247Test-server-7410198</nova:name>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:12:56</nova:creationTime>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:12:56 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:        <nova:user uuid="8eeb001553884a4ba8ab5e68e40e1ecd">tempest-ServersAaction247Test-765175336-project-member</nova:user>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:        <nova:project uuid="69082524d3d94dd9ba9c97c4ac0beb67">tempest-ServersAaction247Test-765175336</nova:project>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <nova:ports/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <entry name="serial">4f0d09cf-f201-416c-806d-cc3b077ad6c0</entry>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <entry name="uuid">4f0d09cf-f201-416c-806d-cc3b077ad6c0</entry>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk.config"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/console.log" append="off"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:12:56 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:12:56 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:12:56 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:12:56 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.749 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.753 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.753 2 INFO nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Using config drive#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.965 2 INFO nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Creating config drive at /var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk.config#033[00m
Oct  2 08:12:56 np0005466012 nova_compute[192063]: 2025-10-02 12:12:56.973 2 DEBUG oslo_concurrency.processutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdrzesx28 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:12:57 np0005466012 nova_compute[192063]: 2025-10-02 12:12:57.115 2 DEBUG oslo_concurrency.processutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdrzesx28" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:12:57 np0005466012 systemd-machined[152114]: New machine qemu-27-instance-00000040.
Oct  2 08:12:57 np0005466012 systemd[1]: Started Virtual Machine qemu-27-instance-00000040.
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.322 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407178.321768, 4f0d09cf-f201-416c-806d-cc3b077ad6c0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.323 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.326 2 DEBUG nova.compute.manager [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.326 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.330 2 INFO nova.virt.libvirt.driver [-] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Instance spawned successfully.#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.331 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.348 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.351 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.360 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.360 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.361 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.361 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.362 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.362 2 DEBUG nova.virt.libvirt.driver [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.371 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.371 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407178.3231678, 4f0d09cf-f201-416c-806d-cc3b077ad6c0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.371 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] VM Started (Lifecycle Event)#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.398 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.401 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.425 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.439 2 INFO nova.compute.manager [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Took 2.42 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.439 2 DEBUG nova.compute.manager [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.523 2 INFO nova.compute.manager [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Took 3.20 seconds to build instance.#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.532 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407163.5315084, 0753ad57-d509-4a98-bba1-e9b29c087474 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.532 2 INFO nova.compute.manager [-] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.548 2 DEBUG oslo_concurrency.lockutils [None req-621f933b-d4af-4753-9947-303b8c7ac737 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "4f0d09cf-f201-416c-806d-cc3b077ad6c0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.549 2 DEBUG nova.compute.manager [None req-9a09dcf3-43ca-401c-810c-e67649f2ad90 - - - - - -] [instance: 0753ad57-d509-4a98-bba1-e9b29c087474] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:12:58 np0005466012 nova_compute[192063]: 2025-10-02 12:12:58.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:00 np0005466012 nova_compute[192063]: 2025-10-02 12:13:00.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:00 np0005466012 nova_compute[192063]: 2025-10-02 12:13:00.985 2 DEBUG nova.compute.manager [None req-91b21b56-5328-4699-b43c-c10a63eee167 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.042 2 INFO nova.compute.manager [None req-91b21b56-5328-4699-b43c-c10a63eee167 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] instance snapshotting
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.043 2 DEBUG nova.objects.instance [None req-91b21b56-5328-4699-b43c-c10a63eee167 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lazy-loading 'flavor' on Instance uuid 4f0d09cf-f201-416c-806d-cc3b077ad6c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.159 2 DEBUG oslo_concurrency.lockutils [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Acquiring lock "4f0d09cf-f201-416c-806d-cc3b077ad6c0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.159 2 DEBUG oslo_concurrency.lockutils [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "4f0d09cf-f201-416c-806d-cc3b077ad6c0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.160 2 DEBUG oslo_concurrency.lockutils [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Acquiring lock "4f0d09cf-f201-416c-806d-cc3b077ad6c0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.160 2 DEBUG oslo_concurrency.lockutils [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "4f0d09cf-f201-416c-806d-cc3b077ad6c0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.160 2 DEBUG oslo_concurrency.lockutils [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "4f0d09cf-f201-416c-806d-cc3b077ad6c0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.173 2 INFO nova.compute.manager [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Terminating instance
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.183 2 DEBUG oslo_concurrency.lockutils [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Acquiring lock "refresh_cache-4f0d09cf-f201-416c-806d-cc3b077ad6c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.184 2 DEBUG oslo_concurrency.lockutils [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Acquired lock "refresh_cache-4f0d09cf-f201-416c-806d-cc3b077ad6c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.184 2 DEBUG nova.network.neutron [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.312 2 INFO nova.virt.libvirt.driver [None req-91b21b56-5328-4699-b43c-c10a63eee167 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Beginning live snapshot process
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.359 2 DEBUG nova.compute.manager [None req-91b21b56-5328-4699-b43c-c10a63eee167 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Instance disappeared during snapshot _snapshot_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:4390
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.679 2 DEBUG nova.network.neutron [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.812 2 DEBUG nova.compute.manager [None req-91b21b56-5328-4699-b43c-c10a63eee167 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Found 0 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.963 2 DEBUG nova.network.neutron [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.979 2 DEBUG oslo_concurrency.lockutils [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Releasing lock "refresh_cache-4f0d09cf-f201-416c-806d-cc3b077ad6c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:13:01 np0005466012 nova_compute[192063]: 2025-10-02 12:13:01.980 2 DEBUG nova.compute.manager [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:13:02 np0005466012 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000040.scope: Deactivated successfully.
Oct  2 08:13:02 np0005466012 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000040.scope: Consumed 4.728s CPU time.
Oct  2 08:13:02 np0005466012 systemd-machined[152114]: Machine qemu-27-instance-00000040 terminated.
Oct  2 08:13:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:02.123 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:02.124 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:02.124 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.229 2 INFO nova.virt.libvirt.driver [-] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Instance destroyed successfully.
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.230 2 DEBUG nova.objects.instance [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lazy-loading 'resources' on Instance uuid 4f0d09cf-f201-416c-806d-cc3b077ad6c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.248 2 INFO nova.virt.libvirt.driver [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Deleting instance files /var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0_del
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.249 2 INFO nova.virt.libvirt.driver [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Deletion of /var/lib/nova/instances/4f0d09cf-f201-416c-806d-cc3b077ad6c0_del complete
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.331 2 INFO nova.compute.manager [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Took 0.35 seconds to destroy the instance on the hypervisor.
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.332 2 DEBUG oslo.service.loopingcall [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.332 2 DEBUG nova.compute.manager [-] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.333 2 DEBUG nova.network.neutron [-] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.517 2 DEBUG nova.network.neutron [-] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.533 2 DEBUG nova.network.neutron [-] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.546 2 INFO nova.compute.manager [-] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Took 0.21 seconds to deallocate network for instance.
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.637 2 DEBUG oslo_concurrency.lockutils [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.638 2 DEBUG oslo_concurrency.lockutils [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.693 2 DEBUG nova.compute.provider_tree [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.724 2 DEBUG nova.scheduler.client.report [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.768 2 DEBUG oslo_concurrency.lockutils [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.805 2 INFO nova.scheduler.client.report [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Deleted allocations for instance 4f0d09cf-f201-416c-806d-cc3b077ad6c0
Oct  2 08:13:02 np0005466012 nova_compute[192063]: 2025-10-02 12:13:02.884 2 DEBUG oslo_concurrency.lockutils [None req-196266ae-684d-49d7-a46e-7c9e7b5249e8 8eeb001553884a4ba8ab5e68e40e1ecd 69082524d3d94dd9ba9c97c4ac0beb67 - - default default] Lock "4f0d09cf-f201-416c-806d-cc3b077ad6c0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.725s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:13:03 np0005466012 nova_compute[192063]: 2025-10-02 12:13:03.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:05 np0005466012 nova_compute[192063]: 2025-10-02 12:13:05.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:08 np0005466012 podman[228594]: 2025-10-02 12:13:08.141410891 +0000 UTC m=+0.058873423 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:13:08 np0005466012 podman[228595]: 2025-10-02 12:13:08.211812201 +0000 UTC m=+0.121374944 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:13:08 np0005466012 nova_compute[192063]: 2025-10-02 12:13:08.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:10 np0005466012 nova_compute[192063]: 2025-10-02 12:13:10.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:11 np0005466012 podman[228642]: 2025-10-02 12:13:11.138200112 +0000 UTC m=+0.055181540 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:13:13 np0005466012 nova_compute[192063]: 2025-10-02 12:13:13.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:14 np0005466012 podman[228661]: 2025-10-02 12:13:14.186990248 +0000 UTC m=+0.101029779 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:13:15 np0005466012 nova_compute[192063]: 2025-10-02 12:13:15.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:16 np0005466012 nova_compute[192063]: 2025-10-02 12:13:16.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:13:16 np0005466012 nova_compute[192063]: 2025-10-02 12:13:16.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:13:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:13:17 np0005466012 nova_compute[192063]: 2025-10-02 12:13:17.227 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407182.2258759, 4f0d09cf-f201-416c-806d-cc3b077ad6c0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:13:17 np0005466012 nova_compute[192063]: 2025-10-02 12:13:17.228 2 INFO nova.compute.manager [-] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] VM Stopped (Lifecycle Event)
Oct  2 08:13:17 np0005466012 nova_compute[192063]: 2025-10-02 12:13:17.251 2 DEBUG nova.compute.manager [None req-ce2ce1a7-61ad-48b1-bb5f-b086832f593c - - - - - -] [instance: 4f0d09cf-f201-416c-806d-cc3b077ad6c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:13:17 np0005466012 nova_compute[192063]: 2025-10-02 12:13:17.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:13:17 np0005466012 nova_compute[192063]: 2025-10-02 12:13:17.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:13:17 np0005466012 nova_compute[192063]: 2025-10-02 12:13:17.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:13:18 np0005466012 nova_compute[192063]: 2025-10-02 12:13:18.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:18 np0005466012 nova_compute[192063]: 2025-10-02 12:13:18.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:13:18 np0005466012 nova_compute[192063]: 2025-10-02 12:13:18.846 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:18 np0005466012 nova_compute[192063]: 2025-10-02 12:13:18.846 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:18 np0005466012 nova_compute[192063]: 2025-10-02 12:13:18.847 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:13:18 np0005466012 nova_compute[192063]: 2025-10-02 12:13:18.847 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  2 08:13:19 np0005466012 nova_compute[192063]: 2025-10-02 12:13:19.032 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:13:19 np0005466012 nova_compute[192063]: 2025-10-02 12:13:19.033 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5753MB free_disk=73.42827606201172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  2 08:13:19 np0005466012 nova_compute[192063]: 2025-10-02 12:13:19.033 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:19 np0005466012 nova_compute[192063]: 2025-10-02 12:13:19.034 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:19 np0005466012 nova_compute[192063]: 2025-10-02 12:13:19.116 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 08:13:19 np0005466012 nova_compute[192063]: 2025-10-02 12:13:19.116 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:13:19 np0005466012 nova_compute[192063]: 2025-10-02 12:13:19.142 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:13:19 np0005466012 nova_compute[192063]: 2025-10-02 12:13:19.162 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:13:19 np0005466012 nova_compute[192063]: 2025-10-02 12:13:19.184 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:13:19 np0005466012 nova_compute[192063]: 2025-10-02 12:13:19.184 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:13:20 np0005466012 nova_compute[192063]: 2025-10-02 12:13:20.186 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:13:20 np0005466012 nova_compute[192063]: 2025-10-02 12:13:20.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:21 np0005466012 podman[228682]: 2025-10-02 12:13:21.152259311 +0000 UTC m=+0.065604766 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:13:21 np0005466012 podman[228683]: 2025-10-02 12:13:21.158149639 +0000 UTC m=+0.059683967 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as 
a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:13:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:22.234 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:13:22 np0005466012 nova_compute[192063]: 2025-10-02 12:13:22.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:22.235 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:13:22 np0005466012 nova_compute[192063]: 2025-10-02 12:13:22.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:13:22 np0005466012 nova_compute[192063]: 2025-10-02 12:13:22.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 08:13:23 np0005466012 nova_compute[192063]: 2025-10-02 12:13:23.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:24.238 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:13:25 np0005466012 podman[228723]: 2025-10-02 12:13:25.130663962 +0000 UTC m=+0.051548910 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:13:25 np0005466012 podman[228724]: 2025-10-02 12:13:25.132732505 +0000 UTC m=+0.050409266 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:13:25 np0005466012 nova_compute[192063]: 2025-10-02 12:13:25.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:13:25 np0005466012 nova_compute[192063]: 2025-10-02 12:13:25.534 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Acquiring lock "8a3a4249-c00f-498a-8a14-6b0862928cb5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:25 np0005466012 nova_compute[192063]: 2025-10-02 12:13:25.534 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:25 np0005466012 nova_compute[192063]: 2025-10-02 12:13:25.554 2 DEBUG nova.compute.manager [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:13:25 np0005466012 nova_compute[192063]: 2025-10-02 12:13:25.723 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:25 np0005466012 nova_compute[192063]: 2025-10-02 12:13:25.723 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:25 np0005466012 nova_compute[192063]: 2025-10-02 12:13:25.768 2 DEBUG nova.virt.hardware [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:13:25 np0005466012 nova_compute[192063]: 2025-10-02 12:13:25.769 2 INFO nova.compute.claims [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Claim successful on node compute-1.ctlplane.example.com
Oct  2 08:13:25 np0005466012 nova_compute[192063]: 2025-10-02 12:13:25.935 2 DEBUG nova.compute.provider_tree [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:13:25 np0005466012 nova_compute[192063]: 2025-10-02 12:13:25.949 2 DEBUG nova.scheduler.client.report [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:13:25 np0005466012 nova_compute[192063]: 2025-10-02 12:13:25.987 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:13:25 np0005466012 nova_compute[192063]: 2025-10-02 12:13:25.988 2 DEBUG nova.compute.manager [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.074 2 DEBUG nova.compute.manager [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.075 2 DEBUG nova.network.neutron [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.117 2 INFO nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.131 2 DEBUG nova.compute.manager [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.240 2 DEBUG nova.compute.manager [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.241 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.242 2 INFO nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Creating image(s)
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.242 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Acquiring lock "/var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.242 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "/var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.243 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "/var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.255 2 DEBUG oslo_concurrency.processutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.316 2 DEBUG oslo_concurrency.processutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.317 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.318 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.342 2 DEBUG oslo_concurrency.processutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.389 2 DEBUG nova.policy [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '36d9cf55d7294b22a8f5e859077fb9f5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2244adeca2c4be2ae64f12af556e7de', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.426 2 DEBUG oslo_concurrency.processutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.427 2 DEBUG oslo_concurrency.processutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.949 2 DEBUG oslo_concurrency.processutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk 1073741824" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.950 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:26 np0005466012 nova_compute[192063]: 2025-10-02 12:13:26.951 2 DEBUG oslo_concurrency.processutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.014 2 DEBUG oslo_concurrency.processutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.015 2 DEBUG nova.virt.disk.api [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Checking if we can resize image /var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.016 2 DEBUG oslo_concurrency.processutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.074 2 DEBUG oslo_concurrency.processutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.075 2 DEBUG nova.virt.disk.api [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Cannot resize image /var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.075 2 DEBUG nova.objects.instance [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lazy-loading 'migration_context' on Instance uuid 8a3a4249-c00f-498a-8a14-6b0862928cb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.087 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.087 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Ensure instance console log exists: /var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.088 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.088 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.088 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.839 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:13:27 np0005466012 nova_compute[192063]: 2025-10-02 12:13:27.839 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:13:28 np0005466012 nova_compute[192063]: 2025-10-02 12:13:28.150 2 DEBUG nova.network.neutron [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Successfully created port: 044042da-3912-41dc-be5e-e0a565776c37 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:13:28 np0005466012 nova_compute[192063]: 2025-10-02 12:13:28.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:29 np0005466012 nova_compute[192063]: 2025-10-02 12:13:29.214 2 DEBUG nova.network.neutron [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Successfully updated port: 044042da-3912-41dc-be5e-e0a565776c37 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:13:29 np0005466012 nova_compute[192063]: 2025-10-02 12:13:29.242 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Acquiring lock "refresh_cache-8a3a4249-c00f-498a-8a14-6b0862928cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:13:29 np0005466012 nova_compute[192063]: 2025-10-02 12:13:29.243 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Acquired lock "refresh_cache-8a3a4249-c00f-498a-8a14-6b0862928cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:13:29 np0005466012 nova_compute[192063]: 2025-10-02 12:13:29.243 2 DEBUG nova.network.neutron [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:13:29 np0005466012 nova_compute[192063]: 2025-10-02 12:13:29.571 2 DEBUG nova.network.neutron [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:13:30 np0005466012 nova_compute[192063]: 2025-10-02 12:13:30.298 2 DEBUG nova.compute.manager [req-adc20ad3-9f7c-413b-827e-859956c4031c req-e5f143f8-59d0-4f9e-bd28-3e1aa6897fa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Received event network-changed-044042da-3912-41dc-be5e-e0a565776c37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:30 np0005466012 nova_compute[192063]: 2025-10-02 12:13:30.298 2 DEBUG nova.compute.manager [req-adc20ad3-9f7c-413b-827e-859956c4031c req-e5f143f8-59d0-4f9e-bd28-3e1aa6897fa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Refreshing instance network info cache due to event network-changed-044042da-3912-41dc-be5e-e0a565776c37. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:13:30 np0005466012 nova_compute[192063]: 2025-10-02 12:13:30.299 2 DEBUG oslo_concurrency.lockutils [req-adc20ad3-9f7c-413b-827e-859956c4031c req-e5f143f8-59d0-4f9e-bd28-3e1aa6897fa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-8a3a4249-c00f-498a-8a14-6b0862928cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:13:30 np0005466012 nova_compute[192063]: 2025-10-02 12:13:30.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:30 np0005466012 nova_compute[192063]: 2025-10-02 12:13:30.973 2 DEBUG nova.network.neutron [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Updating instance_info_cache with network_info: [{"id": "044042da-3912-41dc-be5e-e0a565776c37", "address": "fa:16:3e:0e:f6:e9", "network": {"id": "789483e6-d1ce-47d0-8925-2a0548ba4e19", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-123585089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2244adeca2c4be2ae64f12af556e7de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap044042da-39", "ovs_interfaceid": "044042da-3912-41dc-be5e-e0a565776c37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:30 np0005466012 nova_compute[192063]: 2025-10-02 12:13:30.996 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Releasing lock "refresh_cache-8a3a4249-c00f-498a-8a14-6b0862928cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:13:30 np0005466012 nova_compute[192063]: 2025-10-02 12:13:30.997 2 DEBUG nova.compute.manager [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Instance network_info: |[{"id": "044042da-3912-41dc-be5e-e0a565776c37", "address": "fa:16:3e:0e:f6:e9", "network": {"id": "789483e6-d1ce-47d0-8925-2a0548ba4e19", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-123585089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2244adeca2c4be2ae64f12af556e7de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap044042da-39", "ovs_interfaceid": "044042da-3912-41dc-be5e-e0a565776c37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:13:30 np0005466012 nova_compute[192063]: 2025-10-02 12:13:30.997 2 DEBUG oslo_concurrency.lockutils [req-adc20ad3-9f7c-413b-827e-859956c4031c req-e5f143f8-59d0-4f9e-bd28-3e1aa6897fa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-8a3a4249-c00f-498a-8a14-6b0862928cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:13:30 np0005466012 nova_compute[192063]: 2025-10-02 12:13:30.997 2 DEBUG nova.network.neutron [req-adc20ad3-9f7c-413b-827e-859956c4031c req-e5f143f8-59d0-4f9e-bd28-3e1aa6897fa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Refreshing network info cache for port 044042da-3912-41dc-be5e-e0a565776c37 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.000 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Start _get_guest_xml network_info=[{"id": "044042da-3912-41dc-be5e-e0a565776c37", "address": "fa:16:3e:0e:f6:e9", "network": {"id": "789483e6-d1ce-47d0-8925-2a0548ba4e19", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-123585089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2244adeca2c4be2ae64f12af556e7de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap044042da-39", "ovs_interfaceid": "044042da-3912-41dc-be5e-e0a565776c37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.004 2 WARNING nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.008 2 DEBUG nova.virt.libvirt.host [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.009 2 DEBUG nova.virt.libvirt.host [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.011 2 DEBUG nova.virt.libvirt.host [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.011 2 DEBUG nova.virt.libvirt.host [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.012 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.013 2 DEBUG nova.virt.hardware [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.013 2 DEBUG nova.virt.hardware [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.013 2 DEBUG nova.virt.hardware [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.014 2 DEBUG nova.virt.hardware [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.014 2 DEBUG nova.virt.hardware [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.014 2 DEBUG nova.virt.hardware [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.014 2 DEBUG nova.virt.hardware [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.014 2 DEBUG nova.virt.hardware [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.015 2 DEBUG nova.virt.hardware [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.015 2 DEBUG nova.virt.hardware [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.015 2 DEBUG nova.virt.hardware [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.018 2 DEBUG nova.virt.libvirt.vif [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:13:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-915184330',display_name='tempest-ServerAddressesNegativeTestJSON-server-915184330',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-915184330',id=67,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2244adeca2c4be2ae64f12af556e7de',ramdisk_id='',reservation_id='r-gfwimcjf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-537768099',owner_user_na
me='tempest-ServerAddressesNegativeTestJSON-537768099-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:13:26Z,user_data=None,user_id='36d9cf55d7294b22a8f5e859077fb9f5',uuid=8a3a4249-c00f-498a-8a14-6b0862928cb5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "044042da-3912-41dc-be5e-e0a565776c37", "address": "fa:16:3e:0e:f6:e9", "network": {"id": "789483e6-d1ce-47d0-8925-2a0548ba4e19", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-123585089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2244adeca2c4be2ae64f12af556e7de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap044042da-39", "ovs_interfaceid": "044042da-3912-41dc-be5e-e0a565776c37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.018 2 DEBUG nova.network.os_vif_util [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Converting VIF {"id": "044042da-3912-41dc-be5e-e0a565776c37", "address": "fa:16:3e:0e:f6:e9", "network": {"id": "789483e6-d1ce-47d0-8925-2a0548ba4e19", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-123585089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2244adeca2c4be2ae64f12af556e7de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap044042da-39", "ovs_interfaceid": "044042da-3912-41dc-be5e-e0a565776c37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.019 2 DEBUG nova.network.os_vif_util [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:f6:e9,bridge_name='br-int',has_traffic_filtering=True,id=044042da-3912-41dc-be5e-e0a565776c37,network=Network(789483e6-d1ce-47d0-8925-2a0548ba4e19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap044042da-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.020 2 DEBUG nova.objects.instance [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lazy-loading 'pci_devices' on Instance uuid 8a3a4249-c00f-498a-8a14-6b0862928cb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.038 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  <uuid>8a3a4249-c00f-498a-8a14-6b0862928cb5</uuid>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  <name>instance-00000043</name>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerAddressesNegativeTestJSON-server-915184330</nova:name>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:13:31</nova:creationTime>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:13:31 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:        <nova:user uuid="36d9cf55d7294b22a8f5e859077fb9f5">tempest-ServerAddressesNegativeTestJSON-537768099-project-member</nova:user>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:        <nova:project uuid="c2244adeca2c4be2ae64f12af556e7de">tempest-ServerAddressesNegativeTestJSON-537768099</nova:project>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:        <nova:port uuid="044042da-3912-41dc-be5e-e0a565776c37">
Oct  2 08:13:31 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <entry name="serial">8a3a4249-c00f-498a-8a14-6b0862928cb5</entry>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <entry name="uuid">8a3a4249-c00f-498a-8a14-6b0862928cb5</entry>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk.config"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:0e:f6:e9"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <target dev="tap044042da-39"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/console.log" append="off"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:13:31 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:13:31 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:13:31 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:13:31 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.040 2 DEBUG nova.compute.manager [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Preparing to wait for external event network-vif-plugged-044042da-3912-41dc-be5e-e0a565776c37 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.040 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Acquiring lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.040 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.040 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.041 2 DEBUG nova.virt.libvirt.vif [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:13:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-915184330',display_name='tempest-ServerAddressesNegativeTestJSON-server-915184330',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-915184330',id=67,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2244adeca2c4be2ae64f12af556e7de',ramdisk_id='',reservation_id='r-gfwimcjf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-537768099',owner_user_name='tempest-ServerAddressesNegativeTestJSON-537768099-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:13:26Z,user_data=None,user_id='36d9cf55d7294b22a8f5e859077fb9f5',uuid=8a3a4249-c00f-498a-8a14-6b0862928cb5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "044042da-3912-41dc-be5e-e0a565776c37", "address": "fa:16:3e:0e:f6:e9", "network": {"id": "789483e6-d1ce-47d0-8925-2a0548ba4e19", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-123585089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2244adeca2c4be2ae64f12af556e7de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap044042da-39", "ovs_interfaceid": "044042da-3912-41dc-be5e-e0a565776c37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.041 2 DEBUG nova.network.os_vif_util [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Converting VIF {"id": "044042da-3912-41dc-be5e-e0a565776c37", "address": "fa:16:3e:0e:f6:e9", "network": {"id": "789483e6-d1ce-47d0-8925-2a0548ba4e19", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-123585089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2244adeca2c4be2ae64f12af556e7de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap044042da-39", "ovs_interfaceid": "044042da-3912-41dc-be5e-e0a565776c37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.042 2 DEBUG nova.network.os_vif_util [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:f6:e9,bridge_name='br-int',has_traffic_filtering=True,id=044042da-3912-41dc-be5e-e0a565776c37,network=Network(789483e6-d1ce-47d0-8925-2a0548ba4e19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap044042da-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.042 2 DEBUG os_vif [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:f6:e9,bridge_name='br-int',has_traffic_filtering=True,id=044042da-3912-41dc-be5e-e0a565776c37,network=Network(789483e6-d1ce-47d0-8925-2a0548ba4e19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap044042da-39') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.043 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.044 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.047 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap044042da-39, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.047 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap044042da-39, col_values=(('external_ids', {'iface-id': '044042da-3912-41dc-be5e-e0a565776c37', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0e:f6:e9', 'vm-uuid': '8a3a4249-c00f-498a-8a14-6b0862928cb5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:31 np0005466012 NetworkManager[51207]: <info>  [1759407211.0497] manager: (tap044042da-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.058 2 INFO os_vif [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:f6:e9,bridge_name='br-int',has_traffic_filtering=True,id=044042da-3912-41dc-be5e-e0a565776c37,network=Network(789483e6-d1ce-47d0-8925-2a0548ba4e19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap044042da-39')#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.608 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.608 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.609 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] No VIF found with MAC fa:16:3e:0e:f6:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:13:31 np0005466012 nova_compute[192063]: 2025-10-02 12:13:31.609 2 INFO nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Using config drive#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.141 2 INFO nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Creating config drive at /var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk.config#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.146 2 DEBUG oslo_concurrency.processutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvwmngmrn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.269 2 DEBUG oslo_concurrency.processutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvwmngmrn" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:32 np0005466012 kernel: tap044042da-39: entered promiscuous mode
Oct  2 08:13:32 np0005466012 NetworkManager[51207]: <info>  [1759407212.3301] manager: (tap044042da-39): new Tun device (/org/freedesktop/NetworkManager/Devices/108)
Oct  2 08:13:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:32Z|00241|binding|INFO|Claiming lport 044042da-3912-41dc-be5e-e0a565776c37 for this chassis.
Oct  2 08:13:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:32Z|00242|binding|INFO|044042da-3912-41dc-be5e-e0a565776c37: Claiming fa:16:3e:0e:f6:e9 10.100.0.7
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.342 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:f6:e9 10.100.0.7'], port_security=['fa:16:3e:0e:f6:e9 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '8a3a4249-c00f-498a-8a14-6b0862928cb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-789483e6-d1ce-47d0-8925-2a0548ba4e19', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2244adeca2c4be2ae64f12af556e7de', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c412c4fe-e1a1-4707-8999-3c91b626245a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54cef3a7-4c2e-476b-a75e-58976f479894, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=044042da-3912-41dc-be5e-e0a565776c37) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.343 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 044042da-3912-41dc-be5e-e0a565776c37 in datapath 789483e6-d1ce-47d0-8925-2a0548ba4e19 bound to our chassis#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.345 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 789483e6-d1ce-47d0-8925-2a0548ba4e19#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.356 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[52dd3303-b77c-40a2-8a10-12089c011fd7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.356 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap789483e6-d1 in ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:13:32 np0005466012 systemd-udevd[228798]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.359 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap789483e6-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.359 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f2003036-d2b2-4282-8a13-dbf6d7861718]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.361 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3d0f76-4bc8-413a-98a3-068a76399dbc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 NetworkManager[51207]: <info>  [1759407212.3705] device (tap044042da-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:13:32 np0005466012 NetworkManager[51207]: <info>  [1759407212.3714] device (tap044042da-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.376 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[f8d411c4-c27d-4543-bad3-70c2bbdabcdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 systemd-machined[152114]: New machine qemu-28-instance-00000043.
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.404 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0df700ff-d828-4c2a-b680-a6f3c3eaaa02]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:32Z|00243|binding|INFO|Setting lport 044042da-3912-41dc-be5e-e0a565776c37 ovn-installed in OVS
Oct  2 08:13:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:32Z|00244|binding|INFO|Setting lport 044042da-3912-41dc-be5e-e0a565776c37 up in Southbound
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:32 np0005466012 systemd[1]: Started Virtual Machine qemu-28-instance-00000043.
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.436 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[97ef7cc6-044e-41da-bf94-0624885ec0b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.442 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2fc5c1c1-3269-4c2a-98b7-23fe57644ad3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 NetworkManager[51207]: <info>  [1759407212.4432] manager: (tap789483e6-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/109)
Oct  2 08:13:32 np0005466012 systemd-udevd[228803]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.482 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[d094244f-213f-48cb-bea9-b9b5d252564f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.485 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[1e05aa17-a770-4736-aa97-8a6c97bb3649]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 NetworkManager[51207]: <info>  [1759407212.5090] device (tap789483e6-d0): carrier: link connected
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.514 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[36778193-75e1-43bf-8fab-865a05b048e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.530 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cdbef5da-c583-49b7-82b9-3924178350b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap789483e6-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:d9:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520612, 'reachable_time': 37399, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228833, 'error': None, 'target': 'ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.550 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[414d8830-161d-4e02-956b-5226fb496a37]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe10:d939'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 520612, 'tstamp': 520612}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228834, 'error': None, 'target': 'ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.568 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cf0bf90d-c00e-449d-934e-d66adb851046]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap789483e6-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:d9:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520612, 'reachable_time': 37399, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228835, 'error': None, 'target': 'ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.605 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[689f8072-1d78-4e54-bbf3-5cf3e29656d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.661 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[85e9f3b2-e47b-4ba7-99a3-b51d22fca1b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.662 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap789483e6-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.663 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.663 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap789483e6-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:32 np0005466012 NetworkManager[51207]: <info>  [1759407212.6666] manager: (tap789483e6-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Oct  2 08:13:32 np0005466012 kernel: tap789483e6-d0: entered promiscuous mode
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.671 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap789483e6-d0, col_values=(('external_ids', {'iface-id': 'a22736db-0ea8-4ec6-a4a9-5d2497c250e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:32Z|00245|binding|INFO|Releasing lport a22736db-0ea8-4ec6-a4a9-5d2497c250e8 from this chassis (sb_readonly=0)
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.690 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/789483e6-d1ce-47d0-8925-2a0548ba4e19.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/789483e6-d1ce-47d0-8925-2a0548ba4e19.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.691 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9ceb6749-038d-4353-9ad0-cc4780a69b1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.691 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-789483e6-d1ce-47d0-8925-2a0548ba4e19
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/789483e6-d1ce-47d0-8925-2a0548ba4e19.pid.haproxy
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 789483e6-d1ce-47d0-8925-2a0548ba4e19
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:13:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:32.692 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19', 'env', 'PROCESS_TAG=haproxy-789483e6-d1ce-47d0-8925-2a0548ba4e19', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/789483e6-d1ce-47d0-8925-2a0548ba4e19.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.716 2 DEBUG nova.network.neutron [req-adc20ad3-9f7c-413b-827e-859956c4031c req-e5f143f8-59d0-4f9e-bd28-3e1aa6897fa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Updated VIF entry in instance network info cache for port 044042da-3912-41dc-be5e-e0a565776c37. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.717 2 DEBUG nova.network.neutron [req-adc20ad3-9f7c-413b-827e-859956c4031c req-e5f143f8-59d0-4f9e-bd28-3e1aa6897fa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Updating instance_info_cache with network_info: [{"id": "044042da-3912-41dc-be5e-e0a565776c37", "address": "fa:16:3e:0e:f6:e9", "network": {"id": "789483e6-d1ce-47d0-8925-2a0548ba4e19", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-123585089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2244adeca2c4be2ae64f12af556e7de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap044042da-39", "ovs_interfaceid": "044042da-3912-41dc-be5e-e0a565776c37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.829 2 DEBUG nova.compute.manager [req-20769041-33eb-4632-a1e1-62185a77473d req-26f9d005-3c19-480a-ab36-2b46da890d17 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Received event network-vif-plugged-044042da-3912-41dc-be5e-e0a565776c37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.830 2 DEBUG oslo_concurrency.lockutils [req-20769041-33eb-4632-a1e1-62185a77473d req-26f9d005-3c19-480a-ab36-2b46da890d17 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.830 2 DEBUG oslo_concurrency.lockutils [req-20769041-33eb-4632-a1e1-62185a77473d req-26f9d005-3c19-480a-ab36-2b46da890d17 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.831 2 DEBUG oslo_concurrency.lockutils [req-20769041-33eb-4632-a1e1-62185a77473d req-26f9d005-3c19-480a-ab36-2b46da890d17 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.831 2 DEBUG nova.compute.manager [req-20769041-33eb-4632-a1e1-62185a77473d req-26f9d005-3c19-480a-ab36-2b46da890d17 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Processing event network-vif-plugged-044042da-3912-41dc-be5e-e0a565776c37 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:13:32 np0005466012 nova_compute[192063]: 2025-10-02 12:13:32.850 2 DEBUG oslo_concurrency.lockutils [req-adc20ad3-9f7c-413b-827e-859956c4031c req-e5f143f8-59d0-4f9e-bd28-3e1aa6897fa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-8a3a4249-c00f-498a-8a14-6b0862928cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:13:33 np0005466012 podman[228873]: 2025-10-02 12:13:33.042935302 +0000 UTC m=+0.024306168 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.320 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407213.3199952, 8a3a4249-c00f-498a-8a14-6b0862928cb5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.320 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] VM Started (Lifecycle Event)#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.322 2 DEBUG nova.compute.manager [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.325 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.328 2 INFO nova.virt.libvirt.driver [-] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Instance spawned successfully.#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.328 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:13:33 np0005466012 podman[228873]: 2025-10-02 12:13:33.342284347 +0000 UTC m=+0.323655213 container create b064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.368 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.371 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.434 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.434 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407213.3221507, 8a3a4249-c00f-498a-8a14-6b0862928cb5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.434 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.439 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.439 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.440 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.440 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.441 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.441 2 DEBUG nova.virt.libvirt.driver [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:13:33 np0005466012 systemd[1]: Started libpod-conmon-b064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce.scope.
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.488 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.491 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407213.3247173, 8a3a4249-c00f-498a-8a14-6b0862928cb5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.492 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:13:33 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:13:33 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11b7c3e0a0b74ba90bc7e4323caa3fdea1e73bee2576f3633fa26b155fa93769/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.630 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.633 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.690 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.832 2 INFO nova.compute.manager [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Took 7.59 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:13:33 np0005466012 nova_compute[192063]: 2025-10-02 12:13:33.833 2 DEBUG nova.compute.manager [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:33 np0005466012 podman[228873]: 2025-10-02 12:13:33.839568501 +0000 UTC m=+0.820939377 container init b064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:13:33 np0005466012 podman[228873]: 2025-10-02 12:13:33.84764771 +0000 UTC m=+0.829018556 container start b064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:13:33 np0005466012 neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19[228888]: [NOTICE]   (228892) : New worker (228894) forked
Oct  2 08:13:33 np0005466012 neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19[228888]: [NOTICE]   (228892) : Loading success.
Oct  2 08:13:34 np0005466012 nova_compute[192063]: 2025-10-02 12:13:34.163 2 INFO nova.compute.manager [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Took 8.52 seconds to build instance.#033[00m
Oct  2 08:13:34 np0005466012 nova_compute[192063]: 2025-10-02 12:13:34.267 2 DEBUG oslo_concurrency.lockutils [None req-44276076-a2e5-4ec7-9df5-92e253061c40 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:34 np0005466012 nova_compute[192063]: 2025-10-02 12:13:34.980 2 DEBUG nova.compute.manager [req-83b0ff85-da0a-4d12-b38f-eb552852697b req-04fe470f-66a2-4e4c-8b95-c162cb48688c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Received event network-vif-plugged-044042da-3912-41dc-be5e-e0a565776c37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:34 np0005466012 nova_compute[192063]: 2025-10-02 12:13:34.981 2 DEBUG oslo_concurrency.lockutils [req-83b0ff85-da0a-4d12-b38f-eb552852697b req-04fe470f-66a2-4e4c-8b95-c162cb48688c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:34 np0005466012 nova_compute[192063]: 2025-10-02 12:13:34.982 2 DEBUG oslo_concurrency.lockutils [req-83b0ff85-da0a-4d12-b38f-eb552852697b req-04fe470f-66a2-4e4c-8b95-c162cb48688c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:34 np0005466012 nova_compute[192063]: 2025-10-02 12:13:34.982 2 DEBUG oslo_concurrency.lockutils [req-83b0ff85-da0a-4d12-b38f-eb552852697b req-04fe470f-66a2-4e4c-8b95-c162cb48688c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:34 np0005466012 nova_compute[192063]: 2025-10-02 12:13:34.982 2 DEBUG nova.compute.manager [req-83b0ff85-da0a-4d12-b38f-eb552852697b req-04fe470f-66a2-4e4c-8b95-c162cb48688c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] No waiting events found dispatching network-vif-plugged-044042da-3912-41dc-be5e-e0a565776c37 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:34 np0005466012 nova_compute[192063]: 2025-10-02 12:13:34.983 2 WARNING nova.compute.manager [req-83b0ff85-da0a-4d12-b38f-eb552852697b req-04fe470f-66a2-4e4c-8b95-c162cb48688c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Received unexpected event network-vif-plugged-044042da-3912-41dc-be5e-e0a565776c37 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:13:35 np0005466012 nova_compute[192063]: 2025-10-02 12:13:35.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:36 np0005466012 nova_compute[192063]: 2025-10-02 12:13:36.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:36 np0005466012 nova_compute[192063]: 2025-10-02 12:13:36.685 2 DEBUG oslo_concurrency.lockutils [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Acquiring lock "8a3a4249-c00f-498a-8a14-6b0862928cb5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:36 np0005466012 nova_compute[192063]: 2025-10-02 12:13:36.686 2 DEBUG oslo_concurrency.lockutils [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:36 np0005466012 nova_compute[192063]: 2025-10-02 12:13:36.687 2 DEBUG oslo_concurrency.lockutils [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Acquiring lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:36 np0005466012 nova_compute[192063]: 2025-10-02 12:13:36.688 2 DEBUG oslo_concurrency.lockutils [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:36 np0005466012 nova_compute[192063]: 2025-10-02 12:13:36.688 2 DEBUG oslo_concurrency.lockutils [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:36 np0005466012 nova_compute[192063]: 2025-10-02 12:13:36.733 2 INFO nova.compute.manager [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Terminating instance#033[00m
Oct  2 08:13:36 np0005466012 nova_compute[192063]: 2025-10-02 12:13:36.766 2 DEBUG nova.compute.manager [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:13:36 np0005466012 kernel: tap044042da-39 (unregistering): left promiscuous mode
Oct  2 08:13:36 np0005466012 NetworkManager[51207]: <info>  [1759407216.8424] device (tap044042da-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:13:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:36Z|00246|binding|INFO|Releasing lport 044042da-3912-41dc-be5e-e0a565776c37 from this chassis (sb_readonly=0)
Oct  2 08:13:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:36Z|00247|binding|INFO|Setting lport 044042da-3912-41dc-be5e-e0a565776c37 down in Southbound
Oct  2 08:13:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:36Z|00248|binding|INFO|Removing iface tap044042da-39 ovn-installed in OVS
Oct  2 08:13:36 np0005466012 nova_compute[192063]: 2025-10-02 12:13:36.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:36 np0005466012 nova_compute[192063]: 2025-10-02 12:13:36.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:36.867 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:f6:e9 10.100.0.7'], port_security=['fa:16:3e:0e:f6:e9 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '8a3a4249-c00f-498a-8a14-6b0862928cb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-789483e6-d1ce-47d0-8925-2a0548ba4e19', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2244adeca2c4be2ae64f12af556e7de', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c412c4fe-e1a1-4707-8999-3c91b626245a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54cef3a7-4c2e-476b-a75e-58976f479894, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=044042da-3912-41dc-be5e-e0a565776c37) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:36.868 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 044042da-3912-41dc-be5e-e0a565776c37 in datapath 789483e6-d1ce-47d0-8925-2a0548ba4e19 unbound from our chassis#033[00m
Oct  2 08:13:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:36.869 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 789483e6-d1ce-47d0-8925-2a0548ba4e19, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:13:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:36.870 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[91e285f3-d5ad-4381-b30c-3fd900f91182]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:36.871 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19 namespace which is not needed anymore#033[00m
Oct  2 08:13:36 np0005466012 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000043.scope: Deactivated successfully.
Oct  2 08:13:36 np0005466012 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000043.scope: Consumed 4.258s CPU time.
Oct  2 08:13:36 np0005466012 systemd-machined[152114]: Machine qemu-28-instance-00000043 terminated.
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.027 2 INFO nova.virt.libvirt.driver [-] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Instance destroyed successfully.#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.028 2 DEBUG nova.objects.instance [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lazy-loading 'resources' on Instance uuid 8a3a4249-c00f-498a-8a14-6b0862928cb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.060 2 DEBUG nova.virt.libvirt.vif [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:13:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-915184330',display_name='tempest-ServerAddressesNegativeTestJSON-server-915184330',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-915184330',id=67,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:13:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2244adeca2c4be2ae64f12af556e7de',ramdisk_id='',reservation_id='r-gfwimcjf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesNegativeTestJSON-537768099',owner_user_name='tempest-ServerAddressesNegativeTestJSON-537768099-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:13:33Z,user_data=None,user_id='36d9cf55d7294b22a8f5e859077fb9f5',uuid=8a3a4249-c00f-498a-8a14-6b0862928cb5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "044042da-3912-41dc-be5e-e0a565776c37", "address": "fa:16:3e:0e:f6:e9", "network": {"id": "789483e6-d1ce-47d0-8925-2a0548ba4e19", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-123585089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2244adeca2c4be2ae64f12af556e7de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap044042da-39", "ovs_interfaceid": "044042da-3912-41dc-be5e-e0a565776c37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.060 2 DEBUG nova.network.os_vif_util [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Converting VIF {"id": "044042da-3912-41dc-be5e-e0a565776c37", "address": "fa:16:3e:0e:f6:e9", "network": {"id": "789483e6-d1ce-47d0-8925-2a0548ba4e19", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-123585089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2244adeca2c4be2ae64f12af556e7de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap044042da-39", "ovs_interfaceid": "044042da-3912-41dc-be5e-e0a565776c37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.061 2 DEBUG nova.network.os_vif_util [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:f6:e9,bridge_name='br-int',has_traffic_filtering=True,id=044042da-3912-41dc-be5e-e0a565776c37,network=Network(789483e6-d1ce-47d0-8925-2a0548ba4e19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap044042da-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.061 2 DEBUG os_vif [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:f6:e9,bridge_name='br-int',has_traffic_filtering=True,id=044042da-3912-41dc-be5e-e0a565776c37,network=Network(789483e6-d1ce-47d0-8925-2a0548ba4e19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap044042da-39') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.063 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap044042da-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.068 2 INFO os_vif [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:f6:e9,bridge_name='br-int',has_traffic_filtering=True,id=044042da-3912-41dc-be5e-e0a565776c37,network=Network(789483e6-d1ce-47d0-8925-2a0548ba4e19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap044042da-39')#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.069 2 INFO nova.virt.libvirt.driver [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Deleting instance files /var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5_del#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.069 2 INFO nova.virt.libvirt.driver [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Deletion of /var/lib/nova/instances/8a3a4249-c00f-498a-8a14-6b0862928cb5_del complete#033[00m
Oct  2 08:13:37 np0005466012 neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19[228888]: [NOTICE]   (228892) : haproxy version is 2.8.14-c23fe91
Oct  2 08:13:37 np0005466012 neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19[228888]: [NOTICE]   (228892) : path to executable is /usr/sbin/haproxy
Oct  2 08:13:37 np0005466012 neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19[228888]: [WARNING]  (228892) : Exiting Master process...
Oct  2 08:13:37 np0005466012 neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19[228888]: [WARNING]  (228892) : Exiting Master process...
Oct  2 08:13:37 np0005466012 neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19[228888]: [ALERT]    (228892) : Current worker (228894) exited with code 143 (Terminated)
Oct  2 08:13:37 np0005466012 neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19[228888]: [WARNING]  (228892) : All workers exited. Exiting... (0)
Oct  2 08:13:37 np0005466012 systemd[1]: libpod-b064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce.scope: Deactivated successfully.
Oct  2 08:13:37 np0005466012 podman[228927]: 2025-10-02 12:13:37.160195384 +0000 UTC m=+0.207556168 container died b064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.237 2 INFO nova.compute.manager [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.238 2 DEBUG oslo.service.loopingcall [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.239 2 DEBUG nova.compute.manager [-] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:13:37 np0005466012 nova_compute[192063]: 2025-10-02 12:13:37.239 2 DEBUG nova.network.neutron [-] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:13:37 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce-userdata-shm.mount: Deactivated successfully.
Oct  2 08:13:37 np0005466012 systemd[1]: var-lib-containers-storage-overlay-11b7c3e0a0b74ba90bc7e4323caa3fdea1e73bee2576f3633fa26b155fa93769-merged.mount: Deactivated successfully.
Oct  2 08:13:37 np0005466012 podman[228927]: 2025-10-02 12:13:37.878748745 +0000 UTC m=+0.926109499 container cleanup b064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:13:37 np0005466012 systemd[1]: libpod-conmon-b064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce.scope: Deactivated successfully.
Oct  2 08:13:38 np0005466012 nova_compute[192063]: 2025-10-02 12:13:38.460 2 DEBUG nova.network.neutron [-] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:38 np0005466012 podman[228974]: 2025-10-02 12:13:38.602050103 +0000 UTC m=+0.704038598 container remove b064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:13:38 np0005466012 nova_compute[192063]: 2025-10-02 12:13:38.605 2 INFO nova.compute.manager [-] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Took 1.37 seconds to deallocate network for instance.#033[00m
Oct  2 08:13:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:38.609 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6f8bcd05-c077-4db0-be4e-05b30fc32405]: (4, ('Thu Oct  2 12:13:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19 (b064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce)\nb064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce\nThu Oct  2 12:13:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19 (b064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce)\nb064e22e43ac060f9b1e160c7ea87178832eebb6018de9aad2261369ee8036ce\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:38.612 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d458c6-51cb-4823-a044-23b9400be6d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:38.613 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap789483e6-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:38 np0005466012 kernel: tap789483e6-d0: left promiscuous mode
Oct  2 08:13:38 np0005466012 nova_compute[192063]: 2025-10-02 12:13:38.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466012 nova_compute[192063]: 2025-10-02 12:13:38.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:38.636 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0ef604-71d8-495e-8a69-c0b298fe5bc6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:38.665 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0f5fdc3b-32a9-499f-bbe1-5c44eded6192]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:38.666 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[57d4ce71-f30b-46c0-a5a2-e578235a2fa1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:38.687 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8c34d0be-b662-4431-9a38-ea9d8638e7cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520604, 'reachable_time': 27109, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229002, 'error': None, 'target': 'ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:38.690 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-789483e6-d1ce-47d0-8925-2a0548ba4e19 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:13:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:38.690 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[34ced103-a944-45ca-8c35-c2fa233edffc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:38 np0005466012 systemd[1]: run-netns-ovnmeta\x2d789483e6\x2dd1ce\x2d47d0\x2d8925\x2d2a0548ba4e19.mount: Deactivated successfully.
Oct  2 08:13:38 np0005466012 podman[228988]: 2025-10-02 12:13:38.727888343 +0000 UTC m=+0.063921808 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:13:38 np0005466012 nova_compute[192063]: 2025-10-02 12:13:38.747 2 DEBUG oslo_concurrency.lockutils [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:38 np0005466012 podman[228990]: 2025-10-02 12:13:38.748435197 +0000 UTC m=+0.085425938 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:13:38 np0005466012 nova_compute[192063]: 2025-10-02 12:13:38.748 2 DEBUG oslo_concurrency.lockutils [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:38 np0005466012 nova_compute[192063]: 2025-10-02 12:13:38.866 2 DEBUG nova.compute.manager [req-817d90ef-4be8-4a62-bd48-0d34911c2a70 req-fc266da5-ac05-4d49-86ca-7b341ac406d4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Received event network-vif-deleted-044042da-3912-41dc-be5e-e0a565776c37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:38 np0005466012 nova_compute[192063]: 2025-10-02 12:13:38.872 2 DEBUG nova.compute.provider_tree [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:13:38 np0005466012 nova_compute[192063]: 2025-10-02 12:13:38.894 2 DEBUG nova.scheduler.client.report [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:13:38 np0005466012 nova_compute[192063]: 2025-10-02 12:13:38.930 2 DEBUG oslo_concurrency.lockutils [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:38 np0005466012 nova_compute[192063]: 2025-10-02 12:13:38.982 2 INFO nova.scheduler.client.report [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Deleted allocations for instance 8a3a4249-c00f-498a-8a14-6b0862928cb5#033[00m
Oct  2 08:13:39 np0005466012 nova_compute[192063]: 2025-10-02 12:13:39.094 2 DEBUG oslo_concurrency.lockutils [None req-c7a93e3c-b272-40d3-80c3-f8a9c68710c0 36d9cf55d7294b22a8f5e859077fb9f5 c2244adeca2c4be2ae64f12af556e7de - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:39 np0005466012 nova_compute[192063]: 2025-10-02 12:13:39.303 2 DEBUG nova.compute.manager [req-d271a02f-86c4-4590-be3b-82d795c6899b req-ffdd918f-73fc-40a8-8058-ed3288e41c6e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Received event network-vif-unplugged-044042da-3912-41dc-be5e-e0a565776c37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:39 np0005466012 nova_compute[192063]: 2025-10-02 12:13:39.303 2 DEBUG oslo_concurrency.lockutils [req-d271a02f-86c4-4590-be3b-82d795c6899b req-ffdd918f-73fc-40a8-8058-ed3288e41c6e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:39 np0005466012 nova_compute[192063]: 2025-10-02 12:13:39.303 2 DEBUG oslo_concurrency.lockutils [req-d271a02f-86c4-4590-be3b-82d795c6899b req-ffdd918f-73fc-40a8-8058-ed3288e41c6e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:39 np0005466012 nova_compute[192063]: 2025-10-02 12:13:39.304 2 DEBUG oslo_concurrency.lockutils [req-d271a02f-86c4-4590-be3b-82d795c6899b req-ffdd918f-73fc-40a8-8058-ed3288e41c6e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:39 np0005466012 nova_compute[192063]: 2025-10-02 12:13:39.304 2 DEBUG nova.compute.manager [req-d271a02f-86c4-4590-be3b-82d795c6899b req-ffdd918f-73fc-40a8-8058-ed3288e41c6e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] No waiting events found dispatching network-vif-unplugged-044042da-3912-41dc-be5e-e0a565776c37 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:39 np0005466012 nova_compute[192063]: 2025-10-02 12:13:39.304 2 WARNING nova.compute.manager [req-d271a02f-86c4-4590-be3b-82d795c6899b req-ffdd918f-73fc-40a8-8058-ed3288e41c6e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Received unexpected event network-vif-unplugged-044042da-3912-41dc-be5e-e0a565776c37 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:13:39 np0005466012 nova_compute[192063]: 2025-10-02 12:13:39.304 2 DEBUG nova.compute.manager [req-d271a02f-86c4-4590-be3b-82d795c6899b req-ffdd918f-73fc-40a8-8058-ed3288e41c6e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Received event network-vif-plugged-044042da-3912-41dc-be5e-e0a565776c37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:39 np0005466012 nova_compute[192063]: 2025-10-02 12:13:39.304 2 DEBUG oslo_concurrency.lockutils [req-d271a02f-86c4-4590-be3b-82d795c6899b req-ffdd918f-73fc-40a8-8058-ed3288e41c6e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:39 np0005466012 nova_compute[192063]: 2025-10-02 12:13:39.305 2 DEBUG oslo_concurrency.lockutils [req-d271a02f-86c4-4590-be3b-82d795c6899b req-ffdd918f-73fc-40a8-8058-ed3288e41c6e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:39 np0005466012 nova_compute[192063]: 2025-10-02 12:13:39.305 2 DEBUG oslo_concurrency.lockutils [req-d271a02f-86c4-4590-be3b-82d795c6899b req-ffdd918f-73fc-40a8-8058-ed3288e41c6e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8a3a4249-c00f-498a-8a14-6b0862928cb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:39 np0005466012 nova_compute[192063]: 2025-10-02 12:13:39.305 2 DEBUG nova.compute.manager [req-d271a02f-86c4-4590-be3b-82d795c6899b req-ffdd918f-73fc-40a8-8058-ed3288e41c6e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] No waiting events found dispatching network-vif-plugged-044042da-3912-41dc-be5e-e0a565776c37 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:39 np0005466012 nova_compute[192063]: 2025-10-02 12:13:39.305 2 WARNING nova.compute.manager [req-d271a02f-86c4-4590-be3b-82d795c6899b req-ffdd918f-73fc-40a8-8058-ed3288e41c6e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Received unexpected event network-vif-plugged-044042da-3912-41dc-be5e-e0a565776c37 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:13:40 np0005466012 nova_compute[192063]: 2025-10-02 12:13:40.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:42 np0005466012 nova_compute[192063]: 2025-10-02 12:13:42.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:42 np0005466012 podman[229041]: 2025-10-02 12:13:42.148866046 +0000 UTC m=+0.060692631 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:13:43 np0005466012 nova_compute[192063]: 2025-10-02 12:13:43.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.584 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "fd1711af-5e98-4ad5-a746-8aab0f033256" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.584 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.605 2 DEBUG nova.compute.manager [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.758 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.758 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.763 2 DEBUG nova.virt.hardware [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.764 2 INFO nova.compute.claims [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.876 2 DEBUG nova.compute.provider_tree [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.901 2 DEBUG nova.scheduler.client.report [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.923 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.924 2 DEBUG nova.compute.manager [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.966 2 DEBUG nova.compute.manager [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.967 2 DEBUG nova.network.neutron [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.982 2 INFO nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:13:44 np0005466012 nova_compute[192063]: 2025-10-02 12:13:44.997 2 DEBUG nova.compute.manager [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:13:45 np0005466012 nova_compute[192063]: 2025-10-02 12:13:45.136 2 DEBUG nova.compute.manager [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:13:45 np0005466012 nova_compute[192063]: 2025-10-02 12:13:45.137 2 DEBUG nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:13:45 np0005466012 nova_compute[192063]: 2025-10-02 12:13:45.138 2 INFO nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Creating image(s)#033[00m
Oct  2 08:13:45 np0005466012 nova_compute[192063]: 2025-10-02 12:13:45.138 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "/var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:45 np0005466012 nova_compute[192063]: 2025-10-02 12:13:45.139 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "/var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:45 np0005466012 nova_compute[192063]: 2025-10-02 12:13:45.140 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "/var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:45 np0005466012 nova_compute[192063]: 2025-10-02 12:13:45.140 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "aa8a26f21a89d4b2e2a08906454e4360ce404b25" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:45 np0005466012 nova_compute[192063]: 2025-10-02 12:13:45.140 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "aa8a26f21a89d4b2e2a08906454e4360ce404b25" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:45 np0005466012 podman[229060]: 2025-10-02 12:13:45.173252566 +0000 UTC m=+0.085609364 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible)
Oct  2 08:13:45 np0005466012 nova_compute[192063]: 2025-10-02 12:13:45.352 2 DEBUG nova.policy [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dcdfc3c0f94e42cb931d27f2e3b5b12d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dcf78460093d411988a54040ea4c265a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:13:45 np0005466012 nova_compute[192063]: 2025-10-02 12:13:45.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:46 np0005466012 nova_compute[192063]: 2025-10-02 12:13:46.942 2 DEBUG nova.network.neutron [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Successfully created port: cedd012a-9c66-4c4e-abaf-616ff8e2a95c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:13:47 np0005466012 nova_compute[192063]: 2025-10-02 12:13:47.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:47 np0005466012 nova_compute[192063]: 2025-10-02 12:13:47.208 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:47 np0005466012 nova_compute[192063]: 2025-10-02 12:13:47.265 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25.part --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:47 np0005466012 nova_compute[192063]: 2025-10-02 12:13:47.266 2 DEBUG nova.virt.images [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] 3c689ccb-5eb5-4436-85d7-552ebca0cc4a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 08:13:47 np0005466012 nova_compute[192063]: 2025-10-02 12:13:47.321 2 DEBUG nova.privsep.utils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:13:47 np0005466012 nova_compute[192063]: 2025-10-02 12:13:47.322 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25.part /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:47 np0005466012 nova_compute[192063]: 2025-10-02 12:13:47.992 2 DEBUG nova.network.neutron [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Successfully updated port: cedd012a-9c66-4c4e-abaf-616ff8e2a95c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.022 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "refresh_cache-fd1711af-5e98-4ad5-a746-8aab0f033256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.022 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquired lock "refresh_cache-fd1711af-5e98-4ad5-a746-8aab0f033256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.023 2 DEBUG nova.network.neutron [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.167 2 DEBUG nova.compute.manager [req-93c73bd3-8151-46f3-ba8a-ae20c6ef1896 req-36671fe1-50ea-41fd-b76a-fc5648017a60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Received event network-changed-cedd012a-9c66-4c4e-abaf-616ff8e2a95c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.168 2 DEBUG nova.compute.manager [req-93c73bd3-8151-46f3-ba8a-ae20c6ef1896 req-36671fe1-50ea-41fd-b76a-fc5648017a60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Refreshing instance network info cache due to event network-changed-cedd012a-9c66-4c4e-abaf-616ff8e2a95c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.169 2 DEBUG oslo_concurrency.lockutils [req-93c73bd3-8151-46f3-ba8a-ae20c6ef1896 req-36671fe1-50ea-41fd-b76a-fc5648017a60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-fd1711af-5e98-4ad5-a746-8aab0f033256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.286 2 DEBUG nova.network.neutron [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.350 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25.part /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25.converted" returned: 0 in 1.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.359 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.422 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25.converted --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.423 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "aa8a26f21a89d4b2e2a08906454e4360ce404b25" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.435 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.491 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.492 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "aa8a26f21a89d4b2e2a08906454e4360ce404b25" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.493 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "aa8a26f21a89d4b2e2a08906454e4360ce404b25" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.503 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.556 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:48 np0005466012 nova_compute[192063]: 2025-10-02 12:13:48.559 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25,backing_fmt=raw /var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:49 np0005466012 nova_compute[192063]: 2025-10-02 12:13:49.236 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25,backing_fmt=raw /var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256/disk 1073741824" returned: 0 in 0.677s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:49 np0005466012 nova_compute[192063]: 2025-10-02 12:13:49.238 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "aa8a26f21a89d4b2e2a08906454e4360ce404b25" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:49 np0005466012 nova_compute[192063]: 2025-10-02 12:13:49.239 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:49 np0005466012 nova_compute[192063]: 2025-10-02 12:13:49.335 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:49 np0005466012 nova_compute[192063]: 2025-10-02 12:13:49.336 2 DEBUG nova.objects.instance [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lazy-loading 'migration_context' on Instance uuid fd1711af-5e98-4ad5-a746-8aab0f033256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:49 np0005466012 nova_compute[192063]: 2025-10-02 12:13:49.355 2 DEBUG nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:13:49 np0005466012 nova_compute[192063]: 2025-10-02 12:13:49.356 2 DEBUG nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Ensure instance console log exists: /var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:13:49 np0005466012 nova_compute[192063]: 2025-10-02 12:13:49.357 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:49 np0005466012 nova_compute[192063]: 2025-10-02 12:13:49.357 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:49 np0005466012 nova_compute[192063]: 2025-10-02 12:13:49.357 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.019 2 DEBUG nova.network.neutron [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Updating instance_info_cache with network_info: [{"id": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "address": "fa:16:3e:2a:7b:79", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcedd012a-9c", "ovs_interfaceid": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.089 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Releasing lock "refresh_cache-fd1711af-5e98-4ad5-a746-8aab0f033256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.089 2 DEBUG nova.compute.manager [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Instance network_info: |[{"id": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "address": "fa:16:3e:2a:7b:79", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcedd012a-9c", "ovs_interfaceid": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.090 2 DEBUG oslo_concurrency.lockutils [req-93c73bd3-8151-46f3-ba8a-ae20c6ef1896 req-36671fe1-50ea-41fd-b76a-fc5648017a60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-fd1711af-5e98-4ad5-a746-8aab0f033256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.090 2 DEBUG nova.network.neutron [req-93c73bd3-8151-46f3-ba8a-ae20c6ef1896 req-36671fe1-50ea-41fd-b76a-fc5648017a60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Refreshing network info cache for port cedd012a-9c66-4c4e-abaf-616ff8e2a95c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.092 2 DEBUG nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Start _get_guest_xml network_info=[{"id": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "address": "fa:16:3e:2a:7b:79", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcedd012a-9c", "ovs_interfaceid": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2025-10-02T12:13:32Z,direct_url=<?>,disk_format='qcow2',id=3c689ccb-5eb5-4436-85d7-552ebca0cc4a,min_disk=1,min_ram=0,name='tempest-test-snap-80080864',owner='dcf78460093d411988a54040ea4c265a',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2025-10-02T12:13:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': '3c689ccb-5eb5-4436-85d7-552ebca0cc4a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.096 2 WARNING nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.102 2 DEBUG nova.virt.libvirt.host [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.103 2 DEBUG nova.virt.libvirt.host [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.110 2 DEBUG nova.virt.libvirt.host [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.111 2 DEBUG nova.virt.libvirt.host [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.112 2 DEBUG nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.112 2 DEBUG nova.virt.hardware [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2025-10-02T12:13:32Z,direct_url=<?>,disk_format='qcow2',id=3c689ccb-5eb5-4436-85d7-552ebca0cc4a,min_disk=1,min_ram=0,name='tempest-test-snap-80080864',owner='dcf78460093d411988a54040ea4c265a',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2025-10-02T12:13:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.113 2 DEBUG nova.virt.hardware [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.113 2 DEBUG nova.virt.hardware [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.113 2 DEBUG nova.virt.hardware [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.113 2 DEBUG nova.virt.hardware [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.113 2 DEBUG nova.virt.hardware [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.114 2 DEBUG nova.virt.hardware [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.114 2 DEBUG nova.virt.hardware [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.114 2 DEBUG nova.virt.hardware [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.114 2 DEBUG nova.virt.hardware [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.114 2 DEBUG nova.virt.hardware [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.117 2 DEBUG nova.virt.libvirt.vif [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:13:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-907012627',display_name='tempest-ImagesTestJSON-server-907012627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-907012627',id=69,image_ref='3c689ccb-5eb5-4436-85d7-552ebca0cc4a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dcf78460093d411988a54040ea4c265a',ramdisk_id='',reservation_id='r-sdyek3fe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='3277cbd6-2706-4647-b0df-b789c49f80ea',image_min_disk='1',image_min_ram='0',image_owner_id='dcf78460093d411988a54040ea4c265a',image_owner_project_name='tempest-ImagesTestJSON-437970487',image_owner_user_name='tempest-ImagesTestJSON-437970487-project-member',image_user_id='dcdfc3c0f94e42cb931d27f2e3b5b12d',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-437970487',owner_user_name='tempest-ImagesTestJSON-437970487-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:13:45Z,user_data=None,user_id='dcdfc3c0f94e42cb931d27f2e3b5b12d',uuid=fd1711af-5e98-4ad5-a746-8aab0f033256,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "address": "fa:16:3e:2a:7b:79", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcedd012a-9c", "ovs_interfaceid": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.118 2 DEBUG nova.network.os_vif_util [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Converting VIF {"id": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "address": "fa:16:3e:2a:7b:79", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcedd012a-9c", "ovs_interfaceid": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.118 2 DEBUG nova.network.os_vif_util [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:7b:79,bridge_name='br-int',has_traffic_filtering=True,id=cedd012a-9c66-4c4e-abaf-616ff8e2a95c,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcedd012a-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.119 2 DEBUG nova.objects.instance [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lazy-loading 'pci_devices' on Instance uuid fd1711af-5e98-4ad5-a746-8aab0f033256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.164 2 DEBUG nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  <uuid>fd1711af-5e98-4ad5-a746-8aab0f033256</uuid>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  <name>instance-00000045</name>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <nova:name>tempest-ImagesTestJSON-server-907012627</nova:name>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:13:50</nova:creationTime>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:13:50 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:        <nova:user uuid="dcdfc3c0f94e42cb931d27f2e3b5b12d">tempest-ImagesTestJSON-437970487-project-member</nova:user>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:        <nova:project uuid="dcf78460093d411988a54040ea4c265a">tempest-ImagesTestJSON-437970487</nova:project>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="3c689ccb-5eb5-4436-85d7-552ebca0cc4a"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:        <nova:port uuid="cedd012a-9c66-4c4e-abaf-616ff8e2a95c">
Oct  2 08:13:50 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <entry name="serial">fd1711af-5e98-4ad5-a746-8aab0f033256</entry>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <entry name="uuid">fd1711af-5e98-4ad5-a746-8aab0f033256</entry>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256/disk"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256/disk.config"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:2a:7b:79"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <target dev="tapcedd012a-9c"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256/console.log" append="off"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <input type="keyboard" bus="usb"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:13:50 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:13:50 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:13:50 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:13:50 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.165 2 DEBUG nova.compute.manager [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Preparing to wait for external event network-vif-plugged-cedd012a-9c66-4c4e-abaf-616ff8e2a95c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.166 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.166 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.166 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.167 2 DEBUG nova.virt.libvirt.vif [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:13:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-907012627',display_name='tempest-ImagesTestJSON-server-907012627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-907012627',id=69,image_ref='3c689ccb-5eb5-4436-85d7-552ebca0cc4a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dcf78460093d411988a54040ea4c265a',ramdisk_id='',reservation_id='r-sdyek3fe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='3277cbd6-2706-4647-b0df-b789c49f80ea',image_min_disk='1',image_min_ram='0',image_owner_id='dcf78460093d411988a54040ea4c265a',image_owner_project_name='tempest-ImagesTestJSON-437970487',image_owner_user_name='tempest-ImagesTestJSON-437970487-project-member',image_user_id='dcdfc3c0f94e42cb931d27f2e3b5b12d',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-437970487',owner_user_name='tempest-ImagesTestJSON-437970487-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:13:45Z,user_data=None,user_id='dcdfc3c0f94e42cb931d27f2e3b5b12d',uuid=fd1711af-5e98-4ad5-a746-8aab0f033256,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "address": "fa:16:3e:2a:7b:79", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcedd012a-9c", "ovs_interfaceid": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.167 2 DEBUG nova.network.os_vif_util [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Converting VIF {"id": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "address": "fa:16:3e:2a:7b:79", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcedd012a-9c", "ovs_interfaceid": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.168 2 DEBUG nova.network.os_vif_util [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:7b:79,bridge_name='br-int',has_traffic_filtering=True,id=cedd012a-9c66-4c4e-abaf-616ff8e2a95c,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcedd012a-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.168 2 DEBUG os_vif [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:7b:79,bridge_name='br-int',has_traffic_filtering=True,id=cedd012a-9c66-4c4e-abaf-616ff8e2a95c,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcedd012a-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.169 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.169 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.172 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcedd012a-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.172 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcedd012a-9c, col_values=(('external_ids', {'iface-id': 'cedd012a-9c66-4c4e-abaf-616ff8e2a95c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2a:7b:79', 'vm-uuid': 'fd1711af-5e98-4ad5-a746-8aab0f033256'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:50 np0005466012 NetworkManager[51207]: <info>  [1759407230.1744] manager: (tapcedd012a-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.180 2 INFO os_vif [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:7b:79,bridge_name='br-int',has_traffic_filtering=True,id=cedd012a-9c66-4c4e-abaf-616ff8e2a95c,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcedd012a-9c')#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.265 2 DEBUG nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.265 2 DEBUG nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.266 2 DEBUG nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] No VIF found with MAC fa:16:3e:2a:7b:79, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.266 2 INFO nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Using config drive#033[00m
Oct  2 08:13:50 np0005466012 nova_compute[192063]: 2025-10-02 12:13:50.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.104 2 INFO nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Creating config drive at /var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256/disk.config#033[00m
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.109 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplh1r7_lt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.234 2 DEBUG oslo_concurrency.processutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplh1r7_lt" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:13:51 np0005466012 kernel: tapcedd012a-9c: entered promiscuous mode
Oct  2 08:13:51 np0005466012 NetworkManager[51207]: <info>  [1759407231.3069] manager: (tapcedd012a-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Oct  2 08:13:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:51Z|00249|binding|INFO|Claiming lport cedd012a-9c66-4c4e-abaf-616ff8e2a95c for this chassis.
Oct  2 08:13:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:51Z|00250|binding|INFO|cedd012a-9c66-4c4e-abaf-616ff8e2a95c: Claiming fa:16:3e:2a:7b:79 10.100.0.10
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.324 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:7b:79 10.100.0.10'], port_security=['fa:16:3e:2a:7b:79 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fd1711af-5e98-4ad5-a746-8aab0f033256', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dcf78460093d411988a54040ea4c265a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'aacce687-8b76-4e90-b19c-0dd006394188', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24ae9888-31f5-4083-b5ee-e7ed6a1eee13, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=cedd012a-9c66-4c4e-abaf-616ff8e2a95c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.326 103246 INFO neutron.agent.ovn.metadata.agent [-] Port cedd012a-9c66-4c4e-abaf-616ff8e2a95c in datapath 4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 bound to our chassis#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.327 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4f195445-fd43-4b92-89dd-a1b2fe9ea8c2#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.339 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[adc1d2cb-bf4e-4a1c-a580-f3acf90f3032]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.339 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4f195445-f1 in ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:13:51 np0005466012 systemd-udevd[229147]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.341 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4f195445-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.342 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ea2c1332-658e-40c0-b483-abb64a6bebf6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.343 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6b571c71-685a-4eb3-8353-9837b7dfc344]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 NetworkManager[51207]: <info>  [1759407231.3539] device (tapcedd012a-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:13:51 np0005466012 NetworkManager[51207]: <info>  [1759407231.3550] device (tapcedd012a-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.356 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[98bba5f4-57d8-447c-ac02-1629d26fab66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 systemd-machined[152114]: New machine qemu-29-instance-00000045.
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.380 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3c2165d2-e904-4cd7-8698-605e71b73d69]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:51 np0005466012 systemd[1]: Started Virtual Machine qemu-29-instance-00000045.
Oct  2 08:13:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:51Z|00251|binding|INFO|Setting lport cedd012a-9c66-4c4e-abaf-616ff8e2a95c ovn-installed in OVS
Oct  2 08:13:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:51Z|00252|binding|INFO|Setting lport cedd012a-9c66-4c4e-abaf-616ff8e2a95c up in Southbound
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:51 np0005466012 podman[229120]: 2025-10-02 12:13:51.388772501 +0000 UTC m=+0.086091206 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3)
Oct  2 08:13:51 np0005466012 podman[229122]: 2025-10-02 12:13:51.397981656 +0000 UTC m=+0.095589768 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, version=9.6, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.418 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[46a77a2a-90d1-4875-929d-5ba12be9b949]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 NetworkManager[51207]: <info>  [1759407231.4234] manager: (tap4f195445-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/113)
Oct  2 08:13:51 np0005466012 systemd-udevd[229157]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.424 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d043da35-5fa1-49be-8742-bf52b2d64501]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.458 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[404a5c1b-7b22-446a-9a27-6b259504bf15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.461 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[24719ff5-e032-475b-8aed-317ced62194e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 NetworkManager[51207]: <info>  [1759407231.4854] device (tap4f195445-f0): carrier: link connected
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.491 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[979454ff-7cb9-443b-8fb8-0fb5da5d339a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.509 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b699771c-4481-4fa7-893f-cbcc01296b5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f195445-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:93:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522509, 'reachable_time': 22699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229200, 'error': None, 'target': 'ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.531 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ba91ff16-eb25-47a7-a1ac-ec38162887b9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:9303'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522509, 'tstamp': 522509}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229201, 'error': None, 'target': 'ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.560 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4c84c49a-3a51-4c75-8a8a-6a0ec3bb72cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f195445-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:93:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522509, 'reachable_time': 22699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229202, 'error': None, 'target': 'ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.600 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[499fae53-e06c-4523-8131-e2985ebaac00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.670 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a64397cf-8866-413b-aeb3-b80963e201d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.671 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f195445-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.672 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.672 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f195445-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:51 np0005466012 NetworkManager[51207]: <info>  [1759407231.6754] manager: (tap4f195445-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Oct  2 08:13:51 np0005466012 kernel: tap4f195445-f0: entered promiscuous mode
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.679 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4f195445-f0, col_values=(('external_ids', {'iface-id': 'd65a1bd0-87e2-4bbf-9945-dacace78444f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:51Z|00253|binding|INFO|Releasing lport d65a1bd0-87e2-4bbf-9945-dacace78444f from this chassis (sb_readonly=0)
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.683 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4f195445-fd43-4b92-89dd-a1b2fe9ea8c2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4f195445-fd43-4b92-89dd-a1b2fe9ea8c2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.683 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8a0eae48-0533-421e-8be4-8e3a7fd50cc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.684 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/4f195445-fd43-4b92-89dd-a1b2fe9ea8c2.pid.haproxy
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 4f195445-fd43-4b92-89dd-a1b2fe9ea8c2
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:13:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:51.685 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'env', 'PROCESS_TAG=haproxy-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4f195445-fd43-4b92-89dd-a1b2fe9ea8c2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.688 2 DEBUG nova.compute.manager [req-7d869483-63b0-4063-ba12-30085fd176a5 req-ecc0c82c-bddf-49e0-8013-277dae092514 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Received event network-vif-plugged-cedd012a-9c66-4c4e-abaf-616ff8e2a95c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.688 2 DEBUG oslo_concurrency.lockutils [req-7d869483-63b0-4063-ba12-30085fd176a5 req-ecc0c82c-bddf-49e0-8013-277dae092514 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.689 2 DEBUG oslo_concurrency.lockutils [req-7d869483-63b0-4063-ba12-30085fd176a5 req-ecc0c82c-bddf-49e0-8013-277dae092514 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.689 2 DEBUG oslo_concurrency.lockutils [req-7d869483-63b0-4063-ba12-30085fd176a5 req-ecc0c82c-bddf-49e0-8013-277dae092514 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.689 2 DEBUG nova.compute.manager [req-7d869483-63b0-4063-ba12-30085fd176a5 req-ecc0c82c-bddf-49e0-8013-277dae092514 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Processing event network-vif-plugged-cedd012a-9c66-4c4e-abaf-616ff8e2a95c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:13:51 np0005466012 nova_compute[192063]: 2025-10-02 12:13:51.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.026 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407217.025898, 8a3a4249-c00f-498a-8a14-6b0862928cb5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.027 2 INFO nova.compute.manager [-] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.047 2 DEBUG nova.compute.manager [None req-9f9651cf-8a84-4adb-a32e-2230470e141c - - - - - -] [instance: 8a3a4249-c00f-498a-8a14-6b0862928cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:52 np0005466012 podman[229242]: 2025-10-02 12:13:52.038979182 +0000 UTC m=+0.024347663 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.199 2 DEBUG nova.compute.manager [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.200 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407232.2001486, fd1711af-5e98-4ad5-a746-8aab0f033256 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.201 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] VM Started (Lifecycle Event)#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.206 2 DEBUG nova.virt.libvirt.driver [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.210 2 INFO nova.virt.libvirt.driver [-] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Instance spawned successfully.#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.211 2 INFO nova.compute.manager [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Took 7.07 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.212 2 DEBUG nova.compute.manager [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.248 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.251 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.278 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.279 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407232.2008996, fd1711af-5e98-4ad5-a746-8aab0f033256 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.279 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.320 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.323 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407232.2040904, fd1711af-5e98-4ad5-a746-8aab0f033256 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.323 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.361 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.364 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.366 2 INFO nova.compute.manager [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Took 7.65 seconds to build instance.#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.416 2 DEBUG oslo_concurrency.lockutils [None req-bf69ddfb-a787-4d7c-bf1b-eb317131ca64 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.832s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.479 2 DEBUG nova.network.neutron [req-93c73bd3-8151-46f3-ba8a-ae20c6ef1896 req-36671fe1-50ea-41fd-b76a-fc5648017a60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Updated VIF entry in instance network info cache for port cedd012a-9c66-4c4e-abaf-616ff8e2a95c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.480 2 DEBUG nova.network.neutron [req-93c73bd3-8151-46f3-ba8a-ae20c6ef1896 req-36671fe1-50ea-41fd-b76a-fc5648017a60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Updating instance_info_cache with network_info: [{"id": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "address": "fa:16:3e:2a:7b:79", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcedd012a-9c", "ovs_interfaceid": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:52 np0005466012 nova_compute[192063]: 2025-10-02 12:13:52.506 2 DEBUG oslo_concurrency.lockutils [req-93c73bd3-8151-46f3-ba8a-ae20c6ef1896 req-36671fe1-50ea-41fd-b76a-fc5648017a60 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-fd1711af-5e98-4ad5-a746-8aab0f033256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:13:52 np0005466012 podman[229242]: 2025-10-02 12:13:52.719357335 +0000 UTC m=+0.704725796 container create 230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:13:53 np0005466012 systemd[1]: Started libpod-conmon-230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57.scope.
Oct  2 08:13:53 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:13:53 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07af3a129af587c1ec6a9e9905597537d39f70fa419b6c0986b48282f78f8da1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:13:53 np0005466012 podman[229242]: 2025-10-02 12:13:53.597957528 +0000 UTC m=+1.583326039 container init 230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:13:53 np0005466012 podman[229242]: 2025-10-02 12:13:53.605180857 +0000 UTC m=+1.590549338 container start 230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:13:53 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[229257]: [NOTICE]   (229261) : New worker (229263) forked
Oct  2 08:13:53 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[229257]: [NOTICE]   (229261) : Loading success.
Oct  2 08:13:53 np0005466012 nova_compute[192063]: 2025-10-02 12:13:53.852 2 DEBUG nova.compute.manager [req-5ce1c38c-be4f-4fbf-9539-9d7364561a1d req-16f48daa-3dd7-4dd6-83ab-31419419d00d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Received event network-vif-plugged-cedd012a-9c66-4c4e-abaf-616ff8e2a95c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:53 np0005466012 nova_compute[192063]: 2025-10-02 12:13:53.853 2 DEBUG oslo_concurrency.lockutils [req-5ce1c38c-be4f-4fbf-9539-9d7364561a1d req-16f48daa-3dd7-4dd6-83ab-31419419d00d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:53 np0005466012 nova_compute[192063]: 2025-10-02 12:13:53.853 2 DEBUG oslo_concurrency.lockutils [req-5ce1c38c-be4f-4fbf-9539-9d7364561a1d req-16f48daa-3dd7-4dd6-83ab-31419419d00d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:53 np0005466012 nova_compute[192063]: 2025-10-02 12:13:53.854 2 DEBUG oslo_concurrency.lockutils [req-5ce1c38c-be4f-4fbf-9539-9d7364561a1d req-16f48daa-3dd7-4dd6-83ab-31419419d00d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:53 np0005466012 nova_compute[192063]: 2025-10-02 12:13:53.854 2 DEBUG nova.compute.manager [req-5ce1c38c-be4f-4fbf-9539-9d7364561a1d req-16f48daa-3dd7-4dd6-83ab-31419419d00d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] No waiting events found dispatching network-vif-plugged-cedd012a-9c66-4c4e-abaf-616ff8e2a95c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:53 np0005466012 nova_compute[192063]: 2025-10-02 12:13:53.854 2 WARNING nova.compute.manager [req-5ce1c38c-be4f-4fbf-9539-9d7364561a1d req-16f48daa-3dd7-4dd6-83ab-31419419d00d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Received unexpected event network-vif-plugged-cedd012a-9c66-4c4e-abaf-616ff8e2a95c for instance with vm_state active and task_state None.#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.373 2 DEBUG oslo_concurrency.lockutils [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "fd1711af-5e98-4ad5-a746-8aab0f033256" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.374 2 DEBUG oslo_concurrency.lockutils [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.374 2 DEBUG oslo_concurrency.lockutils [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.374 2 DEBUG oslo_concurrency.lockutils [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.375 2 DEBUG oslo_concurrency.lockutils [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.384 2 INFO nova.compute.manager [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Terminating instance#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.394 2 DEBUG nova.compute.manager [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:13:54 np0005466012 kernel: tapcedd012a-9c (unregistering): left promiscuous mode
Oct  2 08:13:54 np0005466012 NetworkManager[51207]: <info>  [1759407234.4146] device (tapcedd012a-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:13:54 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:54Z|00254|binding|INFO|Releasing lport cedd012a-9c66-4c4e-abaf-616ff8e2a95c from this chassis (sb_readonly=0)
Oct  2 08:13:54 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:54Z|00255|binding|INFO|Setting lport cedd012a-9c66-4c4e-abaf-616ff8e2a95c down in Southbound
Oct  2 08:13:54 np0005466012 ovn_controller[94284]: 2025-10-02T12:13:54Z|00256|binding|INFO|Removing iface tapcedd012a-9c ovn-installed in OVS
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:54.439 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:7b:79 10.100.0.10'], port_security=['fa:16:3e:2a:7b:79 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fd1711af-5e98-4ad5-a746-8aab0f033256', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dcf78460093d411988a54040ea4c265a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aacce687-8b76-4e90-b19c-0dd006394188', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24ae9888-31f5-4083-b5ee-e7ed6a1eee13, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=cedd012a-9c66-4c4e-abaf-616ff8e2a95c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:13:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:54.441 103246 INFO neutron.agent.ovn.metadata.agent [-] Port cedd012a-9c66-4c4e-abaf-616ff8e2a95c in datapath 4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 unbound from our chassis#033[00m
Oct  2 08:13:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:54.442 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:54.444 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[346f6058-524e-423a-bd6f-2c7ce3355aff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:54.444 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 namespace which is not needed anymore#033[00m
Oct  2 08:13:54 np0005466012 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000045.scope: Deactivated successfully.
Oct  2 08:13:54 np0005466012 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000045.scope: Consumed 2.868s CPU time.
Oct  2 08:13:54 np0005466012 systemd-machined[152114]: Machine qemu-29-instance-00000045 terminated.
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.665 2 INFO nova.virt.libvirt.driver [-] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Instance destroyed successfully.#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.668 2 DEBUG nova.objects.instance [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lazy-loading 'resources' on Instance uuid fd1711af-5e98-4ad5-a746-8aab0f033256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.698 2 DEBUG nova.virt.libvirt.vif [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:13:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-907012627',display_name='tempest-ImagesTestJSON-server-907012627',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-907012627',id=69,image_ref='3c689ccb-5eb5-4436-85d7-552ebca0cc4a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:13:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dcf78460093d411988a54040ea4c265a',ramdisk_id='',reservation_id='r-sdyek3fe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='3277cbd6-2706-4647-b0df-b789c49f80ea',image_min_disk='1',image_min_ram='0',image_owner_id='dcf78460093d411988a54040ea4c265a',image_owner_project_name='tempest-ImagesTestJSON-437970487',image_owner_user_name='tempest-ImagesTestJSON-437970487-project-member',image_user_id='dcdfc3c0f94e42cb931d27f2e3b5b12d',owner_project_name='tempest-ImagesTestJSON-437970487',owner_user_name='tempest-ImagesTestJSON-437970487-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:13:52Z,user_data=None,user_id='dcdfc3c0f94e42cb931d27f2e3b5b12d',uuid=fd1711af-5e98-4ad5-a746-8aab0f033256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "address": "fa:16:3e:2a:7b:79", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcedd012a-9c", "ovs_interfaceid": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.699 2 DEBUG nova.network.os_vif_util [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Converting VIF {"id": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "address": "fa:16:3e:2a:7b:79", "network": {"id": "4f195445-fd43-4b92-89dd-a1b2fe9ea8c2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-793597453-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dcf78460093d411988a54040ea4c265a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcedd012a-9c", "ovs_interfaceid": "cedd012a-9c66-4c4e-abaf-616ff8e2a95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.700 2 DEBUG nova.network.os_vif_util [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:7b:79,bridge_name='br-int',has_traffic_filtering=True,id=cedd012a-9c66-4c4e-abaf-616ff8e2a95c,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcedd012a-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.700 2 DEBUG os_vif [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:7b:79,bridge_name='br-int',has_traffic_filtering=True,id=cedd012a-9c66-4c4e-abaf-616ff8e2a95c,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcedd012a-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.702 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcedd012a-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.706 2 INFO os_vif [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:7b:79,bridge_name='br-int',has_traffic_filtering=True,id=cedd012a-9c66-4c4e-abaf-616ff8e2a95c,network=Network(4f195445-fd43-4b92-89dd-a1b2fe9ea8c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcedd012a-9c')#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.706 2 INFO nova.virt.libvirt.driver [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Deleting instance files /var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256_del#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.707 2 INFO nova.virt.libvirt.driver [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Deletion of /var/lib/nova/instances/fd1711af-5e98-4ad5-a746-8aab0f033256_del complete#033[00m
Oct  2 08:13:54 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[229257]: [NOTICE]   (229261) : haproxy version is 2.8.14-c23fe91
Oct  2 08:13:54 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[229257]: [NOTICE]   (229261) : path to executable is /usr/sbin/haproxy
Oct  2 08:13:54 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[229257]: [WARNING]  (229261) : Exiting Master process...
Oct  2 08:13:54 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[229257]: [WARNING]  (229261) : Exiting Master process...
Oct  2 08:13:54 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[229257]: [ALERT]    (229261) : Current worker (229263) exited with code 143 (Terminated)
Oct  2 08:13:54 np0005466012 neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2[229257]: [WARNING]  (229261) : All workers exited. Exiting... (0)
Oct  2 08:13:54 np0005466012 systemd[1]: libpod-230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57.scope: Deactivated successfully.
Oct  2 08:13:54 np0005466012 podman[229295]: 2025-10-02 12:13:54.778634214 +0000 UTC m=+0.241219046 container died 230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.795 2 INFO nova.compute.manager [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.796 2 DEBUG oslo.service.loopingcall [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.796 2 DEBUG nova.compute.manager [-] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:13:54 np0005466012 nova_compute[192063]: 2025-10-02 12:13:54.796 2 DEBUG nova.network.neutron [-] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:13:55 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57-userdata-shm.mount: Deactivated successfully.
Oct  2 08:13:55 np0005466012 systemd[1]: var-lib-containers-storage-overlay-07af3a129af587c1ec6a9e9905597537d39f70fa419b6c0986b48282f78f8da1-merged.mount: Deactivated successfully.
Oct  2 08:13:55 np0005466012 podman[229339]: 2025-10-02 12:13:55.355789439 +0000 UTC m=+0.053756024 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid)
Oct  2 08:13:55 np0005466012 podman[229340]: 2025-10-02 12:13:55.356090378 +0000 UTC m=+0.050582777 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:13:55 np0005466012 nova_compute[192063]: 2025-10-02 12:13:55.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:55 np0005466012 podman[229295]: 2025-10-02 12:13:55.799821742 +0000 UTC m=+1.262406584 container cleanup 230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:13:55 np0005466012 systemd[1]: libpod-conmon-230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57.scope: Deactivated successfully.
Oct  2 08:13:55 np0005466012 nova_compute[192063]: 2025-10-02 12:13:55.860 2 DEBUG nova.network.neutron [-] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:13:55 np0005466012 nova_compute[192063]: 2025-10-02 12:13:55.880 2 INFO nova.compute.manager [-] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Took 1.08 seconds to deallocate network for instance.#033[00m
Oct  2 08:13:55 np0005466012 nova_compute[192063]: 2025-10-02 12:13:55.990 2 DEBUG oslo_concurrency.lockutils [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:55 np0005466012 nova_compute[192063]: 2025-10-02 12:13:55.991 2 DEBUG oslo_concurrency.lockutils [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.014 2 DEBUG nova.compute.manager [req-bc66a1f3-f9c5-4a06-95fb-f4f75e9a7ac9 req-6e7b1101-8e05-4e9a-8ef8-69f14c835a5e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Received event network-vif-deleted-cedd012a-9c66-4c4e-abaf-616ff8e2a95c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.019 2 DEBUG nova.compute.manager [req-76933470-24a9-4565-b248-ab9aa7bf97e8 req-ed8c609e-ef48-40cd-800d-9402af0e63a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Received event network-vif-unplugged-cedd012a-9c66-4c4e-abaf-616ff8e2a95c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.020 2 DEBUG oslo_concurrency.lockutils [req-76933470-24a9-4565-b248-ab9aa7bf97e8 req-ed8c609e-ef48-40cd-800d-9402af0e63a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.020 2 DEBUG oslo_concurrency.lockutils [req-76933470-24a9-4565-b248-ab9aa7bf97e8 req-ed8c609e-ef48-40cd-800d-9402af0e63a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.020 2 DEBUG oslo_concurrency.lockutils [req-76933470-24a9-4565-b248-ab9aa7bf97e8 req-ed8c609e-ef48-40cd-800d-9402af0e63a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.020 2 DEBUG nova.compute.manager [req-76933470-24a9-4565-b248-ab9aa7bf97e8 req-ed8c609e-ef48-40cd-800d-9402af0e63a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] No waiting events found dispatching network-vif-unplugged-cedd012a-9c66-4c4e-abaf-616ff8e2a95c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.021 2 WARNING nova.compute.manager [req-76933470-24a9-4565-b248-ab9aa7bf97e8 req-ed8c609e-ef48-40cd-800d-9402af0e63a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Received unexpected event network-vif-unplugged-cedd012a-9c66-4c4e-abaf-616ff8e2a95c for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.021 2 DEBUG nova.compute.manager [req-76933470-24a9-4565-b248-ab9aa7bf97e8 req-ed8c609e-ef48-40cd-800d-9402af0e63a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Received event network-vif-plugged-cedd012a-9c66-4c4e-abaf-616ff8e2a95c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.021 2 DEBUG oslo_concurrency.lockutils [req-76933470-24a9-4565-b248-ab9aa7bf97e8 req-ed8c609e-ef48-40cd-800d-9402af0e63a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.021 2 DEBUG oslo_concurrency.lockutils [req-76933470-24a9-4565-b248-ab9aa7bf97e8 req-ed8c609e-ef48-40cd-800d-9402af0e63a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.021 2 DEBUG oslo_concurrency.lockutils [req-76933470-24a9-4565-b248-ab9aa7bf97e8 req-ed8c609e-ef48-40cd-800d-9402af0e63a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.022 2 DEBUG nova.compute.manager [req-76933470-24a9-4565-b248-ab9aa7bf97e8 req-ed8c609e-ef48-40cd-800d-9402af0e63a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] No waiting events found dispatching network-vif-plugged-cedd012a-9c66-4c4e-abaf-616ff8e2a95c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.022 2 WARNING nova.compute.manager [req-76933470-24a9-4565-b248-ab9aa7bf97e8 req-ed8c609e-ef48-40cd-800d-9402af0e63a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Received unexpected event network-vif-plugged-cedd012a-9c66-4c4e-abaf-616ff8e2a95c for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.088 2 DEBUG nova.compute.provider_tree [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.103 2 DEBUG nova.scheduler.client.report [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.131 2 DEBUG oslo_concurrency.lockutils [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.162 2 INFO nova.scheduler.client.report [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Deleted allocations for instance fd1711af-5e98-4ad5-a746-8aab0f033256#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.302 2 DEBUG oslo_concurrency.lockutils [None req-e038e960-2815-4361-ab1c-5d41cd6f5b18 dcdfc3c0f94e42cb931d27f2e3b5b12d dcf78460093d411988a54040ea4c265a - - default default] Lock "fd1711af-5e98-4ad5-a746-8aab0f033256" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:13:56 np0005466012 podman[229387]: 2025-10-02 12:13:56.591612409 +0000 UTC m=+0.769233816 container remove 230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:13:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:56.598 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[77be98ee-2b4c-498a-9003-d5af3545de69]: (4, ('Thu Oct  2 12:13:54 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 (230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57)\n230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57\nThu Oct  2 12:13:55 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 (230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57)\n230e045729901de9644343be56110a0e94d2bc6b3fb4c2fe6e966191a5b46d57\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:56.600 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8f4e0f06-89fa-4f4e-88a1-bd70444888bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:56.601 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f195445-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:13:56 np0005466012 kernel: tap4f195445-f0: left promiscuous mode
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:56 np0005466012 nova_compute[192063]: 2025-10-02 12:13:56.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:13:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:56.618 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[643d894d-b367-4b37-a55c-60bfed898aed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:56.653 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba7333e-38fb-42c7-9a63-03ba690c47f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:56.654 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fb056734-217b-462c-8b6e-2945e4ad88d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:56.671 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[08a36fda-ef04-4a65-adea-822610384ae1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522502, 'reachable_time': 41801, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229402, 'error': None, 'target': 'ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:56.673 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4f195445-fd43-4b92-89dd-a1b2fe9ea8c2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:13:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:13:56.673 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1312aa-ba69-44fa-a823-139d5bf36da5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:13:56 np0005466012 systemd[1]: run-netns-ovnmeta\x2d4f195445\x2dfd43\x2d4b92\x2d89dd\x2da1b2fe9ea8c2.mount: Deactivated successfully.
Oct  2 08:13:59 np0005466012 nova_compute[192063]: 2025-10-02 12:13:59.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:00 np0005466012 nova_compute[192063]: 2025-10-02 12:14:00.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:02.125 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:02.126 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:02.126 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:04 np0005466012 nova_compute[192063]: 2025-10-02 12:14:04.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:05 np0005466012 nova_compute[192063]: 2025-10-02 12:14:05.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:09 np0005466012 podman[229403]: 2025-10-02 12:14:09.133610249 +0000 UTC m=+0.045606370 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:14:09 np0005466012 podman[229404]: 2025-10-02 12:14:09.173779727 +0000 UTC m=+0.083892186 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:14:09 np0005466012 nova_compute[192063]: 2025-10-02 12:14:09.662 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407234.6620636, fd1711af-5e98-4ad5-a746-8aab0f033256 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:09 np0005466012 nova_compute[192063]: 2025-10-02 12:14:09.663 2 INFO nova.compute.manager [-] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:14:09 np0005466012 nova_compute[192063]: 2025-10-02 12:14:09.689 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "6e45ea08-64c1-4434-9d80-94d4b7cec844" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:09 np0005466012 nova_compute[192063]: 2025-10-02 12:14:09.690 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:09 np0005466012 nova_compute[192063]: 2025-10-02 12:14:09.693 2 DEBUG nova.compute.manager [None req-858a3d66-5859-4e8b-a023-575a9bba1baa - - - - - -] [instance: fd1711af-5e98-4ad5-a746-8aab0f033256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:09 np0005466012 nova_compute[192063]: 2025-10-02 12:14:09.704 2 DEBUG nova.compute.manager [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:14:09 np0005466012 nova_compute[192063]: 2025-10-02 12:14:09.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:09 np0005466012 nova_compute[192063]: 2025-10-02 12:14:09.896 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:09 np0005466012 nova_compute[192063]: 2025-10-02 12:14:09.896 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:09 np0005466012 nova_compute[192063]: 2025-10-02 12:14:09.906 2 DEBUG nova.virt.hardware [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:14:09 np0005466012 nova_compute[192063]: 2025-10-02 12:14:09.907 2 INFO nova.compute.claims [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.069 2 DEBUG nova.compute.provider_tree [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.103 2 DEBUG nova.scheduler.client.report [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.158 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.159 2 DEBUG nova.compute.manager [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.239 2 DEBUG nova.compute.manager [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.239 2 DEBUG nova.network.neutron [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.271 2 INFO nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.320 2 DEBUG nova.compute.manager [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.520 2 DEBUG nova.compute.manager [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.522 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.522 2 INFO nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Creating image(s)#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.523 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.523 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.524 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.537 2 DEBUG oslo_concurrency.processutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.577 2 DEBUG nova.policy [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'def48c13fd6a43ba88836b753986a731', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffae703d68b24b9c89686c149113fc2b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.598 2 DEBUG oslo_concurrency.processutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.599 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.599 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.610 2 DEBUG oslo_concurrency.processutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.665 2 DEBUG oslo_concurrency.processutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.666 2 DEBUG oslo_concurrency.processutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.816 2 DEBUG oslo_concurrency.processutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk 1073741824" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.817 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.817 2 DEBUG oslo_concurrency.processutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.881 2 DEBUG oslo_concurrency.processutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.882 2 DEBUG nova.virt.disk.api [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Checking if we can resize image /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.882 2 DEBUG oslo_concurrency.processutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.936 2 DEBUG oslo_concurrency.processutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.937 2 DEBUG nova.virt.disk.api [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Cannot resize image /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.938 2 DEBUG nova.objects.instance [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'migration_context' on Instance uuid 6e45ea08-64c1-4434-9d80-94d4b7cec844 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.956 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.957 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Ensure instance console log exists: /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.957 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.958 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:10 np0005466012 nova_compute[192063]: 2025-10-02 12:14:10.958 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:13 np0005466012 podman[229467]: 2025-10-02 12:14:13.134131032 +0000 UTC m=+0.051180164 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:14:14 np0005466012 nova_compute[192063]: 2025-10-02 12:14:14.253 2 DEBUG nova.network.neutron [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Successfully created port: b1b379f4-7eb3-40e5-8edd-d903c05484af _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:14:14 np0005466012 nova_compute[192063]: 2025-10-02 12:14:14.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:15 np0005466012 nova_compute[192063]: 2025-10-02 12:14:15.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:15 np0005466012 nova_compute[192063]: 2025-10-02 12:14:15.834 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:16 np0005466012 podman[229487]: 2025-10-02 12:14:16.138224731 +0000 UTC m=+0.056409357 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:14:16 np0005466012 nova_compute[192063]: 2025-10-02 12:14:16.189 2 DEBUG nova.network.neutron [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Successfully updated port: b1b379f4-7eb3-40e5-8edd-d903c05484af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:14:16 np0005466012 nova_compute[192063]: 2025-10-02 12:14:16.210 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:14:16 np0005466012 nova_compute[192063]: 2025-10-02 12:14:16.210 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquired lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:14:16 np0005466012 nova_compute[192063]: 2025-10-02 12:14:16.210 2 DEBUG nova.network.neutron [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:14:16 np0005466012 nova_compute[192063]: 2025-10-02 12:14:16.413 2 DEBUG nova.compute.manager [req-5d926ede-82e0-406b-9377-cc475e075a4e req-8fc04eea-90e6-4996-9e90-d642675ab8ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Received event network-changed-b1b379f4-7eb3-40e5-8edd-d903c05484af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:16 np0005466012 nova_compute[192063]: 2025-10-02 12:14:16.413 2 DEBUG nova.compute.manager [req-5d926ede-82e0-406b-9377-cc475e075a4e req-8fc04eea-90e6-4996-9e90-d642675ab8ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Refreshing instance network info cache due to event network-changed-b1b379f4-7eb3-40e5-8edd-d903c05484af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:14:16 np0005466012 nova_compute[192063]: 2025-10-02 12:14:16.414 2 DEBUG oslo_concurrency.lockutils [req-5d926ede-82e0-406b-9377-cc475e075a4e req-8fc04eea-90e6-4996-9e90-d642675ab8ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:14:16 np0005466012 nova_compute[192063]: 2025-10-02 12:14:16.427 2 DEBUG nova.network.neutron [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:14:17 np0005466012 nova_compute[192063]: 2025-10-02 12:14:17.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:17 np0005466012 nova_compute[192063]: 2025-10-02 12:14:17.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.774 2 DEBUG nova.network.neutron [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Updating instance_info_cache with network_info: [{"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.807 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Releasing lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.807 2 DEBUG nova.compute.manager [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Instance network_info: |[{"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.808 2 DEBUG oslo_concurrency.lockutils [req-5d926ede-82e0-406b-9377-cc475e075a4e req-8fc04eea-90e6-4996-9e90-d642675ab8ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.808 2 DEBUG nova.network.neutron [req-5d926ede-82e0-406b-9377-cc475e075a4e req-8fc04eea-90e6-4996-9e90-d642675ab8ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Refreshing network info cache for port b1b379f4-7eb3-40e5-8edd-d903c05484af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.811 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Start _get_guest_xml network_info=[{"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.815 2 WARNING nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.819 2 DEBUG nova.virt.libvirt.host [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.820 2 DEBUG nova.virt.libvirt.host [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.823 2 DEBUG nova.virt.libvirt.host [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.823 2 DEBUG nova.virt.libvirt.host [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.824 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.825 2 DEBUG nova.virt.hardware [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.825 2 DEBUG nova.virt.hardware [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.825 2 DEBUG nova.virt.hardware [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.826 2 DEBUG nova.virt.hardware [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.826 2 DEBUG nova.virt.hardware [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.826 2 DEBUG nova.virt.hardware [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.826 2 DEBUG nova.virt.hardware [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.827 2 DEBUG nova.virt.hardware [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.827 2 DEBUG nova.virt.hardware [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.827 2 DEBUG nova.virt.hardware [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.828 2 DEBUG nova.virt.hardware [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.832 2 DEBUG nova.virt.libvirt.vif [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-922274791',display_name='tempest-ServerDiskConfigTestJSON-server-922274791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-922274791',id=72,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-7a20jfzw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfi
gTestJSON-1763056137-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:10Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=6e45ea08-64c1-4434-9d80-94d4b7cec844,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.832 2 DEBUG nova.network.os_vif_util [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.833 2 DEBUG nova.network.os_vif_util [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:22:98,bridge_name='br-int',has_traffic_filtering=True,id=b1b379f4-7eb3-40e5-8edd-d903c05484af,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1b379f4-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.834 2 DEBUG nova.objects.instance [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 6e45ea08-64c1-4434-9d80-94d4b7cec844 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.922 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  <uuid>6e45ea08-64c1-4434-9d80-94d4b7cec844</uuid>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  <name>instance-00000048</name>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-922274791</nova:name>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:14:18</nova:creationTime>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:14:18 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:        <nova:user uuid="def48c13fd6a43ba88836b753986a731">tempest-ServerDiskConfigTestJSON-1763056137-project-member</nova:user>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:        <nova:project uuid="ffae703d68b24b9c89686c149113fc2b">tempest-ServerDiskConfigTestJSON-1763056137</nova:project>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:        <nova:port uuid="b1b379f4-7eb3-40e5-8edd-d903c05484af">
Oct  2 08:14:18 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <entry name="serial">6e45ea08-64c1-4434-9d80-94d4b7cec844</entry>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <entry name="uuid">6e45ea08-64c1-4434-9d80-94d4b7cec844</entry>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk.config"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:5d:22:98"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <target dev="tapb1b379f4-7e"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/console.log" append="off"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:14:18 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:14:18 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:14:18 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:14:18 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.924 2 DEBUG nova.compute.manager [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Preparing to wait for external event network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.925 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.925 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.925 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.926 2 DEBUG nova.virt.libvirt.vif [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:14:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-922274791',display_name='tempest-ServerDiskConfigTestJSON-server-922274791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-922274791',id=72,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-7a20jfzw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-Serve
rDiskConfigTestJSON-1763056137-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:14:10Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=6e45ea08-64c1-4434-9d80-94d4b7cec844,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.926 2 DEBUG nova.network.os_vif_util [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.927 2 DEBUG nova.network.os_vif_util [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:22:98,bridge_name='br-int',has_traffic_filtering=True,id=b1b379f4-7eb3-40e5-8edd-d903c05484af,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1b379f4-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.927 2 DEBUG os_vif [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:22:98,bridge_name='br-int',has_traffic_filtering=True,id=b1b379f4-7eb3-40e5-8edd-d903c05484af,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1b379f4-7e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.930 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.931 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.933 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1b379f4-7e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.934 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb1b379f4-7e, col_values=(('external_ids', {'iface-id': 'b1b379f4-7eb3-40e5-8edd-d903c05484af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5d:22:98', 'vm-uuid': '6e45ea08-64c1-4434-9d80-94d4b7cec844'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:18 np0005466012 NetworkManager[51207]: <info>  [1759407258.9362] manager: (tapb1b379f4-7e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:18 np0005466012 nova_compute[192063]: 2025-10-02 12:14:18.941 2 INFO os_vif [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:22:98,bridge_name='br-int',has_traffic_filtering=True,id=b1b379f4-7eb3-40e5-8edd-d903c05484af,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1b379f4-7e')#033[00m
Oct  2 08:14:19 np0005466012 nova_compute[192063]: 2025-10-02 12:14:19.230 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:14:19 np0005466012 nova_compute[192063]: 2025-10-02 12:14:19.231 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:14:19 np0005466012 nova_compute[192063]: 2025-10-02 12:14:19.231 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No VIF found with MAC fa:16:3e:5d:22:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:14:19 np0005466012 nova_compute[192063]: 2025-10-02 12:14:19.232 2 INFO nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Using config drive#033[00m
Oct  2 08:14:19 np0005466012 nova_compute[192063]: 2025-10-02 12:14:19.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:19 np0005466012 nova_compute[192063]: 2025-10-02 12:14:19.825 2 INFO nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Creating config drive at /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk.config#033[00m
Oct  2 08:14:19 np0005466012 nova_compute[192063]: 2025-10-02 12:14:19.829 2 DEBUG oslo_concurrency.processutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0o1_8g2_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:19 np0005466012 nova_compute[192063]: 2025-10-02 12:14:19.847 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:19 np0005466012 nova_compute[192063]: 2025-10-02 12:14:19.848 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:19 np0005466012 nova_compute[192063]: 2025-10-02 12:14:19.848 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:19 np0005466012 nova_compute[192063]: 2025-10-02 12:14:19.849 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:14:19 np0005466012 nova_compute[192063]: 2025-10-02 12:14:19.955 2 DEBUG oslo_concurrency.processutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0o1_8g2_" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:20 np0005466012 kernel: tapb1b379f4-7e: entered promiscuous mode
Oct  2 08:14:20 np0005466012 NetworkManager[51207]: <info>  [1759407260.0114] manager: (tapb1b379f4-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/116)
Oct  2 08:14:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:14:20Z|00257|binding|INFO|Claiming lport b1b379f4-7eb3-40e5-8edd-d903c05484af for this chassis.
Oct  2 08:14:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:14:20Z|00258|binding|INFO|b1b379f4-7eb3-40e5-8edd-d903c05484af: Claiming fa:16:3e:5d:22:98 10.100.0.6
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.028 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:22:98 10.100.0.6'], port_security=['fa:16:3e:5d:22:98 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6e45ea08-64c1-4434-9d80-94d4b7cec844', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=b1b379f4-7eb3-40e5-8edd-d903c05484af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.029 103246 INFO neutron.agent.ovn.metadata.agent [-] Port b1b379f4-7eb3-40e5-8edd-d903c05484af in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b bound to our chassis#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.030 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.043 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a1179adc-98e3-4f02-a230-fc961daa421f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.044 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6de4737-c1 in ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:14:20 np0005466012 systemd-udevd[229529]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.045 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6de4737-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.046 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7b6e9698-45ae-41ce-8bfe-1250aba989f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.046 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2ca476ce-b045-4316-a521-62b04268b63d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 NetworkManager[51207]: <info>  [1759407260.0585] device (tapb1b379f4-7e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:14:20 np0005466012 NetworkManager[51207]: <info>  [1759407260.0592] device (tapb1b379f4-7e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.062 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[f07e0484-2b89-42ea-8c5c-294a9a088e5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 systemd-machined[152114]: New machine qemu-30-instance-00000048.
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:14:20Z|00259|binding|INFO|Setting lport b1b379f4-7eb3-40e5-8edd-d903c05484af ovn-installed in OVS
Oct  2 08:14:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:14:20Z|00260|binding|INFO|Setting lport b1b379f4-7eb3-40e5-8edd-d903c05484af up in Southbound
Oct  2 08:14:20 np0005466012 systemd[1]: Started Virtual Machine qemu-30-instance-00000048.
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.079 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f57859ab-314f-4b33-8567-3c313a0ebc8a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.106 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[795e8ca3-3868-47a1-ad8e-c6db8772984b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.111 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c1293a84-157d-4f15-955c-f0e24c146fe2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 NetworkManager[51207]: <info>  [1759407260.1128] manager: (tapd6de4737-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/117)
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.141 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[c24c1c04-f3b7-49da-89b2-ceb028f42c44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.145 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[c87f3bd1-f5bd-4f1e-9ec4-e585ef67ca76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 NetworkManager[51207]: <info>  [1759407260.1763] device (tapd6de4737-c0): carrier: link connected
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.179 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[eb064cba-baa1-4a44-a6dc-98c3356ffc9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.196 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5237c713-31ec-4e6b-8546-681aa6695317]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525378, 'reachable_time': 35188, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229562, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.215 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5576002e-d7f7-46d0-bd13-486ddd425c34]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:c91f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525378, 'tstamp': 525378}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229563, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.234 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c4a99055-09dc-432c-b109-c4da796a9f76]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525378, 'reachable_time': 35188, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229564, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.271 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dbba59e6-e2e4-4eab-9dfa-bf5e11201463]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.329 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2dcfcede-35ac-47cd-bda3-893dc2da013e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.331 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.331 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.331 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6de4737-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:20 np0005466012 NetworkManager[51207]: <info>  [1759407260.3342] manager: (tapd6de4737-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Oct  2 08:14:20 np0005466012 kernel: tapd6de4737-c0: entered promiscuous mode
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.337 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6de4737-c0, col_values=(('external_ids', {'iface-id': 'cc451eb7-bf34-4b54-96d8-b834f11e06fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:14:20Z|00261|binding|INFO|Releasing lport cc451eb7-bf34-4b54-96d8-b834f11e06fb from this chassis (sb_readonly=0)
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.340 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.342 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c16be078-2153-44fb-a00d-b6dd3bdf85f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.343 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:14:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:20.345 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'env', 'PROCESS_TAG=haproxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:20 np0005466012 podman[229604]: 2025-10-02 12:14:20.730043599 +0000 UTC m=+0.030290287 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.867 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.896 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.896 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.897 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.897 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:14:20 np0005466012 nova_compute[192063]: 2025-10-02 12:14:20.970 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.030 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.031 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.089 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.118 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407261.1182487, 6e45ea08-64c1-4434-9d80-94d4b7cec844 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.119 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] VM Started (Lifecycle Event)#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.184 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.187 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407261.118708, 6e45ea08-64c1-4434-9d80-94d4b7cec844 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.187 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:14:21 np0005466012 podman[229604]: 2025-10-02 12:14:21.20124518 +0000 UTC m=+0.501491848 container create c773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.205 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.209 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.230 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.242 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.243 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5665MB free_disk=73.3933219909668GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.243 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.244 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:21 np0005466012 systemd[1]: Started libpod-conmon-c773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592.scope.
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.339 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 6e45ea08-64c1-4434-9d80-94d4b7cec844 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.340 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.340 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:14:21 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:14:21 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b4b373822bab673f17070c382ad1e6c4a7d17ba9943d6432421301d09a89032/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.392 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.407 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.448 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.448 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.459 2 DEBUG nova.network.neutron [req-5d926ede-82e0-406b-9377-cc475e075a4e req-8fc04eea-90e6-4996-9e90-d642675ab8ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Updated VIF entry in instance network info cache for port b1b379f4-7eb3-40e5-8edd-d903c05484af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.460 2 DEBUG nova.network.neutron [req-5d926ede-82e0-406b-9377-cc475e075a4e req-8fc04eea-90e6-4996-9e90-d642675ab8ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Updating instance_info_cache with network_info: [{"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:21 np0005466012 podman[229604]: 2025-10-02 12:14:21.475041895 +0000 UTC m=+0.775288593 container init c773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:14:21 np0005466012 podman[229604]: 2025-10-02 12:14:21.486955393 +0000 UTC m=+0.787202061 container start c773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:14:21 np0005466012 nova_compute[192063]: 2025-10-02 12:14:21.496 2 DEBUG oslo_concurrency.lockutils [req-5d926ede-82e0-406b-9377-cc475e075a4e req-8fc04eea-90e6-4996-9e90-d642675ab8ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:14:21 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229626]: [NOTICE]   (229630) : New worker (229632) forked
Oct  2 08:14:21 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229626]: [NOTICE]   (229630) : Loading success.
Oct  2 08:14:22 np0005466012 podman[229642]: 2025-10-02 12:14:22.159663995 +0000 UTC m=+0.066914437 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, config_id=edpm, maintainer=Red Hat, Inc.)
Oct  2 08:14:22 np0005466012 podman[229641]: 2025-10-02 12:14:22.168584451 +0000 UTC m=+0.061734054 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:14:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:22.820 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:22 np0005466012 nova_compute[192063]: 2025-10-02 12:14:22.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:22.822 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.429 2 DEBUG nova.compute.manager [req-59c9a983-c5e7-46f3-b983-f24b556cd30c req-6d8088b6-a54e-4e6a-a44c-16b0b3f40a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Received event network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.430 2 DEBUG oslo_concurrency.lockutils [req-59c9a983-c5e7-46f3-b983-f24b556cd30c req-6d8088b6-a54e-4e6a-a44c-16b0b3f40a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.430 2 DEBUG oslo_concurrency.lockutils [req-59c9a983-c5e7-46f3-b983-f24b556cd30c req-6d8088b6-a54e-4e6a-a44c-16b0b3f40a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.430 2 DEBUG oslo_concurrency.lockutils [req-59c9a983-c5e7-46f3-b983-f24b556cd30c req-6d8088b6-a54e-4e6a-a44c-16b0b3f40a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.430 2 DEBUG nova.compute.manager [req-59c9a983-c5e7-46f3-b983-f24b556cd30c req-6d8088b6-a54e-4e6a-a44c-16b0b3f40a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Processing event network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.430 2 DEBUG nova.compute.manager [req-59c9a983-c5e7-46f3-b983-f24b556cd30c req-6d8088b6-a54e-4e6a-a44c-16b0b3f40a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Received event network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.430 2 DEBUG oslo_concurrency.lockutils [req-59c9a983-c5e7-46f3-b983-f24b556cd30c req-6d8088b6-a54e-4e6a-a44c-16b0b3f40a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.431 2 DEBUG oslo_concurrency.lockutils [req-59c9a983-c5e7-46f3-b983-f24b556cd30c req-6d8088b6-a54e-4e6a-a44c-16b0b3f40a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.431 2 DEBUG oslo_concurrency.lockutils [req-59c9a983-c5e7-46f3-b983-f24b556cd30c req-6d8088b6-a54e-4e6a-a44c-16b0b3f40a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.431 2 DEBUG nova.compute.manager [req-59c9a983-c5e7-46f3-b983-f24b556cd30c req-6d8088b6-a54e-4e6a-a44c-16b0b3f40a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] No waiting events found dispatching network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.431 2 WARNING nova.compute.manager [req-59c9a983-c5e7-46f3-b983-f24b556cd30c req-6d8088b6-a54e-4e6a-a44c-16b0b3f40a5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Received unexpected event network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.432 2 DEBUG nova.compute.manager [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.436 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407263.4357119, 6e45ea08-64c1-4434-9d80-94d4b7cec844 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.436 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.437 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.442 2 INFO nova.virt.libvirt.driver [-] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Instance spawned successfully.#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.443 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.457 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.460 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.472 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.472 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.473 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.473 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.473 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.473 2 DEBUG nova.virt.libvirt.driver [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.499 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.548 2 INFO nova.compute.manager [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Took 13.03 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.548 2 DEBUG nova.compute.manager [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.679 2 INFO nova.compute.manager [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Took 13.88 seconds to build instance.#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.706 2 DEBUG oslo_concurrency.lockutils [None req-ee067ae6-d6c8-4750-a702-24975d3cc3e5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.016s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.843 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:14:23 np0005466012 nova_compute[192063]: 2025-10-02 12:14:23.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:25 np0005466012 nova_compute[192063]: 2025-10-02 12:14:25.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:25.825 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:26 np0005466012 podman[229681]: 2025-10-02 12:14:26.159145759 +0000 UTC m=+0.071853733 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:14:26 np0005466012 podman[229680]: 2025-10-02 12:14:26.172749934 +0000 UTC m=+0.081234322 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:14:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:14:28Z|00262|binding|INFO|Releasing lport cc451eb7-bf34-4b54-96d8-b834f11e06fb from this chassis (sb_readonly=0)
Oct  2 08:14:28 np0005466012 nova_compute[192063]: 2025-10-02 12:14:28.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:28 np0005466012 nova_compute[192063]: 2025-10-02 12:14:28.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:29 np0005466012 nova_compute[192063]: 2025-10-02 12:14:29.126 2 DEBUG oslo_concurrency.lockutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:14:29 np0005466012 nova_compute[192063]: 2025-10-02 12:14:29.127 2 DEBUG oslo_concurrency.lockutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquired lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:14:29 np0005466012 nova_compute[192063]: 2025-10-02 12:14:29.127 2 DEBUG nova.network.neutron [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:14:29 np0005466012 nova_compute[192063]: 2025-10-02 12:14:29.842 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:29 np0005466012 nova_compute[192063]: 2025-10-02 12:14:29.843 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:14:29 np0005466012 nova_compute[192063]: 2025-10-02 12:14:29.843 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:14:29 np0005466012 nova_compute[192063]: 2025-10-02 12:14:29.862 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:14:30 np0005466012 nova_compute[192063]: 2025-10-02 12:14:30.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:31 np0005466012 nova_compute[192063]: 2025-10-02 12:14:31.286 2 DEBUG nova.network.neutron [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Updating instance_info_cache with network_info: [{"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:31 np0005466012 nova_compute[192063]: 2025-10-02 12:14:31.310 2 DEBUG oslo_concurrency.lockutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Releasing lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:14:31 np0005466012 nova_compute[192063]: 2025-10-02 12:14:31.312 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:14:31 np0005466012 nova_compute[192063]: 2025-10-02 12:14:31.312 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:14:31 np0005466012 nova_compute[192063]: 2025-10-02 12:14:31.312 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6e45ea08-64c1-4434-9d80-94d4b7cec844 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:14:31 np0005466012 nova_compute[192063]: 2025-10-02 12:14:31.443 2 DEBUG nova.virt.libvirt.driver [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  2 08:14:31 np0005466012 nova_compute[192063]: 2025-10-02 12:14:31.444 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Creating file /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/ebcdb3f483804e9db36b784244911bba.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Oct  2 08:14:31 np0005466012 nova_compute[192063]: 2025-10-02 12:14:31.444 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/ebcdb3f483804e9db36b784244911bba.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:31 np0005466012 nova_compute[192063]: 2025-10-02 12:14:31.846 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/ebcdb3f483804e9db36b784244911bba.tmp" returned: 1 in 0.402s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:31 np0005466012 nova_compute[192063]: 2025-10-02 12:14:31.847 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/ebcdb3f483804e9db36b784244911bba.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  2 08:14:31 np0005466012 nova_compute[192063]: 2025-10-02 12:14:31.847 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Creating directory /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  2 08:14:31 np0005466012 nova_compute[192063]: 2025-10-02 12:14:31.847 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:32 np0005466012 nova_compute[192063]: 2025-10-02 12:14:32.056 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844" returned: 0 in 0.209s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:32 np0005466012 nova_compute[192063]: 2025-10-02 12:14:32.060 2 DEBUG nova.virt.libvirt.driver [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:14:33 np0005466012 nova_compute[192063]: 2025-10-02 12:14:33.134 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Updating instance_info_cache with network_info: [{"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:14:33 np0005466012 nova_compute[192063]: 2025-10-02 12:14:33.151 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:14:33 np0005466012 nova_compute[192063]: 2025-10-02 12:14:33.152 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:14:33 np0005466012 nova_compute[192063]: 2025-10-02 12:14:33.152 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:33 np0005466012 nova_compute[192063]: 2025-10-02 12:14:33.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:35 np0005466012 nova_compute[192063]: 2025-10-02 12:14:35.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:36 np0005466012 nova_compute[192063]: 2025-10-02 12:14:36.380 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:14:36 np0005466012 nova_compute[192063]: 2025-10-02 12:14:36.401 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Triggering sync for uuid 6e45ea08-64c1-4434-9d80-94d4b7cec844 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 08:14:36 np0005466012 nova_compute[192063]: 2025-10-02 12:14:36.402 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "6e45ea08-64c1-4434-9d80-94d4b7cec844" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:36 np0005466012 nova_compute[192063]: 2025-10-02 12:14:36.402 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:36 np0005466012 nova_compute[192063]: 2025-10-02 12:14:36.402 2 INFO nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] During sync_power_state the instance has a pending task (resize_migrating). Skip.#033[00m
Oct  2 08:14:36 np0005466012 nova_compute[192063]: 2025-10-02 12:14:36.403 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:38 np0005466012 nova_compute[192063]: 2025-10-02 12:14:38.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:40 np0005466012 podman[229737]: 2025-10-02 12:14:40.143880986 +0000 UTC m=+0.061676333 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:14:40 np0005466012 podman[229738]: 2025-10-02 12:14:40.164783613 +0000 UTC m=+0.081585782 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:14:40 np0005466012 nova_compute[192063]: 2025-10-02 12:14:40.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:41 np0005466012 ovn_controller[94284]: 2025-10-02T12:14:41Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5d:22:98 10.100.0.6
Oct  2 08:14:41 np0005466012 ovn_controller[94284]: 2025-10-02T12:14:41Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5d:22:98 10.100.0.6
Oct  2 08:14:42 np0005466012 nova_compute[192063]: 2025-10-02 12:14:42.103 2 DEBUG nova.virt.libvirt.driver [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:14:43 np0005466012 nova_compute[192063]: 2025-10-02 12:14:43.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:44 np0005466012 podman[229797]: 2025-10-02 12:14:44.150375381 +0000 UTC m=+0.066342160 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 08:14:45 np0005466012 nova_compute[192063]: 2025-10-02 12:14:45.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:47 np0005466012 kernel: tapb1b379f4-7e (unregistering): left promiscuous mode
Oct  2 08:14:47 np0005466012 NetworkManager[51207]: <info>  [1759407287.0725] device (tapb1b379f4-7e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:14:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:14:47Z|00263|binding|INFO|Releasing lport b1b379f4-7eb3-40e5-8edd-d903c05484af from this chassis (sb_readonly=0)
Oct  2 08:14:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:14:47Z|00264|binding|INFO|Setting lport b1b379f4-7eb3-40e5-8edd-d903c05484af down in Southbound
Oct  2 08:14:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:14:47Z|00265|binding|INFO|Removing iface tapb1b379f4-7e ovn-installed in OVS
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:47.090 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:22:98 10.100.0.6'], port_security=['fa:16:3e:5d:22:98 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6e45ea08-64c1-4434-9d80-94d4b7cec844', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=b1b379f4-7eb3-40e5-8edd-d903c05484af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:14:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:47.091 103246 INFO neutron.agent.ovn.metadata.agent [-] Port b1b379f4-7eb3-40e5-8edd-d903c05484af in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b unbound from our chassis#033[00m
Oct  2 08:14:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:47.092 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:14:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:47.094 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1f5e0730-5ef2-4df3-ba30-33b373875a4c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:47.094 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace which is not needed anymore#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:47 np0005466012 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000048.scope: Deactivated successfully.
Oct  2 08:14:47 np0005466012 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000048.scope: Consumed 15.425s CPU time.
Oct  2 08:14:47 np0005466012 systemd-machined[152114]: Machine qemu-30-instance-00000048 terminated.
Oct  2 08:14:47 np0005466012 podman[229816]: 2025-10-02 12:14:47.158601013 +0000 UTC m=+0.066409323 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:14:47 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229626]: [NOTICE]   (229630) : haproxy version is 2.8.14-c23fe91
Oct  2 08:14:47 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229626]: [NOTICE]   (229630) : path to executable is /usr/sbin/haproxy
Oct  2 08:14:47 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229626]: [WARNING]  (229630) : Exiting Master process...
Oct  2 08:14:47 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229626]: [ALERT]    (229630) : Current worker (229632) exited with code 143 (Terminated)
Oct  2 08:14:47 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[229626]: [WARNING]  (229630) : All workers exited. Exiting... (0)
Oct  2 08:14:47 np0005466012 systemd[1]: libpod-c773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592.scope: Deactivated successfully.
Oct  2 08:14:47 np0005466012 podman[229860]: 2025-10-02 12:14:47.272818695 +0000 UTC m=+0.094810938 container died c773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.371 2 INFO nova.virt.libvirt.driver [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Instance shutdown successfully after 15 seconds.#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.380 2 INFO nova.virt.libvirt.driver [-] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Instance destroyed successfully.#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.381 2 DEBUG nova.virt.libvirt.vif [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:14:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-922274791',display_name='tempest-ServerDiskConfigTestJSON-server-922274791',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-922274791',id=72,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:14:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-7a20jfzw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:14:28Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=6e45ea08-64c1-4434-9d80-94d4b7cec844,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "vif_mac": "fa:16:3e:5d:22:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.381 2 DEBUG nova.network.os_vif_util [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "vif_mac": "fa:16:3e:5d:22:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.382 2 DEBUG nova.network.os_vif_util [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5d:22:98,bridge_name='br-int',has_traffic_filtering=True,id=b1b379f4-7eb3-40e5-8edd-d903c05484af,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1b379f4-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.383 2 DEBUG os_vif [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:22:98,bridge_name='br-int',has_traffic_filtering=True,id=b1b379f4-7eb3-40e5-8edd-d903c05484af,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1b379f4-7e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.386 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1b379f4-7e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.393 2 INFO os_vif [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:22:98,bridge_name='br-int',has_traffic_filtering=True,id=b1b379f4-7eb3-40e5-8edd-d903c05484af,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1b379f4-7e')#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.398 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:47 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592-userdata-shm.mount: Deactivated successfully.
Oct  2 08:14:47 np0005466012 systemd[1]: var-lib-containers-storage-overlay-7b4b373822bab673f17070c382ad1e6c4a7d17ba9943d6432421301d09a89032-merged.mount: Deactivated successfully.
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.469 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.470 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:47 np0005466012 podman[229860]: 2025-10-02 12:14:47.475154897 +0000 UTC m=+0.297147120 container cleanup c773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:14:47 np0005466012 systemd[1]: libpod-conmon-c773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592.scope: Deactivated successfully.
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.532 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.534 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Copying file /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844_resize/disk to 192.168.122.100:/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:14:47 np0005466012 nova_compute[192063]: 2025-10-02 12:14:47.535 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844_resize/disk 192.168.122.100:/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:48 np0005466012 podman[229907]: 2025-10-02 12:14:48.270773639 +0000 UTC m=+0.766955091 container remove c773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:14:48 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:48.280 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0a603be0-a1be-42a8-b4e4-9d714faa5b87]: (4, ('Thu Oct  2 12:14:47 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (c773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592)\nc773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592\nThu Oct  2 12:14:47 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (c773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592)\nc773b84215c836790f9b99ed31b7921c59c9755a2e9dd59056d7ed16815e5592\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:48 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:48.282 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[302b58b3-ff6f-42bf-8579-fcc8cf1cccaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:48 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:48.283 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:14:48 np0005466012 kernel: tapd6de4737-c0: left promiscuous mode
Oct  2 08:14:48 np0005466012 nova_compute[192063]: 2025-10-02 12:14:48.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:48 np0005466012 nova_compute[192063]: 2025-10-02 12:14:48.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:48 np0005466012 nova_compute[192063]: 2025-10-02 12:14:48.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:48 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:48.302 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[63272d5f-cb8e-4d5d-895f-50830dee3921]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:48 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:48.332 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5158c85f-9aa1-4f2a-a7e8-99486116dfde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:48 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:48.333 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[19abde5a-ae46-426e-84dc-2ecfa888c305]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:48 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:48.352 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f374d5b5-1b8c-4983-b198-d0c638f2615c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525371, 'reachable_time': 19395, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229929, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:48 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:48.354 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:14:48 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:14:48.354 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[fb5c86e1-02a0-4b6a-a649-28be68bd9400]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:14:48 np0005466012 systemd[1]: run-netns-ovnmeta\x2dd6de4737\x2dca60\x2d4c8d\x2dbfd5\x2d687f9366ec8b.mount: Deactivated successfully.
Oct  2 08:14:48 np0005466012 nova_compute[192063]: 2025-10-02 12:14:48.797 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "scp -r /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844_resize/disk 192.168.122.100:/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk" returned: 0 in 1.263s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:48 np0005466012 nova_compute[192063]: 2025-10-02 12:14:48.798 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Copying file /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:14:48 np0005466012 nova_compute[192063]: 2025-10-02 12:14:48.798 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844_resize/disk.config 192.168.122.100:/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:49 np0005466012 nova_compute[192063]: 2025-10-02 12:14:49.057 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "scp -C -r /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844_resize/disk.config 192.168.122.100:/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk.config" returned: 0 in 0.259s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:49 np0005466012 nova_compute[192063]: 2025-10-02 12:14:49.059 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Copying file /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:14:49 np0005466012 nova_compute[192063]: 2025-10-02 12:14:49.059 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844_resize/disk.info 192.168.122.100:/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:14:49 np0005466012 nova_compute[192063]: 2025-10-02 12:14:49.298 2 DEBUG oslo_concurrency.processutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "scp -C -r /var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844_resize/disk.info 192.168.122.100:/var/lib/nova/instances/6e45ea08-64c1-4434-9d80-94d4b7cec844/disk.info" returned: 0 in 0.239s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:14:49 np0005466012 nova_compute[192063]: 2025-10-02 12:14:49.934 2 DEBUG neutronclient.v2_0.client [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b1b379f4-7eb3-40e5-8edd-d903c05484af for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  2 08:14:50 np0005466012 nova_compute[192063]: 2025-10-02 12:14:50.052 2 DEBUG nova.compute.manager [req-ddfb7129-54aa-4ce0-a679-195ff6ea142d req-03faada7-231b-43f3-8050-d2e1cb031e57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Received event network-vif-unplugged-b1b379f4-7eb3-40e5-8edd-d903c05484af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:50 np0005466012 nova_compute[192063]: 2025-10-02 12:14:50.053 2 DEBUG oslo_concurrency.lockutils [req-ddfb7129-54aa-4ce0-a679-195ff6ea142d req-03faada7-231b-43f3-8050-d2e1cb031e57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:50 np0005466012 nova_compute[192063]: 2025-10-02 12:14:50.053 2 DEBUG oslo_concurrency.lockutils [req-ddfb7129-54aa-4ce0-a679-195ff6ea142d req-03faada7-231b-43f3-8050-d2e1cb031e57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:50 np0005466012 nova_compute[192063]: 2025-10-02 12:14:50.054 2 DEBUG oslo_concurrency.lockutils [req-ddfb7129-54aa-4ce0-a679-195ff6ea142d req-03faada7-231b-43f3-8050-d2e1cb031e57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:50 np0005466012 nova_compute[192063]: 2025-10-02 12:14:50.054 2 DEBUG nova.compute.manager [req-ddfb7129-54aa-4ce0-a679-195ff6ea142d req-03faada7-231b-43f3-8050-d2e1cb031e57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] No waiting events found dispatching network-vif-unplugged-b1b379f4-7eb3-40e5-8edd-d903c05484af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:50 np0005466012 nova_compute[192063]: 2025-10-02 12:14:50.054 2 WARNING nova.compute.manager [req-ddfb7129-54aa-4ce0-a679-195ff6ea142d req-03faada7-231b-43f3-8050-d2e1cb031e57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Received unexpected event network-vif-unplugged-b1b379f4-7eb3-40e5-8edd-d903c05484af for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:14:50 np0005466012 nova_compute[192063]: 2025-10-02 12:14:50.086 2 DEBUG oslo_concurrency.lockutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:50 np0005466012 nova_compute[192063]: 2025-10-02 12:14:50.087 2 DEBUG oslo_concurrency.lockutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:50 np0005466012 nova_compute[192063]: 2025-10-02 12:14:50.087 2 DEBUG oslo_concurrency.lockutils [None req-46b8b047-c493-43bc-8208-2102778793db def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:50 np0005466012 nova_compute[192063]: 2025-10-02 12:14:50.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:52 np0005466012 nova_compute[192063]: 2025-10-02 12:14:52.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:53 np0005466012 nova_compute[192063]: 2025-10-02 12:14:53.081 2 DEBUG nova.compute.manager [req-caf6901b-6312-46b4-a2cb-f4564773e559 req-05f8774d-e807-448e-860f-65e4829b7f57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Received event network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:53 np0005466012 nova_compute[192063]: 2025-10-02 12:14:53.081 2 DEBUG oslo_concurrency.lockutils [req-caf6901b-6312-46b4-a2cb-f4564773e559 req-05f8774d-e807-448e-860f-65e4829b7f57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:14:53 np0005466012 nova_compute[192063]: 2025-10-02 12:14:53.082 2 DEBUG oslo_concurrency.lockutils [req-caf6901b-6312-46b4-a2cb-f4564773e559 req-05f8774d-e807-448e-860f-65e4829b7f57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:14:53 np0005466012 nova_compute[192063]: 2025-10-02 12:14:53.082 2 DEBUG oslo_concurrency.lockutils [req-caf6901b-6312-46b4-a2cb-f4564773e559 req-05f8774d-e807-448e-860f-65e4829b7f57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:14:53 np0005466012 nova_compute[192063]: 2025-10-02 12:14:53.082 2 DEBUG nova.compute.manager [req-caf6901b-6312-46b4-a2cb-f4564773e559 req-05f8774d-e807-448e-860f-65e4829b7f57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] No waiting events found dispatching network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:14:53 np0005466012 nova_compute[192063]: 2025-10-02 12:14:53.083 2 WARNING nova.compute.manager [req-caf6901b-6312-46b4-a2cb-f4564773e559 req-05f8774d-e807-448e-860f-65e4829b7f57 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Received unexpected event network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:14:53 np0005466012 podman[229934]: 2025-10-02 12:14:53.147735734 +0000 UTC m=+0.064439539 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, org.label-schema.build-date=20251001)
Oct  2 08:14:53 np0005466012 podman[229935]: 2025-10-02 12:14:53.147452526 +0000 UTC m=+0.064158501 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64)
Oct  2 08:14:55 np0005466012 nova_compute[192063]: 2025-10-02 12:14:55.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:14:57 np0005466012 podman[229974]: 2025-10-02 12:14:57.165734929 +0000 UTC m=+0.073130660 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:14:57 np0005466012 podman[229973]: 2025-10-02 12:14:57.168769162 +0000 UTC m=+0.084018349 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:14:57 np0005466012 nova_compute[192063]: 2025-10-02 12:14:57.185 2 DEBUG nova.compute.manager [req-e20dea4a-cc82-49f6-b8ee-d5b4c8a9a229 req-e0a85e33-cfbc-4f69-a8f9-27eb7113d897 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Received event network-changed-b1b379f4-7eb3-40e5-8edd-d903c05484af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:14:57 np0005466012 nova_compute[192063]: 2025-10-02 12:14:57.186 2 DEBUG nova.compute.manager [req-e20dea4a-cc82-49f6-b8ee-d5b4c8a9a229 req-e0a85e33-cfbc-4f69-a8f9-27eb7113d897 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Refreshing instance network info cache due to event network-changed-b1b379f4-7eb3-40e5-8edd-d903c05484af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:14:57 np0005466012 nova_compute[192063]: 2025-10-02 12:14:57.186 2 DEBUG oslo_concurrency.lockutils [req-e20dea4a-cc82-49f6-b8ee-d5b4c8a9a229 req-e0a85e33-cfbc-4f69-a8f9-27eb7113d897 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:14:57 np0005466012 nova_compute[192063]: 2025-10-02 12:14:57.186 2 DEBUG oslo_concurrency.lockutils [req-e20dea4a-cc82-49f6-b8ee-d5b4c8a9a229 req-e0a85e33-cfbc-4f69-a8f9-27eb7113d897 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:14:57 np0005466012 nova_compute[192063]: 2025-10-02 12:14:57.186 2 DEBUG nova.network.neutron [req-e20dea4a-cc82-49f6-b8ee-d5b4c8a9a229 req-e0a85e33-cfbc-4f69-a8f9-27eb7113d897 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Refreshing network info cache for port b1b379f4-7eb3-40e5-8edd-d903c05484af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:14:57 np0005466012 nova_compute[192063]: 2025-10-02 12:14:57.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:00 np0005466012 nova_compute[192063]: 2025-10-02 12:15:00.086 2 DEBUG nova.compute.manager [req-5e56154b-4057-4357-a760-972e1445ed17 req-26eb508f-e8be-4fc8-80c0-f635fcde9213 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Received event network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:00 np0005466012 nova_compute[192063]: 2025-10-02 12:15:00.087 2 DEBUG oslo_concurrency.lockutils [req-5e56154b-4057-4357-a760-972e1445ed17 req-26eb508f-e8be-4fc8-80c0-f635fcde9213 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:00 np0005466012 nova_compute[192063]: 2025-10-02 12:15:00.087 2 DEBUG oslo_concurrency.lockutils [req-5e56154b-4057-4357-a760-972e1445ed17 req-26eb508f-e8be-4fc8-80c0-f635fcde9213 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:00 np0005466012 nova_compute[192063]: 2025-10-02 12:15:00.087 2 DEBUG oslo_concurrency.lockutils [req-5e56154b-4057-4357-a760-972e1445ed17 req-26eb508f-e8be-4fc8-80c0-f635fcde9213 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:00 np0005466012 nova_compute[192063]: 2025-10-02 12:15:00.087 2 DEBUG nova.compute.manager [req-5e56154b-4057-4357-a760-972e1445ed17 req-26eb508f-e8be-4fc8-80c0-f635fcde9213 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] No waiting events found dispatching network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:00 np0005466012 nova_compute[192063]: 2025-10-02 12:15:00.087 2 WARNING nova.compute.manager [req-5e56154b-4057-4357-a760-972e1445ed17 req-26eb508f-e8be-4fc8-80c0-f635fcde9213 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Received unexpected event network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af for instance with vm_state resized and task_state None.#033[00m
Oct  2 08:15:00 np0005466012 nova_compute[192063]: 2025-10-02 12:15:00.661 2 DEBUG nova.network.neutron [req-e20dea4a-cc82-49f6-b8ee-d5b4c8a9a229 req-e0a85e33-cfbc-4f69-a8f9-27eb7113d897 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Updated VIF entry in instance network info cache for port b1b379f4-7eb3-40e5-8edd-d903c05484af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:15:00 np0005466012 nova_compute[192063]: 2025-10-02 12:15:00.662 2 DEBUG nova.network.neutron [req-e20dea4a-cc82-49f6-b8ee-d5b4c8a9a229 req-e0a85e33-cfbc-4f69-a8f9-27eb7113d897 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Updating instance_info_cache with network_info: [{"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:00 np0005466012 nova_compute[192063]: 2025-10-02 12:15:00.703 2 DEBUG oslo_concurrency.lockutils [req-e20dea4a-cc82-49f6-b8ee-d5b4c8a9a229 req-e0a85e33-cfbc-4f69-a8f9-27eb7113d897 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:00 np0005466012 nova_compute[192063]: 2025-10-02 12:15:00.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:01 np0005466012 nova_compute[192063]: 2025-10-02 12:15:01.037 2 DEBUG oslo_concurrency.lockutils [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "6e45ea08-64c1-4434-9d80-94d4b7cec844" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:01 np0005466012 nova_compute[192063]: 2025-10-02 12:15:01.038 2 DEBUG oslo_concurrency.lockutils [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:01 np0005466012 nova_compute[192063]: 2025-10-02 12:15:01.038 2 DEBUG nova.compute.manager [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Going to confirm migration 11 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Oct  2 08:15:01 np0005466012 nova_compute[192063]: 2025-10-02 12:15:01.095 2 DEBUG nova.objects.instance [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'info_cache' on Instance uuid 6e45ea08-64c1-4434-9d80-94d4b7cec844 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:02.126 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:02.127 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:02.127 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:02 np0005466012 nova_compute[192063]: 2025-10-02 12:15:02.320 2 DEBUG nova.compute.manager [req-fdffc4a3-9b5f-4de7-874b-92df385e6a01 req-887e40f8-2d25-4199-8edb-fc337f753469 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Received event network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:02 np0005466012 nova_compute[192063]: 2025-10-02 12:15:02.321 2 DEBUG oslo_concurrency.lockutils [req-fdffc4a3-9b5f-4de7-874b-92df385e6a01 req-887e40f8-2d25-4199-8edb-fc337f753469 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:02 np0005466012 nova_compute[192063]: 2025-10-02 12:15:02.321 2 DEBUG oslo_concurrency.lockutils [req-fdffc4a3-9b5f-4de7-874b-92df385e6a01 req-887e40f8-2d25-4199-8edb-fc337f753469 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:02 np0005466012 nova_compute[192063]: 2025-10-02 12:15:02.322 2 DEBUG oslo_concurrency.lockutils [req-fdffc4a3-9b5f-4de7-874b-92df385e6a01 req-887e40f8-2d25-4199-8edb-fc337f753469 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:02 np0005466012 nova_compute[192063]: 2025-10-02 12:15:02.322 2 DEBUG nova.compute.manager [req-fdffc4a3-9b5f-4de7-874b-92df385e6a01 req-887e40f8-2d25-4199-8edb-fc337f753469 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] No waiting events found dispatching network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:02 np0005466012 nova_compute[192063]: 2025-10-02 12:15:02.323 2 WARNING nova.compute.manager [req-fdffc4a3-9b5f-4de7-874b-92df385e6a01 req-887e40f8-2d25-4199-8edb-fc337f753469 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Received unexpected event network-vif-plugged-b1b379f4-7eb3-40e5-8edd-d903c05484af for instance with vm_state resized and task_state None.#033[00m
Oct  2 08:15:02 np0005466012 nova_compute[192063]: 2025-10-02 12:15:02.371 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407287.369654, 6e45ea08-64c1-4434-9d80-94d4b7cec844 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:02 np0005466012 nova_compute[192063]: 2025-10-02 12:15:02.371 2 INFO nova.compute.manager [-] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:15:02 np0005466012 nova_compute[192063]: 2025-10-02 12:15:02.388 2 DEBUG nova.compute.manager [None req-cc95f65a-0884-4d76-91bd-496f7d403ffe - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:02 np0005466012 nova_compute[192063]: 2025-10-02 12:15:02.392 2 DEBUG nova.compute.manager [None req-cc95f65a-0884-4d76-91bd-496f7d403ffe - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:15:02 np0005466012 nova_compute[192063]: 2025-10-02 12:15:02.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:02 np0005466012 nova_compute[192063]: 2025-10-02 12:15:02.411 2 INFO nova.compute.manager [None req-cc95f65a-0884-4d76-91bd-496f7d403ffe - - - - - -] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Oct  2 08:15:03 np0005466012 nova_compute[192063]: 2025-10-02 12:15:03.001 2 DEBUG neutronclient.v2_0.client [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b1b379f4-7eb3-40e5-8edd-d903c05484af for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  2 08:15:03 np0005466012 nova_compute[192063]: 2025-10-02 12:15:03.002 2 DEBUG oslo_concurrency.lockutils [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:03 np0005466012 nova_compute[192063]: 2025-10-02 12:15:03.002 2 DEBUG oslo_concurrency.lockutils [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquired lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:03 np0005466012 nova_compute[192063]: 2025-10-02 12:15:03.003 2 DEBUG nova.network.neutron [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.075 2 DEBUG nova.network.neutron [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: 6e45ea08-64c1-4434-9d80-94d4b7cec844] Updating instance_info_cache with network_info: [{"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.099 2 DEBUG oslo_concurrency.lockutils [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Releasing lock "refresh_cache-6e45ea08-64c1-4434-9d80-94d4b7cec844" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.099 2 DEBUG nova.objects.instance [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'migration_context' on Instance uuid 6e45ea08-64c1-4434-9d80-94d4b7cec844 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.123 2 DEBUG nova.virt.libvirt.vif [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:14:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-922274791',display_name='tempest-ServerDiskConfigTestJSON-server-922274791',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-922274791',id=72,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:14:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-7a20jfzw',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio
',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:14:59Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=6e45ea08-64c1-4434-9d80-94d4b7cec844,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.124 2 DEBUG nova.network.os_vif_util [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "address": "fa:16:3e:5d:22:98", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1b379f4-7e", "ovs_interfaceid": "b1b379f4-7eb3-40e5-8edd-d903c05484af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.124 2 DEBUG nova.network.os_vif_util [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5d:22:98,bridge_name='br-int',has_traffic_filtering=True,id=b1b379f4-7eb3-40e5-8edd-d903c05484af,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1b379f4-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.125 2 DEBUG os_vif [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:22:98,bridge_name='br-int',has_traffic_filtering=True,id=b1b379f4-7eb3-40e5-8edd-d903c05484af,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1b379f4-7e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.127 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1b379f4-7e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.127 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.129 2 INFO os_vif [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:22:98,bridge_name='br-int',has_traffic_filtering=True,id=b1b379f4-7eb3-40e5-8edd-d903c05484af,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1b379f4-7e')#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.129 2 DEBUG oslo_concurrency.lockutils [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.129 2 DEBUG oslo_concurrency.lockutils [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.215 2 DEBUG nova.compute.provider_tree [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.230 2 DEBUG nova.scheduler.client.report [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.278 2 DEBUG oslo_concurrency.lockutils [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.443 2 INFO nova.scheduler.client.report [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Deleted allocation for migration 72a59e8d-287b-4dea-bb81-1c51eb265546
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.519 2 DEBUG oslo_concurrency.lockutils [None req-cbeb64c7-07fa-4bf8-b6c8-b20640dbb36c def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "6e45ea08-64c1-4434-9d80-94d4b7cec844" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 4.481s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:05 np0005466012 nova_compute[192063]: 2025-10-02 12:15:05.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:06 np0005466012 nova_compute[192063]: 2025-10-02 12:15:06.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:06.815 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:15:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:06.816 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:15:07 np0005466012 nova_compute[192063]: 2025-10-02 12:15:07.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:09.819 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:15:10 np0005466012 nova_compute[192063]: 2025-10-02 12:15:10.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:11 np0005466012 podman[230017]: 2025-10-02 12:15:11.152629683 +0000 UTC m=+0.071109133 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:15:11 np0005466012 podman[230018]: 2025-10-02 12:15:11.219863538 +0000 UTC m=+0.125530274 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:15:12 np0005466012 nova_compute[192063]: 2025-10-02 12:15:12.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:13 np0005466012 nova_compute[192063]: 2025-10-02 12:15:13.595 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:13 np0005466012 nova_compute[192063]: 2025-10-02 12:15:13.596 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:13 np0005466012 nova_compute[192063]: 2025-10-02 12:15:13.617 2 DEBUG nova.compute.manager [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:15:13 np0005466012 nova_compute[192063]: 2025-10-02 12:15:13.724 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:13 np0005466012 nova_compute[192063]: 2025-10-02 12:15:13.724 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:13 np0005466012 nova_compute[192063]: 2025-10-02 12:15:13.736 2 DEBUG nova.virt.hardware [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:15:13 np0005466012 nova_compute[192063]: 2025-10-02 12:15:13.736 2 INFO nova.compute.claims [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Claim successful on node compute-1.ctlplane.example.com
Oct  2 08:15:13 np0005466012 nova_compute[192063]: 2025-10-02 12:15:13.957 2 DEBUG nova.compute.provider_tree [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:15:13 np0005466012 nova_compute[192063]: 2025-10-02 12:15:13.986 2 DEBUG nova.scheduler.client.report [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.013 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.289s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.015 2 DEBUG nova.compute.manager [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.078 2 DEBUG nova.compute.manager [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.079 2 DEBUG nova.network.neutron [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.141 2 INFO nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.178 2 DEBUG nova.compute.manager [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.336 2 DEBUG nova.policy [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'def48c13fd6a43ba88836b753986a731', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffae703d68b24b9c89686c149113fc2b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.349 2 DEBUG nova.compute.manager [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.350 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.351 2 INFO nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Creating image(s)
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.351 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.351 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.352 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.364 2 DEBUG oslo_concurrency.processutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.439 2 DEBUG oslo_concurrency.processutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.440 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.442 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.468 2 DEBUG oslo_concurrency.processutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.524 2 DEBUG oslo_concurrency.processutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.526 2 DEBUG oslo_concurrency.processutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.749 2 DEBUG oslo_concurrency.processutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk 1073741824" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.751 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.753 2 DEBUG oslo_concurrency.processutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.829 2 DEBUG oslo_concurrency.processutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.830 2 DEBUG nova.virt.disk.api [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Checking if we can resize image /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.831 2 DEBUG oslo_concurrency.processutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.889 2 DEBUG oslo_concurrency.processutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.890 2 DEBUG nova.virt.disk.api [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Cannot resize image /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.890 2 DEBUG nova.objects.instance [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'migration_context' on Instance uuid fa72d8b8-93c0-417b-9793-ccd611ffbb84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.909 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.910 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Ensure instance console log exists: /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.910 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.911 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:14 np0005466012 nova_compute[192063]: 2025-10-02 12:15:14.911 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
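The inventory payload in the line above is what determines how much placement will hand out for this node; the usual placement formula is capacity = (total - reserved) * allocation_ratio. A minimal stdlib sketch applying that formula to the values copied from the log (function name is illustrative, not nova's):

```python
# Placement-style capacity: (total - reserved) * allocation_ratio.
# Inventory values copied verbatim from the log line above; the
# irrelevant keys (min_unit, max_unit, step_size) are omitted here.
inventory = {
    "VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 79, "reserved": 1, "allocation_ratio": 0.9},
}

def capacity(inv: dict) -> float:
    """Schedulable capacity for one resource class."""
    return (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]

for rc, inv in inventory.items():
    print(rc, capacity(inv))  # e.g. VCPU -> 32.0 with the 4.0 overcommit ratio
```

With these numbers the 8 physical vCPUs become 32 schedulable VCPU thanks to the 4.0 overcommit ratio, while memory and disk stay close to their physical totals.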
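The create_qcow2_image step above shells out to qemu-img to make the instance disk a thin copy-on-write overlay on the shared _base image, rather than a full copy. A sketch reconstructing that argv from the log (paths and size copied verbatim; it only builds the command, without executing it, since qemu-img may not be installed where this runs):

```python
# Reconstruction of the overlay-creation command logged above. The overlay's
# reads fall through to the raw backing file in _base; only writes land in
# the per-instance qcow2, which is why image creation takes ~0.2s.
base = "/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955"
disk = "/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk"
size = 1 * 1024**3  # 1073741824 bytes, matching the logged size

argv = [
    "env", "LC_ALL=C", "LANG=C",
    "qemu-img", "create", "-f", "qcow2",
    "-o", f"backing_file={base},backing_fmt=raw",
    disk, str(size),
]
print(" ".join(argv))
```

Joining the argv with spaces reproduces the exact CMD string that oslo_concurrency.processutils logged.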
Oct  2 08:15:15 np0005466012 podman[230083]: 2025-10-02 12:15:15.130313165 +0000 UTC m=+0.047007688 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:15:15 np0005466012 nova_compute[192063]: 2025-10-02 12:15:15.277 2 DEBUG nova.network.neutron [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Successfully created port: 1692479a-54ef-45ae-a6a3-39c68408e4f6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:15:15 np0005466012 nova_compute[192063]: 2025-10-02 12:15:15.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:15:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:15:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
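The run of DEBUG records above is one ceilometer polling cycle in which every pollster was skipped because the agent found no resources. When triaging telemetry gaps it helps to tally which pollsters are being skipped; a minimal sketch under the assumption that the lines follow the exact message format shown above (`count_skipped` and `SKIP_RE` are illustrative helpers, not ceilometer code):

```python
import re
from collections import Counter

# Matches the "Skip pollster <name>, no  resources found this cycle" DEBUG
# message emitted by ceilometer.polling.manager in the log lines above.
SKIP_RE = re.compile(r"Skip pollster (?P<pollster>[\w.]+), no\s+resources found this cycle")

def count_skipped(lines):
    """Tally skipped pollsters from raw syslog lines for one polling cycle."""
    skipped = Counter()
    for line in lines:
        m = SKIP_RE.search(line)
        if m:
            skipped[m.group("pollster")] += 1
    return skipped

sample = [
    "... DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle ...",
    "... DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle ...",
]
print(count_skipped(sample))
```

A persistent non-empty tally across cycles usually means the agent sees no instances on the hypervisor yet, which is consistent with the instance still being built further down in this log.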
Oct  2 08:15:17 np0005466012 nova_compute[192063]: 2025-10-02 12:15:17.271 2 DEBUG nova.network.neutron [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Successfully updated port: 1692479a-54ef-45ae-a6a3-39c68408e4f6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:15:17 np0005466012 nova_compute[192063]: 2025-10-02 12:15:17.293 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:17 np0005466012 nova_compute[192063]: 2025-10-02 12:15:17.294 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquired lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:17 np0005466012 nova_compute[192063]: 2025-10-02 12:15:17.294 2 DEBUG nova.network.neutron [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:15:17 np0005466012 nova_compute[192063]: 2025-10-02 12:15:17.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:17 np0005466012 nova_compute[192063]: 2025-10-02 12:15:17.459 2 DEBUG nova.compute.manager [req-0f364d30-7986-4840-b7bb-f5ee11217aff req-9fab67ac-415c-4425-8c83-6f527fdcb3e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Received event network-changed-1692479a-54ef-45ae-a6a3-39c68408e4f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:17 np0005466012 nova_compute[192063]: 2025-10-02 12:15:17.460 2 DEBUG nova.compute.manager [req-0f364d30-7986-4840-b7bb-f5ee11217aff req-9fab67ac-415c-4425-8c83-6f527fdcb3e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Refreshing instance network info cache due to event network-changed-1692479a-54ef-45ae-a6a3-39c68408e4f6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:15:17 np0005466012 nova_compute[192063]: 2025-10-02 12:15:17.460 2 DEBUG oslo_concurrency.lockutils [req-0f364d30-7986-4840-b7bb-f5ee11217aff req-9fab67ac-415c-4425-8c83-6f527fdcb3e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:17 np0005466012 nova_compute[192063]: 2025-10-02 12:15:17.562 2 DEBUG nova.network.neutron [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:15:18 np0005466012 podman[230102]: 2025-10-02 12:15:18.143926177 +0000 UTC m=+0.061128308 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.442 2 DEBUG nova.network.neutron [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Updating instance_info_cache with network_info: [{"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.484 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Releasing lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.485 2 DEBUG nova.compute.manager [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Instance network_info: |[{"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.485 2 DEBUG oslo_concurrency.lockutils [req-0f364d30-7986-4840-b7bb-f5ee11217aff req-9fab67ac-415c-4425-8c83-6f527fdcb3e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.485 2 DEBUG nova.network.neutron [req-0f364d30-7986-4840-b7bb-f5ee11217aff req-9fab67ac-415c-4425-8c83-6f527fdcb3e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Refreshing network info cache for port 1692479a-54ef-45ae-a6a3-39c68408e4f6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.488 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Start _get_guest_xml network_info=[{"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.492 2 WARNING nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.498 2 DEBUG nova.virt.libvirt.host [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.499 2 DEBUG nova.virt.libvirt.host [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.509 2 DEBUG nova.virt.libvirt.host [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.510 2 DEBUG nova.virt.libvirt.host [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
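The two probes above show the host running cgroups v2: the v1 CPU controller is absent, then the v2 one is found. On a cgroup v2 host the enabled controllers are listed space-separated in `/sys/fs/cgroup/cgroup.controllers`; a sketch of that check, written to take the file contents as a string so it can run without a cgroup v2 mount (the helper name is illustrative, not Nova's):

```python
def has_cgroupsv2_cpu_controller(controllers_text):
    """Return True if 'cpu' appears in the space-separated controller list,
    the format of /sys/fs/cgroup/cgroup.controllers on a cgroup v2 host."""
    return "cpu" in controllers_text.split()

# On a real host one would read the file directly:
# with open("/sys/fs/cgroup/cgroup.controllers") as f:
#     print(has_cgroupsv2_cpu_controller(f.read()))
print(has_cgroupsv2_cpu_controller("cpuset cpu io memory pids"))
```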
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.511 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.511 2 DEBUG nova.virt.hardware [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.511 2 DEBUG nova.virt.hardware [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.512 2 DEBUG nova.virt.hardware [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.512 2 DEBUG nova.virt.hardware [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.512 2 DEBUG nova.virt.hardware [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.512 2 DEBUG nova.virt.hardware [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.513 2 DEBUG nova.virt.hardware [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.513 2 DEBUG nova.virt.hardware [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.513 2 DEBUG nova.virt.hardware [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.513 2 DEBUG nova.virt.hardware [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.514 2 DEBUG nova.virt.hardware [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.518 2 DEBUG nova.virt.libvirt.vif [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:15:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-480428625',display_name='tempest-ServerDiskConfigTestJSON-server-480428625',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-480428625',id=76,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-uec4q7qr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:15:14Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=fa72d8b8-93c0-417b-9793-ccd611ffbb84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.518 2 DEBUG nova.network.os_vif_util [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.519 2 DEBUG nova.network.os_vif_util [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:9a:b6,bridge_name='br-int',has_traffic_filtering=True,id=1692479a-54ef-45ae-a6a3-39c68408e4f6,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1692479a-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.520 2 DEBUG nova.objects.instance [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'pci_devices' on Instance uuid fa72d8b8-93c0-417b-9793-ccd611ffbb84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.538 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  <uuid>fa72d8b8-93c0-417b-9793-ccd611ffbb84</uuid>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  <name>instance-0000004c</name>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-480428625</nova:name>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:15:18</nova:creationTime>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:15:18 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:        <nova:user uuid="def48c13fd6a43ba88836b753986a731">tempest-ServerDiskConfigTestJSON-1763056137-project-member</nova:user>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:        <nova:project uuid="ffae703d68b24b9c89686c149113fc2b">tempest-ServerDiskConfigTestJSON-1763056137</nova:project>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:        <nova:port uuid="1692479a-54ef-45ae-a6a3-39c68408e4f6">
Oct  2 08:15:18 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <entry name="serial">fa72d8b8-93c0-417b-9793-ccd611ffbb84</entry>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <entry name="uuid">fa72d8b8-93c0-417b-9793-ccd611ffbb84</entry>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk.config"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:41:9a:b6"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <target dev="tap1692479a-54"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/console.log" append="off"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:15:18 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:15:18 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:15:18 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:15:18 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.540 2 DEBUG nova.compute.manager [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Preparing to wait for external event network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.540 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.541 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.541 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.542 2 DEBUG nova.virt.libvirt.vif [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:15:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-480428625',display_name='tempest-ServerDiskConfigTestJSON-server-480428625',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-480428625',id=76,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-uec4q7qr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:15:14Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=fa72d8b8-93c0-417b-9793-ccd611ffbb84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.542 2 DEBUG nova.network.os_vif_util [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.542 2 DEBUG nova.network.os_vif_util [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:9a:b6,bridge_name='br-int',has_traffic_filtering=True,id=1692479a-54ef-45ae-a6a3-39c68408e4f6,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1692479a-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.543 2 DEBUG os_vif [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:9a:b6,bridge_name='br-int',has_traffic_filtering=True,id=1692479a-54ef-45ae-a6a3-39c68408e4f6,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1692479a-54') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.544 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.544 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1692479a-54, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1692479a-54, col_values=(('external_ids', {'iface-id': '1692479a-54ef-45ae-a6a3-39c68408e4f6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:41:9a:b6', 'vm-uuid': 'fa72d8b8-93c0-417b-9793-ccd611ffbb84'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:18 np0005466012 NetworkManager[51207]: <info>  [1759407318.5499] manager: (tap1692479a-54): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.556 2 INFO os_vif [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:9a:b6,bridge_name='br-int',has_traffic_filtering=True,id=1692479a-54ef-45ae-a6a3-39c68408e4f6,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1692479a-54')#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.614 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.614 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.615 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] No VIF found with MAC fa:16:3e:41:9a:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:15:18 np0005466012 nova_compute[192063]: 2025-10-02 12:15:18.615 2 INFO nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Using config drive#033[00m
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.264 2 INFO nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Creating config drive at /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk.config#033[00m
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.269 2 DEBUG oslo_concurrency.processutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1glsbyu5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.398 2 DEBUG oslo_concurrency.processutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1glsbyu5" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:19 np0005466012 kernel: tap1692479a-54: entered promiscuous mode
Oct  2 08:15:19 np0005466012 NetworkManager[51207]: <info>  [1759407319.4658] manager: (tap1692479a-54): new Tun device (/org/freedesktop/NetworkManager/Devices/120)
Oct  2 08:15:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:15:19Z|00266|binding|INFO|Claiming lport 1692479a-54ef-45ae-a6a3-39c68408e4f6 for this chassis.
Oct  2 08:15:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:15:19Z|00267|binding|INFO|1692479a-54ef-45ae-a6a3-39c68408e4f6: Claiming fa:16:3e:41:9a:b6 10.100.0.14
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.475 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:9a:b6 10.100.0.14'], port_security=['fa:16:3e:41:9a:b6 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fa72d8b8-93c0-417b-9793-ccd611ffbb84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=1692479a-54ef-45ae-a6a3-39c68408e4f6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.476 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 1692479a-54ef-45ae-a6a3-39c68408e4f6 in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b bound to our chassis#033[00m
Oct  2 08:15:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:15:19Z|00268|binding|INFO|Setting lport 1692479a-54ef-45ae-a6a3-39c68408e4f6 ovn-installed in OVS
Oct  2 08:15:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:15:19Z|00269|binding|INFO|Setting lport 1692479a-54ef-45ae-a6a3-39c68408e4f6 up in Southbound
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.478 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b#033[00m
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.488 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed5a1c7-1970-489a-9461-8be77489a6ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.489 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6de4737-c1 in ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.494 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6de4737-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.495 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[20b16057-d466-4ee2-9618-0a62ac8146b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.496 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2a3a009c-783b-4b8b-8586-7a34dcd9460b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 systemd-udevd[230142]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:15:19 np0005466012 systemd-machined[152114]: New machine qemu-31-instance-0000004c.
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.507 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[d46e5561-0cef-4158-89dc-4e394658a4f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 NetworkManager[51207]: <info>  [1759407319.5134] device (tap1692479a-54): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:15:19 np0005466012 NetworkManager[51207]: <info>  [1759407319.5149] device (tap1692479a-54): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:15:19 np0005466012 systemd[1]: Started Virtual Machine qemu-31-instance-0000004c.
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.536 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[668b5f72-7a74-4345-bf70-5e6b4d03cf4e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.562 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[b7e3ad29-d805-43ca-8b8d-5574ad5cd09b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 systemd-udevd[230147]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:15:19 np0005466012 NetworkManager[51207]: <info>  [1759407319.5680] manager: (tapd6de4737-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/121)
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.567 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5307dc26-b90a-4d00-be9c-a68f6955ab99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.596 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[0b3c52d8-de24-45bb-8af6-7bacddd2041b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.601 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[f15256c4-a480-4ea2-a5f5-adbc2d8607a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 NetworkManager[51207]: <info>  [1759407319.6240] device (tapd6de4737-c0): carrier: link connected
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.629 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[640341c6-6b51-4a67-b12b-136f75809af1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.646 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[305375a0-5c5d-4afc-b2f0-a3a32c21ba2a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531323, 'reachable_time': 39918, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230175, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.664 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ed77bbde-8457-4338-be4b-c25af3458ebc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:c91f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 531323, 'tstamp': 531323}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230176, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.679 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4e7d4c90-4a42-4a86-8002-3e989df3ec37]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6de4737-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:c9:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531323, 'reachable_time': 39918, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230177, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.711 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3371c3c0-626c-4b3c-8340-092c25f220f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.764 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7126f5a9-017b-4991-b37a-ee4054000564]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.765 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.765 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.766 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6de4737-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:19 np0005466012 NetworkManager[51207]: <info>  [1759407319.7693] manager: (tapd6de4737-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Oct  2 08:15:19 np0005466012 kernel: tapd6de4737-c0: entered promiscuous mode
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.773 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6de4737-c0, col_values=(('external_ids', {'iface-id': 'cc451eb7-bf34-4b54-96d8-b834f11e06fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:15:19Z|00270|binding|INFO|Releasing lport cc451eb7-bf34-4b54-96d8-b834f11e06fb from this chassis (sb_readonly=0)
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.777 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.778 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[097ccd76-c558-434f-a73f-f6dcea391b1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.778 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.pid.haproxy
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID d6de4737-ca60-4c8d-bfd5-687f9366ec8b
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:15:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:19.779 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'env', 'PROCESS_TAG=haproxy-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6de4737-ca60-4c8d-bfd5-687f9366ec8b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.840 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.840 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.841 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:19 np0005466012 nova_compute[192063]: 2025-10-02 12:15:19.841 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:20 np0005466012 podman[230215]: 2025-10-02 12:15:20.163423429 +0000 UTC m=+0.087257329 container create e78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:15:20 np0005466012 podman[230215]: 2025-10-02 12:15:20.100725909 +0000 UTC m=+0.024559839 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:15:20 np0005466012 systemd[1]: Started libpod-conmon-e78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5.scope.
Oct  2 08:15:20 np0005466012 nova_compute[192063]: 2025-10-02 12:15:20.221 2 DEBUG nova.network.neutron [req-0f364d30-7986-4840-b7bb-f5ee11217aff req-9fab67ac-415c-4425-8c83-6f527fdcb3e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Updated VIF entry in instance network info cache for port 1692479a-54ef-45ae-a6a3-39c68408e4f6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:15:20 np0005466012 nova_compute[192063]: 2025-10-02 12:15:20.222 2 DEBUG nova.network.neutron [req-0f364d30-7986-4840-b7bb-f5ee11217aff req-9fab67ac-415c-4425-8c83-6f527fdcb3e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Updating instance_info_cache with network_info: [{"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:20 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:15:20 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ccc115cee90824afb3cad428590bc55c0215dd8eec68d61b715f13c006a34aa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:15:20 np0005466012 nova_compute[192063]: 2025-10-02 12:15:20.244 2 DEBUG oslo_concurrency.lockutils [req-0f364d30-7986-4840-b7bb-f5ee11217aff req-9fab67ac-415c-4425-8c83-6f527fdcb3e6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:20 np0005466012 podman[230215]: 2025-10-02 12:15:20.256305372 +0000 UTC m=+0.180139292 container init e78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:15:20 np0005466012 podman[230215]: 2025-10-02 12:15:20.262522923 +0000 UTC m=+0.186356823 container start e78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:15:20 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[230230]: [NOTICE]   (230234) : New worker (230236) forked
Oct  2 08:15:20 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[230230]: [NOTICE]   (230234) : Loading success.
Oct  2 08:15:20 np0005466012 nova_compute[192063]: 2025-10-02 12:15:20.401 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407320.4009337, fa72d8b8-93c0-417b-9793-ccd611ffbb84 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:20 np0005466012 nova_compute[192063]: 2025-10-02 12:15:20.402 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] VM Started (Lifecycle Event)#033[00m
Oct  2 08:15:20 np0005466012 nova_compute[192063]: 2025-10-02 12:15:20.423 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:20 np0005466012 nova_compute[192063]: 2025-10-02 12:15:20.428 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407320.401065, fa72d8b8-93c0-417b-9793-ccd611ffbb84 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:20 np0005466012 nova_compute[192063]: 2025-10-02 12:15:20.429 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:15:20 np0005466012 nova_compute[192063]: 2025-10-02 12:15:20.449 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:20 np0005466012 nova_compute[192063]: 2025-10-02 12:15:20.452 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:15:20 np0005466012 nova_compute[192063]: 2025-10-02 12:15:20.481 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:15:20 np0005466012 nova_compute[192063]: 2025-10-02 12:15:20.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:20 np0005466012 nova_compute[192063]: 2025-10-02 12:15:20.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:21 np0005466012 nova_compute[192063]: 2025-10-02 12:15:21.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:21 np0005466012 nova_compute[192063]: 2025-10-02 12:15:21.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:21 np0005466012 nova_compute[192063]: 2025-10-02 12:15:21.902 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:21 np0005466012 nova_compute[192063]: 2025-10-02 12:15:21.903 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:21 np0005466012 nova_compute[192063]: 2025-10-02 12:15:21.903 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:21 np0005466012 nova_compute[192063]: 2025-10-02 12:15:21.904 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.008 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.080 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.082 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.144 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.302 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.304 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5658MB free_disk=73.39126968383789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.304 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.304 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.430 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance fa72d8b8-93c0-417b-9793-ccd611ffbb84 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.430 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.430 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.573 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.588 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.630 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:15:22 np0005466012 nova_compute[192063]: 2025-10-02 12:15:22.631 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:23 np0005466012 nova_compute[192063]: 2025-10-02 12:15:23.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:23 np0005466012 podman[230254]: 2025-10-02 12:15:23.936649278 +0000 UTC m=+0.082373714 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, distribution-scope=public, architecture=x86_64, release=1755695350, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Oct  2 08:15:23 np0005466012 podman[230253]: 2025-10-02 12:15:23.936736121 +0000 UTC m=+0.081120220 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.290 2 DEBUG nova.compute.manager [req-c9bb9a8d-ed74-46d4-a96c-f9835831913e req-d0d7466c-6115-49fb-94ac-959a97abccd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Received event network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.290 2 DEBUG oslo_concurrency.lockutils [req-c9bb9a8d-ed74-46d4-a96c-f9835831913e req-d0d7466c-6115-49fb-94ac-959a97abccd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.291 2 DEBUG oslo_concurrency.lockutils [req-c9bb9a8d-ed74-46d4-a96c-f9835831913e req-d0d7466c-6115-49fb-94ac-959a97abccd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.291 2 DEBUG oslo_concurrency.lockutils [req-c9bb9a8d-ed74-46d4-a96c-f9835831913e req-d0d7466c-6115-49fb-94ac-959a97abccd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.291 2 DEBUG nova.compute.manager [req-c9bb9a8d-ed74-46d4-a96c-f9835831913e req-d0d7466c-6115-49fb-94ac-959a97abccd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Processing event network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.292 2 DEBUG nova.compute.manager [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.295 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407325.2953238, fa72d8b8-93c0-417b-9793-ccd611ffbb84 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.295 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.297 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.300 2 INFO nova.virt.libvirt.driver [-] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Instance spawned successfully.#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.300 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.341 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.346 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.348 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.348 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.348 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.349 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.349 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.349 2 DEBUG nova.virt.libvirt.driver [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.388 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.474 2 INFO nova.compute.manager [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Took 11.12 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.475 2 DEBUG nova.compute.manager [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.572 2 INFO nova.compute.manager [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Took 11.88 seconds to build instance.#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.613 2 DEBUG oslo_concurrency.lockutils [None req-8f8be2d1-8302-4180-b201-f351ae86e95e def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.017s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.630 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.630 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:15:25 np0005466012 nova_compute[192063]: 2025-10-02 12:15:25.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:28 np0005466012 podman[230295]: 2025-10-02 12:15:28.142011393 +0000 UTC m=+0.059845272 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:15:28 np0005466012 podman[230296]: 2025-10-02 12:15:28.152726239 +0000 UTC m=+0.061626691 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:15:28 np0005466012 nova_compute[192063]: 2025-10-02 12:15:28.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:29 np0005466012 nova_compute[192063]: 2025-10-02 12:15:29.149 2 DEBUG nova.compute.manager [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Received event network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:29 np0005466012 nova_compute[192063]: 2025-10-02 12:15:29.150 2 DEBUG oslo_concurrency.lockutils [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:29 np0005466012 nova_compute[192063]: 2025-10-02 12:15:29.150 2 DEBUG oslo_concurrency.lockutils [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:29 np0005466012 nova_compute[192063]: 2025-10-02 12:15:29.150 2 DEBUG oslo_concurrency.lockutils [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:29 np0005466012 nova_compute[192063]: 2025-10-02 12:15:29.151 2 DEBUG nova.compute.manager [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] No waiting events found dispatching network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:29 np0005466012 nova_compute[192063]: 2025-10-02 12:15:29.151 2 WARNING nova.compute.manager [req-ebfe5c63-739d-46bf-8ad9-201110bd9e5f req-80467203-c522-410d-a303-07272bdbc616 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Received unexpected event network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:15:29 np0005466012 nova_compute[192063]: 2025-10-02 12:15:29.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:15:29 np0005466012 nova_compute[192063]: 2025-10-02 12:15:29.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:15:29 np0005466012 nova_compute[192063]: 2025-10-02 12:15:29.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:15:29 np0005466012 nova_compute[192063]: 2025-10-02 12:15:29.860 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:29 np0005466012 nova_compute[192063]: 2025-10-02 12:15:29.860 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:29 np0005466012 nova_compute[192063]: 2025-10-02 12:15:29.860 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:15:29 np0005466012 nova_compute[192063]: 2025-10-02 12:15:29.860 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid fa72d8b8-93c0-417b-9793-ccd611ffbb84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:30 np0005466012 nova_compute[192063]: 2025-10-02 12:15:30.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:33 np0005466012 nova_compute[192063]: 2025-10-02 12:15:33.167 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Updating instance_info_cache with network_info: [{"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:33 np0005466012 nova_compute[192063]: 2025-10-02 12:15:33.194 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:33 np0005466012 nova_compute[192063]: 2025-10-02 12:15:33.194 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:15:33 np0005466012 nova_compute[192063]: 2025-10-02 12:15:33.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:33 np0005466012 nova_compute[192063]: 2025-10-02 12:15:33.967 2 DEBUG oslo_concurrency.lockutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:33 np0005466012 nova_compute[192063]: 2025-10-02 12:15:33.967 2 DEBUG oslo_concurrency.lockutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquired lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:33 np0005466012 nova_compute[192063]: 2025-10-02 12:15:33.968 2 DEBUG nova.network.neutron [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:15:35 np0005466012 nova_compute[192063]: 2025-10-02 12:15:35.371 2 DEBUG nova.network.neutron [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Updating instance_info_cache with network_info: [{"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:35 np0005466012 nova_compute[192063]: 2025-10-02 12:15:35.451 2 DEBUG oslo_concurrency.lockutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Releasing lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:35 np0005466012 nova_compute[192063]: 2025-10-02 12:15:35.614 2 DEBUG nova.virt.libvirt.driver [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  2 08:15:35 np0005466012 nova_compute[192063]: 2025-10-02 12:15:35.615 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Creating file /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/8b733c86423c48f48880f54bf7315111.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Oct  2 08:15:35 np0005466012 nova_compute[192063]: 2025-10-02 12:15:35.615 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/8b733c86423c48f48880f54bf7315111.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:35 np0005466012 nova_compute[192063]: 2025-10-02 12:15:35.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:36 np0005466012 nova_compute[192063]: 2025-10-02 12:15:36.066 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/8b733c86423c48f48880f54bf7315111.tmp" returned: 1 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:36 np0005466012 nova_compute[192063]: 2025-10-02 12:15:36.067 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/8b733c86423c48f48880f54bf7315111.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  2 08:15:36 np0005466012 nova_compute[192063]: 2025-10-02 12:15:36.067 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Creating directory /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  2 08:15:36 np0005466012 nova_compute[192063]: 2025-10-02 12:15:36.068 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:36 np0005466012 nova_compute[192063]: 2025-10-02 12:15:36.300 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84" returned: 0 in 0.232s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:36 np0005466012 nova_compute[192063]: 2025-10-02 12:15:36.304 2 DEBUG nova.virt.libvirt.driver [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:15:38 np0005466012 nova_compute[192063]: 2025-10-02 12:15:38.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:40 np0005466012 ovn_controller[94284]: 2025-10-02T12:15:40Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:41:9a:b6 10.100.0.14
Oct  2 08:15:40 np0005466012 ovn_controller[94284]: 2025-10-02T12:15:40Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:41:9a:b6 10.100.0.14
Oct  2 08:15:40 np0005466012 nova_compute[192063]: 2025-10-02 12:15:40.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:42 np0005466012 podman[230353]: 2025-10-02 12:15:42.166354683 +0000 UTC m=+0.071993618 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:15:42 np0005466012 podman[230354]: 2025-10-02 12:15:42.186932182 +0000 UTC m=+0.097692647 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:15:43 np0005466012 nova_compute[192063]: 2025-10-02 12:15:43.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:45 np0005466012 nova_compute[192063]: 2025-10-02 12:15:45.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:46 np0005466012 podman[230402]: 2025-10-02 12:15:46.15688633 +0000 UTC m=+0.070352182 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:15:46 np0005466012 nova_compute[192063]: 2025-10-02 12:15:46.346 2 DEBUG nova.virt.libvirt.driver [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:15:48 np0005466012 nova_compute[192063]: 2025-10-02 12:15:48.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:48 np0005466012 kernel: tap1692479a-54 (unregistering): left promiscuous mode
Oct  2 08:15:48 np0005466012 NetworkManager[51207]: <info>  [1759407348.6203] device (tap1692479a-54): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:15:48 np0005466012 ovn_controller[94284]: 2025-10-02T12:15:48Z|00271|binding|INFO|Releasing lport 1692479a-54ef-45ae-a6a3-39c68408e4f6 from this chassis (sb_readonly=0)
Oct  2 08:15:48 np0005466012 ovn_controller[94284]: 2025-10-02T12:15:48Z|00272|binding|INFO|Setting lport 1692479a-54ef-45ae-a6a3-39c68408e4f6 down in Southbound
Oct  2 08:15:48 np0005466012 ovn_controller[94284]: 2025-10-02T12:15:48Z|00273|binding|INFO|Removing iface tap1692479a-54 ovn-installed in OVS
Oct  2 08:15:48 np0005466012 nova_compute[192063]: 2025-10-02 12:15:48.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:48 np0005466012 nova_compute[192063]: 2025-10-02 12:15:48.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:48 np0005466012 nova_compute[192063]: 2025-10-02 12:15:48.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:48 np0005466012 podman[230424]: 2025-10-02 12:15:48.72030054 +0000 UTC m=+0.068039709 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:15:48 np0005466012 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Oct  2 08:15:48 np0005466012 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000004c.scope: Consumed 14.094s CPU time.
Oct  2 08:15:48 np0005466012 systemd-machined[152114]: Machine qemu-31-instance-0000004c terminated.
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.360 2 INFO nova.virt.libvirt.driver [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.367 2 INFO nova.virt.libvirt.driver [-] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Instance destroyed successfully.#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.370 2 DEBUG nova.virt.libvirt.vif [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:15:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-480428625',display_name='tempest-ServerDiskConfigTestJSON-server-480428625',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-480428625',id=76,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:15:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-uec4q7qr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:15:33Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=fa72d8b8-93c0-417b-9793-ccd611ffbb84,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "vif_mac": "fa:16:3e:41:9a:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.370 2 DEBUG nova.network.os_vif_util [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "vif_mac": "fa:16:3e:41:9a:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.371 2 DEBUG nova.network.os_vif_util [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:41:9a:b6,bridge_name='br-int',has_traffic_filtering=True,id=1692479a-54ef-45ae-a6a3-39c68408e4f6,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1692479a-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.371 2 DEBUG os_vif [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:9a:b6,bridge_name='br-int',has_traffic_filtering=True,id=1692479a-54ef-45ae-a6a3-39c68408e4f6,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1692479a-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.374 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1692479a-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.379 2 INFO os_vif [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:9a:b6,bridge_name='br-int',has_traffic_filtering=True,id=1692479a-54ef-45ae-a6a3-39c68408e4f6,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1692479a-54')#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.384 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.448 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.450 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.538 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.540 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Copying file /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84_resize/disk to 192.168.122.100:/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:15:49 np0005466012 nova_compute[192063]: 2025-10-02 12:15:49.540 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84_resize/disk 192.168.122.100:/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.272 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:9a:b6 10.100.0.14'], port_security=['fa:16:3e:41:9a:b6 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fa72d8b8-93c0-417b-9793-ccd611ffbb84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffae703d68b24b9c89686c149113fc2b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64970375-b20e-4c18-bfb5-2a0465f8be7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9476db85-7514-407a-b55a-3d3c703e8f7b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=1692479a-54ef-45ae-a6a3-39c68408e4f6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.273 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 1692479a-54ef-45ae-a6a3-39c68408e4f6 in datapath d6de4737-ca60-4c8d-bfd5-687f9366ec8b unbound from our chassis#033[00m
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.275 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.276 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3d7684b0-b57c-42e0-a77c-685647be37f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.276 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b namespace which is not needed anymore#033[00m
Oct  2 08:15:50 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[230230]: [NOTICE]   (230234) : haproxy version is 2.8.14-c23fe91
Oct  2 08:15:50 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[230230]: [NOTICE]   (230234) : path to executable is /usr/sbin/haproxy
Oct  2 08:15:50 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[230230]: [WARNING]  (230234) : Exiting Master process...
Oct  2 08:15:50 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[230230]: [ALERT]    (230234) : Current worker (230236) exited with code 143 (Terminated)
Oct  2 08:15:50 np0005466012 neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b[230230]: [WARNING]  (230234) : All workers exited. Exiting... (0)
Oct  2 08:15:50 np0005466012 systemd[1]: libpod-e78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5.scope: Deactivated successfully.
Oct  2 08:15:50 np0005466012 podman[230491]: 2025-10-02 12:15:50.481649918 +0000 UTC m=+0.116414793 container died e78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:15:50 np0005466012 nova_compute[192063]: 2025-10-02 12:15:50.554 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "scp -r /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84_resize/disk 192.168.122.100:/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk" returned: 0 in 1.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:50 np0005466012 nova_compute[192063]: 2025-10-02 12:15:50.555 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Copying file /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:15:50 np0005466012 nova_compute[192063]: 2025-10-02 12:15:50.555 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84_resize/disk.config 192.168.122.100:/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:50 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5-userdata-shm.mount: Deactivated successfully.
Oct  2 08:15:50 np0005466012 systemd[1]: var-lib-containers-storage-overlay-9ccc115cee90824afb3cad428590bc55c0215dd8eec68d61b715f13c006a34aa-merged.mount: Deactivated successfully.
Oct  2 08:15:50 np0005466012 podman[230491]: 2025-10-02 12:15:50.767534667 +0000 UTC m=+0.402299582 container cleanup e78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:15:50 np0005466012 systemd[1]: libpod-conmon-e78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5.scope: Deactivated successfully.
Oct  2 08:15:50 np0005466012 nova_compute[192063]: 2025-10-02 12:15:50.776 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "scp -C -r /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84_resize/disk.config 192.168.122.100:/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk.config" returned: 0 in 0.220s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:50 np0005466012 nova_compute[192063]: 2025-10-02 12:15:50.777 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Copying file /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:15:50 np0005466012 nova_compute[192063]: 2025-10-02 12:15:50.777 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84_resize/disk.info 192.168.122.100:/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:50 np0005466012 podman[230525]: 2025-10-02 12:15:50.855866884 +0000 UTC m=+0.060237053 container remove e78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.862 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7fa1ac9d-f4d4-4e0d-8a45-8bbafc05baed]: (4, ('Thu Oct  2 12:15:50 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (e78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5)\ne78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5\nThu Oct  2 12:15:50 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b (e78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5)\ne78723e8e34f677008ecbf2523c38f3b0d4d0fec9b9b855bd9ccdc67a9c8a0f5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.865 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7f974208-31bf-4ed5-8ac9-d791ad24df89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.867 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6de4737-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:50 np0005466012 kernel: tapd6de4737-c0: left promiscuous mode
Oct  2 08:15:50 np0005466012 nova_compute[192063]: 2025-10-02 12:15:50.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:50 np0005466012 nova_compute[192063]: 2025-10-02 12:15:50.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.914 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aa27da70-7794-4b15-96cc-e5af72d3c07b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.938 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[532803bd-8434-4898-a9f5-a7caea642ddf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.939 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4b5ae453-8247-4f11-95a5-ef74390ac5c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.956 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5b91b1b3-29c4-459a-a0bd-fb3d40e9b623]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531316, 'reachable_time': 19832, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230542, 'error': None, 'target': 'ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.958 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6de4737-ca60-4c8d-bfd5-687f9366ec8b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:15:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:50.958 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[5017eec3-2c47-453e-ab84-3ae8458cf97c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:15:50 np0005466012 systemd[1]: run-netns-ovnmeta\x2dd6de4737\x2dca60\x2d4c8d\x2dbfd5\x2d687f9366ec8b.mount: Deactivated successfully.
Oct  2 08:15:50 np0005466012 nova_compute[192063]: 2025-10-02 12:15:50.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:51 np0005466012 nova_compute[192063]: 2025-10-02 12:15:51.010 2 DEBUG oslo_concurrency.processutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] CMD "scp -C -r /var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84_resize/disk.info 192.168.122.100:/var/lib/nova/instances/fa72d8b8-93c0-417b-9793-ccd611ffbb84/disk.info" returned: 0 in 0.233s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:51 np0005466012 nova_compute[192063]: 2025-10-02 12:15:51.673 2 DEBUG neutronclient.v2_0.client [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 1692479a-54ef-45ae-a6a3-39c68408e4f6 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  2 08:15:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:51.746 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:15:51 np0005466012 nova_compute[192063]: 2025-10-02 12:15:51.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:51.747 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:15:52 np0005466012 nova_compute[192063]: 2025-10-02 12:15:52.315 2 DEBUG oslo_concurrency.lockutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:52 np0005466012 nova_compute[192063]: 2025-10-02 12:15:52.316 2 DEBUG oslo_concurrency.lockutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:52 np0005466012 nova_compute[192063]: 2025-10-02 12:15:52.316 2 DEBUG oslo_concurrency.lockutils [None req-36c1f53d-67b1-4e3f-9245-73827ca1dac5 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:52 np0005466012 nova_compute[192063]: 2025-10-02 12:15:52.838 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "53494088-1a72-4178-888a-661da86f801a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:52 np0005466012 nova_compute[192063]: 2025-10-02 12:15:52.839 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "53494088-1a72-4178-888a-661da86f801a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:52 np0005466012 nova_compute[192063]: 2025-10-02 12:15:52.856 2 DEBUG nova.compute.manager [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:15:52 np0005466012 nova_compute[192063]: 2025-10-02 12:15:52.965 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:52 np0005466012 nova_compute[192063]: 2025-10-02 12:15:52.965 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:52 np0005466012 nova_compute[192063]: 2025-10-02 12:15:52.971 2 DEBUG nova.virt.hardware [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:15:52 np0005466012 nova_compute[192063]: 2025-10-02 12:15:52.971 2 INFO nova.compute.claims [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.120 2 DEBUG nova.compute.provider_tree [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.139 2 DEBUG nova.scheduler.client.report [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.176 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.177 2 DEBUG nova.compute.manager [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.272 2 DEBUG nova.compute.manager [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.272 2 DEBUG nova.network.neutron [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.306 2 INFO nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.346 2 DEBUG nova.compute.manager [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.691 2 DEBUG nova.compute.manager [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.692 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.693 2 INFO nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Creating image(s)#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.694 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "/var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.694 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "/var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.695 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "/var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.713 2 DEBUG oslo_concurrency.processutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.810 2 DEBUG oslo_concurrency.processutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.812 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.813 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.832 2 DEBUG oslo_concurrency.processutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.890 2 DEBUG oslo_concurrency.processutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.891 2 DEBUG oslo_concurrency.processutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.929 2 DEBUG oslo_concurrency.processutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.930 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.931 2 DEBUG oslo_concurrency.processutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.987 2 DEBUG oslo_concurrency.processutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.988 2 DEBUG nova.virt.disk.api [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Checking if we can resize image /var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:15:53 np0005466012 nova_compute[192063]: 2025-10-02 12:15:53.989 2 DEBUG oslo_concurrency.processutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.049 2 DEBUG oslo_concurrency.processutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.050 2 DEBUG nova.virt.disk.api [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Cannot resize image /var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.051 2 DEBUG nova.objects.instance [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'migration_context' on Instance uuid 53494088-1a72-4178-888a-661da86f801a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.066 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.068 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Ensure instance console log exists: /var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.068 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.069 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.069 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:54 np0005466012 podman[230558]: 2025-10-02 12:15:54.167971231 +0000 UTC m=+0.083489114 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:15:54 np0005466012 podman[230559]: 2025-10-02 12:15:54.177955127 +0000 UTC m=+0.079829494 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, vcs-type=git, version=9.6, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.196 2 DEBUG nova.policy [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c0ba8ddde504431b51e593c63f40361', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd5db64e6714348c1a7f57bb53de80915', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.877 2 DEBUG nova.compute.manager [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Received event network-vif-unplugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.877 2 DEBUG oslo_concurrency.lockutils [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.878 2 DEBUG oslo_concurrency.lockutils [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.878 2 DEBUG oslo_concurrency.lockutils [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.878 2 DEBUG nova.compute.manager [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] No waiting events found dispatching network-vif-unplugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.878 2 WARNING nova.compute.manager [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Received unexpected event network-vif-unplugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.879 2 DEBUG nova.compute.manager [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Received event network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.879 2 DEBUG oslo_concurrency.lockutils [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.879 2 DEBUG oslo_concurrency.lockutils [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.879 2 DEBUG oslo_concurrency.lockutils [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.879 2 DEBUG nova.compute.manager [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] No waiting events found dispatching network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:54 np0005466012 nova_compute[192063]: 2025-10-02 12:15:54.880 2 WARNING nova.compute.manager [req-8af04a29-47f0-4c00-998b-64033fb058c1 req-2f9e2aa5-7255-44d6-8af7-ba6627a36de7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Received unexpected event network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:15:55 np0005466012 nova_compute[192063]: 2025-10-02 12:15:55.866 2 DEBUG nova.network.neutron [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Successfully created port: 66efe9be-fb18-4baa-8d6f-d131de2e9283 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:15:55 np0005466012 nova_compute[192063]: 2025-10-02 12:15:55.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:15:56.749 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:15:56 np0005466012 nova_compute[192063]: 2025-10-02 12:15:56.804 2 DEBUG nova.network.neutron [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Successfully updated port: 66efe9be-fb18-4baa-8d6f-d131de2e9283 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:15:56 np0005466012 nova_compute[192063]: 2025-10-02 12:15:56.824 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "refresh_cache-53494088-1a72-4178-888a-661da86f801a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:56 np0005466012 nova_compute[192063]: 2025-10-02 12:15:56.824 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquired lock "refresh_cache-53494088-1a72-4178-888a-661da86f801a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:56 np0005466012 nova_compute[192063]: 2025-10-02 12:15:56.825 2 DEBUG nova.network.neutron [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:15:57 np0005466012 nova_compute[192063]: 2025-10-02 12:15:57.005 2 DEBUG nova.compute.manager [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Received event network-changed-1692479a-54ef-45ae-a6a3-39c68408e4f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:57 np0005466012 nova_compute[192063]: 2025-10-02 12:15:57.006 2 DEBUG nova.compute.manager [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Refreshing instance network info cache due to event network-changed-1692479a-54ef-45ae-a6a3-39c68408e4f6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:15:57 np0005466012 nova_compute[192063]: 2025-10-02 12:15:57.006 2 DEBUG oslo_concurrency.lockutils [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:57 np0005466012 nova_compute[192063]: 2025-10-02 12:15:57.006 2 DEBUG oslo_concurrency.lockutils [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:57 np0005466012 nova_compute[192063]: 2025-10-02 12:15:57.007 2 DEBUG nova.network.neutron [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Refreshing network info cache for port 1692479a-54ef-45ae-a6a3-39c68408e4f6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:15:57 np0005466012 nova_compute[192063]: 2025-10-02 12:15:57.008 2 DEBUG nova.network.neutron [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:15:59 np0005466012 podman[230600]: 2025-10-02 12:15:59.135042213 +0000 UTC m=+0.049796834 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:15:59 np0005466012 podman[230599]: 2025-10-02 12:15:59.144168365 +0000 UTC m=+0.057184849 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3)
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.429 2 DEBUG nova.network.neutron [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Updating instance_info_cache with network_info: [{"id": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "address": "fa:16:3e:8d:74:ae", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66efe9be-fb", "ovs_interfaceid": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.447 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Releasing lock "refresh_cache-53494088-1a72-4178-888a-661da86f801a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.447 2 DEBUG nova.compute.manager [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Instance network_info: |[{"id": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "address": "fa:16:3e:8d:74:ae", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66efe9be-fb", "ovs_interfaceid": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.449 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Start _get_guest_xml network_info=[{"id": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "address": "fa:16:3e:8d:74:ae", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66efe9be-fb", "ovs_interfaceid": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.454 2 WARNING nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.459 2 DEBUG nova.virt.libvirt.host [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.460 2 DEBUG nova.virt.libvirt.host [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.462 2 DEBUG nova.virt.libvirt.host [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.462 2 DEBUG nova.virt.libvirt.host [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.463 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.463 2 DEBUG nova.virt.hardware [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.464 2 DEBUG nova.virt.hardware [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.464 2 DEBUG nova.virt.hardware [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.464 2 DEBUG nova.virt.hardware [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.464 2 DEBUG nova.virt.hardware [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.465 2 DEBUG nova.virt.hardware [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.465 2 DEBUG nova.virt.hardware [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.465 2 DEBUG nova.virt.hardware [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.465 2 DEBUG nova.virt.hardware [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.466 2 DEBUG nova.virt.hardware [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.466 2 DEBUG nova.virt.hardware [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.469 2 DEBUG nova.virt.libvirt.vif [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-250873745',display_name='tempest-DeleteServersTestJSON-server-250873745',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-250873745',id=79,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-lykq5lm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982
240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:15:53Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=53494088-1a72-4178-888a-661da86f801a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "address": "fa:16:3e:8d:74:ae", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66efe9be-fb", "ovs_interfaceid": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.470 2 DEBUG nova.network.os_vif_util [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "address": "fa:16:3e:8d:74:ae", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66efe9be-fb", "ovs_interfaceid": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.470 2 DEBUG nova.network.os_vif_util [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8d:74:ae,bridge_name='br-int',has_traffic_filtering=True,id=66efe9be-fb18-4baa-8d6f-d131de2e9283,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66efe9be-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.471 2 DEBUG nova.objects.instance [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'pci_devices' on Instance uuid 53494088-1a72-4178-888a-661da86f801a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.498 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  <uuid>53494088-1a72-4178-888a-661da86f801a</uuid>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  <name>instance-0000004f</name>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <nova:name>tempest-DeleteServersTestJSON-server-250873745</nova:name>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:15:59</nova:creationTime>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:15:59 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:        <nova:user uuid="0c0ba8ddde504431b51e593c63f40361">tempest-DeleteServersTestJSON-548982240-project-member</nova:user>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:        <nova:project uuid="d5db64e6714348c1a7f57bb53de80915">tempest-DeleteServersTestJSON-548982240</nova:project>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:        <nova:port uuid="66efe9be-fb18-4baa-8d6f-d131de2e9283">
Oct  2 08:15:59 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <entry name="serial">53494088-1a72-4178-888a-661da86f801a</entry>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <entry name="uuid">53494088-1a72-4178-888a-661da86f801a</entry>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk.config"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:8d:74:ae"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <target dev="tap66efe9be-fb"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/console.log" append="off"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:15:59 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:15:59 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:15:59 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:15:59 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.499 2 DEBUG nova.compute.manager [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Preparing to wait for external event network-vif-plugged-66efe9be-fb18-4baa-8d6f-d131de2e9283 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.499 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "53494088-1a72-4178-888a-661da86f801a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.500 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "53494088-1a72-4178-888a-661da86f801a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.500 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "53494088-1a72-4178-888a-661da86f801a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.500 2 DEBUG nova.virt.libvirt.vif [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-250873745',display_name='tempest-DeleteServersTestJSON-server-250873745',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-250873745',id=79,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-lykq5lm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJ
SON-548982240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:15:53Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=53494088-1a72-4178-888a-661da86f801a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "address": "fa:16:3e:8d:74:ae", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66efe9be-fb", "ovs_interfaceid": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.501 2 DEBUG nova.network.os_vif_util [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "address": "fa:16:3e:8d:74:ae", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66efe9be-fb", "ovs_interfaceid": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.501 2 DEBUG nova.network.os_vif_util [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8d:74:ae,bridge_name='br-int',has_traffic_filtering=True,id=66efe9be-fb18-4baa-8d6f-d131de2e9283,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66efe9be-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.501 2 DEBUG os_vif [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:74:ae,bridge_name='br-int',has_traffic_filtering=True,id=66efe9be-fb18-4baa-8d6f-d131de2e9283,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66efe9be-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.502 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.502 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.505 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66efe9be-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.505 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap66efe9be-fb, col_values=(('external_ids', {'iface-id': '66efe9be-fb18-4baa-8d6f-d131de2e9283', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8d:74:ae', 'vm-uuid': '53494088-1a72-4178-888a-661da86f801a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:15:59 np0005466012 NetworkManager[51207]: <info>  [1759407359.5073] manager: (tap66efe9be-fb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.514 2 INFO os_vif [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:74:ae,bridge_name='br-int',has_traffic_filtering=True,id=66efe9be-fb18-4baa-8d6f-d131de2e9283,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66efe9be-fb')#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.562 2 DEBUG nova.compute.manager [req-2d7822b2-ec0b-413e-835f-92ce3fc6e3dc req-7eba754d-f65c-4405-bfd0-ded3a4d556fd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Received event network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.563 2 DEBUG oslo_concurrency.lockutils [req-2d7822b2-ec0b-413e-835f-92ce3fc6e3dc req-7eba754d-f65c-4405-bfd0-ded3a4d556fd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.563 2 DEBUG oslo_concurrency.lockutils [req-2d7822b2-ec0b-413e-835f-92ce3fc6e3dc req-7eba754d-f65c-4405-bfd0-ded3a4d556fd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.563 2 DEBUG oslo_concurrency.lockutils [req-2d7822b2-ec0b-413e-835f-92ce3fc6e3dc req-7eba754d-f65c-4405-bfd0-ded3a4d556fd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.564 2 DEBUG nova.compute.manager [req-2d7822b2-ec0b-413e-835f-92ce3fc6e3dc req-7eba754d-f65c-4405-bfd0-ded3a4d556fd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] No waiting events found dispatching network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.564 2 WARNING nova.compute.manager [req-2d7822b2-ec0b-413e-835f-92ce3fc6e3dc req-7eba754d-f65c-4405-bfd0-ded3a4d556fd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Received unexpected event network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 for instance with vm_state active and task_state resize_finish.#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.634 2 DEBUG nova.network.neutron [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Updated VIF entry in instance network info cache for port 1692479a-54ef-45ae-a6a3-39c68408e4f6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.634 2 DEBUG nova.network.neutron [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Updating instance_info_cache with network_info: [{"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.670 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.672 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.672 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No VIF found with MAC fa:16:3e:8d:74:ae, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.673 2 INFO nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Using config drive#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.701 2 DEBUG oslo_concurrency.lockutils [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.702 2 DEBUG nova.compute.manager [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Received event network-changed-66efe9be-fb18-4baa-8d6f-d131de2e9283 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.703 2 DEBUG nova.compute.manager [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Refreshing instance network info cache due to event network-changed-66efe9be-fb18-4baa-8d6f-d131de2e9283. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.703 2 DEBUG oslo_concurrency.lockutils [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-53494088-1a72-4178-888a-661da86f801a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.703 2 DEBUG oslo_concurrency.lockutils [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-53494088-1a72-4178-888a-661da86f801a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:15:59 np0005466012 nova_compute[192063]: 2025-10-02 12:15:59.703 2 DEBUG nova.network.neutron [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Refreshing network info cache for port 66efe9be-fb18-4baa-8d6f-d131de2e9283 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:16:00 np0005466012 nova_compute[192063]: 2025-10-02 12:16:00.007 2 INFO nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Creating config drive at /var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk.config#033[00m
Oct  2 08:16:00 np0005466012 nova_compute[192063]: 2025-10-02 12:16:00.014 2 DEBUG oslo_concurrency.processutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_xe9d0j3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:00 np0005466012 nova_compute[192063]: 2025-10-02 12:16:00.142 2 DEBUG oslo_concurrency.processutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_xe9d0j3" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:00 np0005466012 kernel: tap66efe9be-fb: entered promiscuous mode
Oct  2 08:16:00 np0005466012 NetworkManager[51207]: <info>  [1759407360.2041] manager: (tap66efe9be-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/124)
Oct  2 08:16:00 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:00Z|00274|binding|INFO|Claiming lport 66efe9be-fb18-4baa-8d6f-d131de2e9283 for this chassis.
Oct  2 08:16:00 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:00Z|00275|binding|INFO|66efe9be-fb18-4baa-8d6f-d131de2e9283: Claiming fa:16:3e:8d:74:ae 10.100.0.9
Oct  2 08:16:00 np0005466012 nova_compute[192063]: 2025-10-02 12:16:00.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005466012 nova_compute[192063]: 2025-10-02 12:16:00.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.219 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:74:ae 10.100.0.9'], port_security=['fa:16:3e:8d:74:ae 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '53494088-1a72-4178-888a-661da86f801a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '2', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=66efe9be-fb18-4baa-8d6f-d131de2e9283) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.220 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 66efe9be-fb18-4baa-8d6f-d131de2e9283 in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 bound to our chassis#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.222 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97b8849-844c-4190-8b13-fd7a2d073ce8#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.232 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f9c89a89-6e2d-422b-9fb6-829778375443]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.232 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb97b8849-81 in ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.234 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb97b8849-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.234 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1050a312-f1bb-4aee-8ee3-30dfbaa0ffad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.235 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[735f251e-f764-4785-ab4b-dc18da2821be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 systemd-machined[152114]: New machine qemu-32-instance-0000004f.
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.248 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[62623eaa-c55d-47a9-8752-bb533d074a04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 nova_compute[192063]: 2025-10-02 12:16:00.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005466012 systemd[1]: Started Virtual Machine qemu-32-instance-0000004f.
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.260 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[40a2cecb-247e-4414-a9df-635dc8eccead]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:00Z|00276|binding|INFO|Setting lport 66efe9be-fb18-4baa-8d6f-d131de2e9283 ovn-installed in OVS
Oct  2 08:16:00 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:00Z|00277|binding|INFO|Setting lport 66efe9be-fb18-4baa-8d6f-d131de2e9283 up in Southbound
Oct  2 08:16:00 np0005466012 nova_compute[192063]: 2025-10-02 12:16:00.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005466012 systemd-udevd[230670]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.292 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[4e9b2010-8afc-46a0-bd1a-57583fe8bf38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 NetworkManager[51207]: <info>  [1759407360.2955] device (tap66efe9be-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:16:00 np0005466012 NetworkManager[51207]: <info>  [1759407360.2962] device (tap66efe9be-fb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.297 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[08553bb7-efe6-406b-aca2-d8abefe3a026]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 systemd-udevd[230673]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:00 np0005466012 NetworkManager[51207]: <info>  [1759407360.3000] manager: (tapb97b8849-80): new Veth device (/org/freedesktop/NetworkManager/Devices/125)
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.327 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[f408ac05-5bcf-474d-a462-c84311473bc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.330 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[a251508d-d7d6-4580-b6d0-19a41ec71e23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 NetworkManager[51207]: <info>  [1759407360.3501] device (tapb97b8849-80): carrier: link connected
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.353 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[92be0d4a-95c6-4425-a1ed-0884148e73fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.370 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d4fecb19-843e-4458-82fb-117bdc12cb4f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97b8849-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:e0:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535396, 'reachable_time': 37903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230698, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.382 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d000b308-1469-4356-9dad-819416583efb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:e0b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 535396, 'tstamp': 535396}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230699, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.397 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b05e517f-0414-4047-a4d4-1945dac68c2e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97b8849-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:e0:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535396, 'reachable_time': 37903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230700, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.423 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[66543cf9-4040-40bb-82b4-c4eb6d556c5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.472 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a3b8bd-108a-4b09-8bdc-baf5dd2281ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.474 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97b8849-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.474 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.474 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97b8849-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:00 np0005466012 kernel: tapb97b8849-80: entered promiscuous mode
Oct  2 08:16:00 np0005466012 nova_compute[192063]: 2025-10-02 12:16:00.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005466012 NetworkManager[51207]: <info>  [1759407360.4769] manager: (tapb97b8849-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.481 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97b8849-80, col_values=(('external_ids', {'iface-id': '055cf080-4472-4807-a697-69de84e96953'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:00 np0005466012 nova_compute[192063]: 2025-10-02 12:16:00.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:00Z|00278|binding|INFO|Releasing lport 055cf080-4472-4807-a697-69de84e96953 from this chassis (sb_readonly=0)
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.484 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.485 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8b85b2e1-020e-470f-b059-ab21087f64e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.486 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-b97b8849-844c-4190-8b13-fd7a2d073ce8
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID b97b8849-844c-4190-8b13-fd7a2d073ce8
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:16:00 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:00.487 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'env', 'PROCESS_TAG=haproxy-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b97b8849-844c-4190-8b13-fd7a2d073ce8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:16:00 np0005466012 nova_compute[192063]: 2025-10-02 12:16:00.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:00 np0005466012 podman[230738]: 2025-10-02 12:16:00.887154949 +0000 UTC m=+0.051350868 container create 99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:16:00 np0005466012 systemd[1]: Started libpod-conmon-99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92.scope.
Oct  2 08:16:00 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:16:00 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72a60c708058fd89a0240c41c60390bcd5258443432aac5eb98d83bd25a80171/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:16:00 np0005466012 podman[230738]: 2025-10-02 12:16:00.861216103 +0000 UTC m=+0.025412052 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:16:00 np0005466012 nova_compute[192063]: 2025-10-02 12:16:00.960 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407360.95943, 53494088-1a72-4178-888a-661da86f801a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:00 np0005466012 nova_compute[192063]: 2025-10-02 12:16:00.961 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 53494088-1a72-4178-888a-661da86f801a] VM Started (Lifecycle Event)#033[00m
Oct  2 08:16:00 np0005466012 podman[230738]: 2025-10-02 12:16:00.9669706 +0000 UTC m=+0.131166549 container init 99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:16:00 np0005466012 podman[230738]: 2025-10-02 12:16:00.972365769 +0000 UTC m=+0.136561688 container start 99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:16:01 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[230754]: [NOTICE]   (230758) : New worker (230760) forked
Oct  2 08:16:01 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[230754]: [NOTICE]   (230758) : Loading success.
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.001 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 53494088-1a72-4178-888a-661da86f801a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.013 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407360.959807, 53494088-1a72-4178-888a-661da86f801a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.014 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 53494088-1a72-4178-888a-661da86f801a] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.036 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 53494088-1a72-4178-888a-661da86f801a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.039 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 53494088-1a72-4178-888a-661da86f801a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.072 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 53494088-1a72-4178-888a-661da86f801a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.221 2 DEBUG nova.compute.manager [req-960bbf52-4094-412e-b31e-24a2af3e5b9b req-4d1e1271-0e68-4b4f-8301-19fb7faa8e65 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Received event network-vif-plugged-66efe9be-fb18-4baa-8d6f-d131de2e9283 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.222 2 DEBUG oslo_concurrency.lockutils [req-960bbf52-4094-412e-b31e-24a2af3e5b9b req-4d1e1271-0e68-4b4f-8301-19fb7faa8e65 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "53494088-1a72-4178-888a-661da86f801a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.222 2 DEBUG oslo_concurrency.lockutils [req-960bbf52-4094-412e-b31e-24a2af3e5b9b req-4d1e1271-0e68-4b4f-8301-19fb7faa8e65 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "53494088-1a72-4178-888a-661da86f801a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.223 2 DEBUG oslo_concurrency.lockutils [req-960bbf52-4094-412e-b31e-24a2af3e5b9b req-4d1e1271-0e68-4b4f-8301-19fb7faa8e65 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "53494088-1a72-4178-888a-661da86f801a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.223 2 DEBUG nova.compute.manager [req-960bbf52-4094-412e-b31e-24a2af3e5b9b req-4d1e1271-0e68-4b4f-8301-19fb7faa8e65 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Processing event network-vif-plugged-66efe9be-fb18-4baa-8d6f-d131de2e9283 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.223 2 DEBUG nova.compute.manager [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.227 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407361.2270632, 53494088-1a72-4178-888a-661da86f801a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.227 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 53494088-1a72-4178-888a-661da86f801a] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.229 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.231 2 INFO nova.virt.libvirt.driver [-] [instance: 53494088-1a72-4178-888a-661da86f801a] Instance spawned successfully.#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.231 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.287 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 53494088-1a72-4178-888a-661da86f801a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.290 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.291 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.291 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.291 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.292 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.292 2 DEBUG nova.virt.libvirt.driver [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.296 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 53494088-1a72-4178-888a-661da86f801a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.332 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 53494088-1a72-4178-888a-661da86f801a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.358 2 INFO nova.compute.manager [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Took 7.67 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.358 2 DEBUG nova.compute.manager [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.585 2 DEBUG oslo_concurrency.lockutils [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.585 2 DEBUG oslo_concurrency.lockutils [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.585 2 DEBUG nova.compute.manager [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Going to confirm migration 12 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.620 2 DEBUG nova.objects.instance [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'info_cache' on Instance uuid fa72d8b8-93c0-417b-9793-ccd611ffbb84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.629 2 INFO nova.compute.manager [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Took 8.71 seconds to build instance.#033[00m
Oct  2 08:16:01 np0005466012 nova_compute[192063]: 2025-10-02 12:16:01.648 2 DEBUG oslo_concurrency.lockutils [None req-325f63a8-67f6-4c27-a034-6943fb7269b3 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "53494088-1a72-4178-888a-661da86f801a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:02 np0005466012 nova_compute[192063]: 2025-10-02 12:16:02.097 2 DEBUG neutronclient.v2_0.client [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 1692479a-54ef-45ae-a6a3-39c68408e4f6 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  2 08:16:02 np0005466012 nova_compute[192063]: 2025-10-02 12:16:02.098 2 DEBUG oslo_concurrency.lockutils [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:02 np0005466012 nova_compute[192063]: 2025-10-02 12:16:02.098 2 DEBUG oslo_concurrency.lockutils [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquired lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:02 np0005466012 nova_compute[192063]: 2025-10-02 12:16:02.098 2 DEBUG nova.network.neutron [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:16:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:02.127 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:02.128 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:02.128 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:02 np0005466012 nova_compute[192063]: 2025-10-02 12:16:02.478 2 DEBUG nova.compute.manager [req-39917549-6d33-40d2-bd45-966c2cbcb3cd req-248af73a-839c-48f1-ae2d-306f44b1ddad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Received event network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:02 np0005466012 nova_compute[192063]: 2025-10-02 12:16:02.478 2 DEBUG oslo_concurrency.lockutils [req-39917549-6d33-40d2-bd45-966c2cbcb3cd req-248af73a-839c-48f1-ae2d-306f44b1ddad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:02 np0005466012 nova_compute[192063]: 2025-10-02 12:16:02.478 2 DEBUG oslo_concurrency.lockutils [req-39917549-6d33-40d2-bd45-966c2cbcb3cd req-248af73a-839c-48f1-ae2d-306f44b1ddad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:02 np0005466012 nova_compute[192063]: 2025-10-02 12:16:02.478 2 DEBUG oslo_concurrency.lockutils [req-39917549-6d33-40d2-bd45-966c2cbcb3cd req-248af73a-839c-48f1-ae2d-306f44b1ddad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:02 np0005466012 nova_compute[192063]: 2025-10-02 12:16:02.479 2 DEBUG nova.compute.manager [req-39917549-6d33-40d2-bd45-966c2cbcb3cd req-248af73a-839c-48f1-ae2d-306f44b1ddad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] No waiting events found dispatching network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:02 np0005466012 nova_compute[192063]: 2025-10-02 12:16:02.479 2 WARNING nova.compute.manager [req-39917549-6d33-40d2-bd45-966c2cbcb3cd req-248af73a-839c-48f1-ae2d-306f44b1ddad 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Received unexpected event network-vif-plugged-1692479a-54ef-45ae-a6a3-39c68408e4f6 for instance with vm_state resized and task_state None.#033[00m
Oct  2 08:16:02 np0005466012 nova_compute[192063]: 2025-10-02 12:16:02.633 2 DEBUG nova.network.neutron [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Updated VIF entry in instance network info cache for port 66efe9be-fb18-4baa-8d6f-d131de2e9283. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:16:02 np0005466012 nova_compute[192063]: 2025-10-02 12:16:02.634 2 DEBUG nova.network.neutron [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Updating instance_info_cache with network_info: [{"id": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "address": "fa:16:3e:8d:74:ae", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66efe9be-fb", "ovs_interfaceid": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:02 np0005466012 nova_compute[192063]: 2025-10-02 12:16:02.707 2 DEBUG oslo_concurrency.lockutils [req-63158324-85e7-4284-9f6b-ab266bbf6503 req-a8fd7899-7ae3-49e3-af36-289b61700af8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-53494088-1a72-4178-888a-661da86f801a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:03 np0005466012 nova_compute[192063]: 2025-10-02 12:16:03.360 2 DEBUG nova.compute.manager [req-b4aeb969-264f-459b-b72f-8b60fbe67a90 req-32c960a1-d9a0-4cae-b9ce-3a6e6d2c706e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Received event network-vif-plugged-66efe9be-fb18-4baa-8d6f-d131de2e9283 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:03 np0005466012 nova_compute[192063]: 2025-10-02 12:16:03.361 2 DEBUG oslo_concurrency.lockutils [req-b4aeb969-264f-459b-b72f-8b60fbe67a90 req-32c960a1-d9a0-4cae-b9ce-3a6e6d2c706e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "53494088-1a72-4178-888a-661da86f801a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:03 np0005466012 nova_compute[192063]: 2025-10-02 12:16:03.361 2 DEBUG oslo_concurrency.lockutils [req-b4aeb969-264f-459b-b72f-8b60fbe67a90 req-32c960a1-d9a0-4cae-b9ce-3a6e6d2c706e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "53494088-1a72-4178-888a-661da86f801a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:03 np0005466012 nova_compute[192063]: 2025-10-02 12:16:03.361 2 DEBUG oslo_concurrency.lockutils [req-b4aeb969-264f-459b-b72f-8b60fbe67a90 req-32c960a1-d9a0-4cae-b9ce-3a6e6d2c706e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "53494088-1a72-4178-888a-661da86f801a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:03 np0005466012 nova_compute[192063]: 2025-10-02 12:16:03.362 2 DEBUG nova.compute.manager [req-b4aeb969-264f-459b-b72f-8b60fbe67a90 req-32c960a1-d9a0-4cae-b9ce-3a6e6d2c706e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] No waiting events found dispatching network-vif-plugged-66efe9be-fb18-4baa-8d6f-d131de2e9283 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:03 np0005466012 nova_compute[192063]: 2025-10-02 12:16:03.362 2 WARNING nova.compute.manager [req-b4aeb969-264f-459b-b72f-8b60fbe67a90 req-32c960a1-d9a0-4cae-b9ce-3a6e6d2c706e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Received unexpected event network-vif-plugged-66efe9be-fb18-4baa-8d6f-d131de2e9283 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:16:03 np0005466012 nova_compute[192063]: 2025-10-02 12:16:03.929 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407348.9276018, fa72d8b8-93c0-417b-9793-ccd611ffbb84 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:03 np0005466012 nova_compute[192063]: 2025-10-02 12:16:03.929 2 INFO nova.compute.manager [-] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:16:03 np0005466012 nova_compute[192063]: 2025-10-02 12:16:03.968 2 DEBUG nova.compute.manager [None req-05952d31-0de7-4f62-b725-8c90103ca52a - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:03 np0005466012 nova_compute[192063]: 2025-10-02 12:16:03.971 2 DEBUG nova.compute.manager [None req-05952d31-0de7-4f62-b725-8c90103ca52a - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:03 np0005466012 nova_compute[192063]: 2025-10-02 12:16:03.999 2 INFO nova.compute.manager [None req-05952d31-0de7-4f62-b725-8c90103ca52a - - - - - -] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.346 2 DEBUG nova.network.neutron [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] [instance: fa72d8b8-93c0-417b-9793-ccd611ffbb84] Updating instance_info_cache with network_info: [{"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.438 2 DEBUG oslo_concurrency.lockutils [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Releasing lock "refresh_cache-fa72d8b8-93c0-417b-9793-ccd611ffbb84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.438 2 DEBUG nova.objects.instance [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lazy-loading 'migration_context' on Instance uuid fa72d8b8-93c0-417b-9793-ccd611ffbb84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.469 2 DEBUG nova.virt.libvirt.vif [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:15:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-480428625',display_name='tempest-ServerDiskConfigTestJSON-server-480428625',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-480428625',id=76,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:15:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffae703d68b24b9c89686c149113fc2b',ramdisk_id='',reservation_id='r-uec4q7qr',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1763056137',owner_user_name='tempest-ServerDiskConfigTestJSON-1763056137-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:15:59Z,user_data=None,user_id='def48c13fd6a43ba88836b753986a731',uuid=fa72d8b8-93c0-417b-9793-ccd611ffbb84,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.470 2 DEBUG nova.network.os_vif_util [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converting VIF {"id": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "address": "fa:16:3e:41:9a:b6", "network": {"id": "d6de4737-ca60-4c8d-bfd5-687f9366ec8b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1853814355-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffae703d68b24b9c89686c149113fc2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1692479a-54", "ovs_interfaceid": "1692479a-54ef-45ae-a6a3-39c68408e4f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.471 2 DEBUG nova.network.os_vif_util [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:41:9a:b6,bridge_name='br-int',has_traffic_filtering=True,id=1692479a-54ef-45ae-a6a3-39c68408e4f6,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1692479a-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.471 2 DEBUG os_vif [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:9a:b6,bridge_name='br-int',has_traffic_filtering=True,id=1692479a-54ef-45ae-a6a3-39c68408e4f6,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1692479a-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.473 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1692479a-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.473 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.475 2 INFO os_vif [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:9a:b6,bridge_name='br-int',has_traffic_filtering=True,id=1692479a-54ef-45ae-a6a3-39c68408e4f6,network=Network(d6de4737-ca60-4c8d-bfd5-687f9366ec8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1692479a-54')#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.475 2 DEBUG oslo_concurrency.lockutils [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.476 2 DEBUG oslo_concurrency.lockutils [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.565 2 DEBUG oslo_concurrency.lockutils [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "53494088-1a72-4178-888a-661da86f801a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.566 2 DEBUG oslo_concurrency.lockutils [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "53494088-1a72-4178-888a-661da86f801a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.566 2 DEBUG oslo_concurrency.lockutils [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "53494088-1a72-4178-888a-661da86f801a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.566 2 DEBUG oslo_concurrency.lockutils [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "53494088-1a72-4178-888a-661da86f801a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.566 2 DEBUG oslo_concurrency.lockutils [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "53494088-1a72-4178-888a-661da86f801a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.591 2 DEBUG nova.compute.provider_tree [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.602 2 INFO nova.compute.manager [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Terminating instance#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.616 2 DEBUG nova.scheduler.client.report [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.626 2 DEBUG nova.compute.manager [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:16:04 np0005466012 kernel: tap66efe9be-fb (unregistering): left promiscuous mode
Oct  2 08:16:04 np0005466012 NetworkManager[51207]: <info>  [1759407364.6481] device (tap66efe9be-fb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:16:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:04Z|00279|binding|INFO|Releasing lport 66efe9be-fb18-4baa-8d6f-d131de2e9283 from this chassis (sb_readonly=0)
Oct  2 08:16:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:04Z|00280|binding|INFO|Setting lport 66efe9be-fb18-4baa-8d6f-d131de2e9283 down in Southbound
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:04Z|00281|binding|INFO|Removing iface tap66efe9be-fb ovn-installed in OVS
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:04.717 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:74:ae 10.100.0.9'], port_security=['fa:16:3e:8d:74:ae 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '53494088-1a72-4178-888a-661da86f801a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '4', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=66efe9be-fb18-4baa-8d6f-d131de2e9283) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:04.718 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 66efe9be-fb18-4baa-8d6f-d131de2e9283 in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 unbound from our chassis#033[00m
Oct  2 08:16:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:04.720 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b97b8849-844c-4190-8b13-fd7a2d073ce8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:16:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:04.720 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[560c5a2f-3267-4dc8-aa18-8c1aa67eddc9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:04.721 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 namespace which is not needed anymore#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.726 2 DEBUG oslo_concurrency.lockutils [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:04 np0005466012 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004f.scope: Deactivated successfully.
Oct  2 08:16:04 np0005466012 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004f.scope: Consumed 3.941s CPU time.
Oct  2 08:16:04 np0005466012 systemd-machined[152114]: Machine qemu-32-instance-0000004f terminated.
Oct  2 08:16:04 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[230754]: [NOTICE]   (230758) : haproxy version is 2.8.14-c23fe91
Oct  2 08:16:04 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[230754]: [NOTICE]   (230758) : path to executable is /usr/sbin/haproxy
Oct  2 08:16:04 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[230754]: [WARNING]  (230758) : Exiting Master process...
Oct  2 08:16:04 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[230754]: [ALERT]    (230758) : Current worker (230760) exited with code 143 (Terminated)
Oct  2 08:16:04 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[230754]: [WARNING]  (230758) : All workers exited. Exiting... (0)
Oct  2 08:16:04 np0005466012 systemd[1]: libpod-99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92.scope: Deactivated successfully.
Oct  2 08:16:04 np0005466012 podman[230793]: 2025-10-02 12:16:04.867205346 +0000 UTC m=+0.057994151 container died 99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.906 2 INFO nova.virt.libvirt.driver [-] [instance: 53494088-1a72-4178-888a-661da86f801a] Instance destroyed successfully.#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.907 2 DEBUG nova.objects.instance [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'resources' on Instance uuid 53494088-1a72-4178-888a-661da86f801a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:04 np0005466012 systemd[1]: var-lib-containers-storage-overlay-72a60c708058fd89a0240c41c60390bcd5258443432aac5eb98d83bd25a80171-merged.mount: Deactivated successfully.
Oct  2 08:16:04 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92-userdata-shm.mount: Deactivated successfully.
Oct  2 08:16:04 np0005466012 podman[230793]: 2025-10-02 12:16:04.915816177 +0000 UTC m=+0.106604972 container cleanup 99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:16:04 np0005466012 systemd[1]: libpod-conmon-99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92.scope: Deactivated successfully.
Oct  2 08:16:04 np0005466012 podman[230839]: 2025-10-02 12:16:04.987354242 +0000 UTC m=+0.044195401 container remove 99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:16:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:04.994 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[54f89fd7-78d5-4a3e-b199-fd0fc6269efb]: (4, ('Thu Oct  2 12:16:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 (99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92)\n99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92\nThu Oct  2 12:16:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 (99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92)\n99a9694c61d7580cbcb1c94aba8809bf5cd265c28de598f0f0e42136c1e6aa92\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:04.996 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9a4c720a-e308-4737-a13d-f90ad6caea2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:04.996 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97b8849-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:04 np0005466012 nova_compute[192063]: 2025-10-02 12:16:04.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:05 np0005466012 kernel: tapb97b8849-80: left promiscuous mode
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:05.017 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[67ea1b1a-50d5-499a-b2c8-822d95963d37]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:05.046 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[40a6c589-a9c8-4fb5-a17d-7e12bc884fbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:05.047 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[994770e1-7fb6-4716-a087-3430f3d6df4b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:05.068 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d1544b31-97c5-47f7-b562-b79b5044dbb4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535390, 'reachable_time': 19288, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230857, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:05.070 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:16:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:05.070 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[af0f6e4b-3c6e-416a-926e-cf55840bde02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:05 np0005466012 systemd[1]: run-netns-ovnmeta\x2db97b8849\x2d844c\x2d4190\x2d8b13\x2dfd7a2d073ce8.mount: Deactivated successfully.
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.212 2 DEBUG nova.virt.libvirt.vif [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-250873745',display_name='tempest-DeleteServersTestJSON-server-250873745',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-250873745',id=79,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-lykq5lm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:16:01Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=53494088-1a72-4178-888a-661da86f801a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "address": "fa:16:3e:8d:74:ae", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66efe9be-fb", "ovs_interfaceid": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.213 2 DEBUG nova.network.os_vif_util [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "address": "fa:16:3e:8d:74:ae", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66efe9be-fb", "ovs_interfaceid": "66efe9be-fb18-4baa-8d6f-d131de2e9283", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.215 2 DEBUG nova.network.os_vif_util [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8d:74:ae,bridge_name='br-int',has_traffic_filtering=True,id=66efe9be-fb18-4baa-8d6f-d131de2e9283,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66efe9be-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.215 2 DEBUG os_vif [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:74:ae,bridge_name='br-int',has_traffic_filtering=True,id=66efe9be-fb18-4baa-8d6f-d131de2e9283,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66efe9be-fb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.219 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66efe9be-fb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.225 2 INFO os_vif [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:74:ae,bridge_name='br-int',has_traffic_filtering=True,id=66efe9be-fb18-4baa-8d6f-d131de2e9283,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66efe9be-fb')#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.226 2 INFO nova.virt.libvirt.driver [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Deleting instance files /var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a_del#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.227 2 INFO nova.virt.libvirt.driver [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Deletion of /var/lib/nova/instances/53494088-1a72-4178-888a-661da86f801a_del complete#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.393 2 INFO nova.scheduler.client.report [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Deleted allocation for migration 23ba7d64-b31c-4bd0-8f82-77a95cd8e782#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.430 2 INFO nova.compute.manager [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Took 0.80 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.431 2 DEBUG oslo.service.loopingcall [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.431 2 DEBUG nova.compute.manager [-] [instance: 53494088-1a72-4178-888a-661da86f801a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.432 2 DEBUG nova.network.neutron [-] [instance: 53494088-1a72-4178-888a-661da86f801a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:16:05 np0005466012 nova_compute[192063]: 2025-10-02 12:16:05.543 2 DEBUG oslo_concurrency.lockutils [None req-14c20988-1796-49dc-be2c-4fb7dc820279 def48c13fd6a43ba88836b753986a731 ffae703d68b24b9c89686c149113fc2b - - default default] Lock "fa72d8b8-93c0-417b-9793-ccd611ffbb84" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.958s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:06 np0005466012 nova_compute[192063]: 2025-10-02 12:16:06.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:06 np0005466012 nova_compute[192063]: 2025-10-02 12:16:06.525 2 DEBUG nova.compute.manager [req-b97ae040-057d-4c0c-9cf0-9797d289b132 req-9d84017b-68d4-4ef2-bc67-4ca48b78756b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Received event network-vif-unplugged-66efe9be-fb18-4baa-8d6f-d131de2e9283 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:06 np0005466012 nova_compute[192063]: 2025-10-02 12:16:06.525 2 DEBUG oslo_concurrency.lockutils [req-b97ae040-057d-4c0c-9cf0-9797d289b132 req-9d84017b-68d4-4ef2-bc67-4ca48b78756b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "53494088-1a72-4178-888a-661da86f801a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:06 np0005466012 nova_compute[192063]: 2025-10-02 12:16:06.526 2 DEBUG oslo_concurrency.lockutils [req-b97ae040-057d-4c0c-9cf0-9797d289b132 req-9d84017b-68d4-4ef2-bc67-4ca48b78756b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "53494088-1a72-4178-888a-661da86f801a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:06 np0005466012 nova_compute[192063]: 2025-10-02 12:16:06.526 2 DEBUG oslo_concurrency.lockutils [req-b97ae040-057d-4c0c-9cf0-9797d289b132 req-9d84017b-68d4-4ef2-bc67-4ca48b78756b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "53494088-1a72-4178-888a-661da86f801a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:06 np0005466012 nova_compute[192063]: 2025-10-02 12:16:06.526 2 DEBUG nova.compute.manager [req-b97ae040-057d-4c0c-9cf0-9797d289b132 req-9d84017b-68d4-4ef2-bc67-4ca48b78756b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] No waiting events found dispatching network-vif-unplugged-66efe9be-fb18-4baa-8d6f-d131de2e9283 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:06 np0005466012 nova_compute[192063]: 2025-10-02 12:16:06.526 2 DEBUG nova.compute.manager [req-b97ae040-057d-4c0c-9cf0-9797d289b132 req-9d84017b-68d4-4ef2-bc67-4ca48b78756b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Received event network-vif-unplugged-66efe9be-fb18-4baa-8d6f-d131de2e9283 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:16:07 np0005466012 nova_compute[192063]: 2025-10-02 12:16:07.114 2 DEBUG nova.network.neutron [-] [instance: 53494088-1a72-4178-888a-661da86f801a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:07 np0005466012 nova_compute[192063]: 2025-10-02 12:16:07.217 2 INFO nova.compute.manager [-] [instance: 53494088-1a72-4178-888a-661da86f801a] Took 1.79 seconds to deallocate network for instance.#033[00m
Oct  2 08:16:07 np0005466012 nova_compute[192063]: 2025-10-02 12:16:07.336 2 DEBUG nova.compute.manager [req-e8df69e6-1988-4e72-a0ad-93a7020dd317 req-b02d9929-ed5b-4853-95bd-2b3e6c2e04a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Received event network-vif-deleted-66efe9be-fb18-4baa-8d6f-d131de2e9283 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:07 np0005466012 nova_compute[192063]: 2025-10-02 12:16:07.376 2 DEBUG oslo_concurrency.lockutils [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:07 np0005466012 nova_compute[192063]: 2025-10-02 12:16:07.376 2 DEBUG oslo_concurrency.lockutils [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:07 np0005466012 nova_compute[192063]: 2025-10-02 12:16:07.434 2 DEBUG nova.compute.provider_tree [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:07 np0005466012 nova_compute[192063]: 2025-10-02 12:16:07.461 2 DEBUG nova.scheduler.client.report [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:07 np0005466012 nova_compute[192063]: 2025-10-02 12:16:07.515 2 DEBUG oslo_concurrency.lockutils [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:07 np0005466012 nova_compute[192063]: 2025-10-02 12:16:07.567 2 INFO nova.scheduler.client.report [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Deleted allocations for instance 53494088-1a72-4178-888a-661da86f801a#033[00m
Oct  2 08:16:07 np0005466012 nova_compute[192063]: 2025-10-02 12:16:07.687 2 DEBUG oslo_concurrency.lockutils [None req-17125dcc-0bec-450b-90b8-be70b29cdfed 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "53494088-1a72-4178-888a-661da86f801a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.482 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquiring lock "52aaa8ad-df8d-46de-a710-4463776cfe6a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.484 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.503 2 DEBUG nova.compute.manager [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.621 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.622 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.632 2 DEBUG nova.virt.hardware [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.633 2 INFO nova.compute.claims [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.640 2 DEBUG nova.compute.manager [req-1d02863c-0e86-4888-8589-b2b2e8e29f29 req-0a6c33f9-ef2f-45b6-ae2d-eefb1550c4f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Received event network-vif-plugged-66efe9be-fb18-4baa-8d6f-d131de2e9283 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.640 2 DEBUG oslo_concurrency.lockutils [req-1d02863c-0e86-4888-8589-b2b2e8e29f29 req-0a6c33f9-ef2f-45b6-ae2d-eefb1550c4f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "53494088-1a72-4178-888a-661da86f801a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.640 2 DEBUG oslo_concurrency.lockutils [req-1d02863c-0e86-4888-8589-b2b2e8e29f29 req-0a6c33f9-ef2f-45b6-ae2d-eefb1550c4f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "53494088-1a72-4178-888a-661da86f801a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.641 2 DEBUG oslo_concurrency.lockutils [req-1d02863c-0e86-4888-8589-b2b2e8e29f29 req-0a6c33f9-ef2f-45b6-ae2d-eefb1550c4f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "53494088-1a72-4178-888a-661da86f801a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.641 2 DEBUG nova.compute.manager [req-1d02863c-0e86-4888-8589-b2b2e8e29f29 req-0a6c33f9-ef2f-45b6-ae2d-eefb1550c4f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] No waiting events found dispatching network-vif-plugged-66efe9be-fb18-4baa-8d6f-d131de2e9283 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.641 2 WARNING nova.compute.manager [req-1d02863c-0e86-4888-8589-b2b2e8e29f29 req-0a6c33f9-ef2f-45b6-ae2d-eefb1550c4f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 53494088-1a72-4178-888a-661da86f801a] Received unexpected event network-vif-plugged-66efe9be-fb18-4baa-8d6f-d131de2e9283 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:16:08 np0005466012 nova_compute[192063]: 2025-10-02 12:16:08.996 2 DEBUG nova.compute.provider_tree [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.041 2 DEBUG nova.scheduler.client.report [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.098 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.476s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.099 2 DEBUG nova.compute.manager [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.229 2 DEBUG nova.compute.manager [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.230 2 DEBUG nova.network.neutron [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.272 2 INFO nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.328 2 DEBUG nova.compute.manager [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.435 2 DEBUG nova.compute.manager [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.436 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.437 2 INFO nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Creating image(s)#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.438 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquiring lock "/var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.438 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "/var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.439 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "/var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.456 2 DEBUG oslo_concurrency.processutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.562 2 DEBUG oslo_concurrency.processutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.564 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.564 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.575 2 DEBUG oslo_concurrency.processutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.634 2 DEBUG oslo_concurrency.processutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.636 2 DEBUG oslo_concurrency.processutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.669 2 DEBUG oslo_concurrency.processutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.671 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.672 2 DEBUG oslo_concurrency.processutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.744 2 DEBUG oslo_concurrency.processutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.746 2 DEBUG nova.virt.disk.api [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Checking if we can resize image /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.747 2 DEBUG oslo_concurrency.processutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.803 2 DEBUG oslo_concurrency.processutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.805 2 DEBUG nova.virt.disk.api [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Cannot resize image /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.806 2 DEBUG nova.objects.instance [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lazy-loading 'migration_context' on Instance uuid 52aaa8ad-df8d-46de-a710-4463776cfe6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.829 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.830 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Ensure instance console log exists: /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.830 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.832 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.832 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:09 np0005466012 nova_compute[192063]: 2025-10-02 12:16:09.871 2 DEBUG nova.policy [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '341760d37e2c44209429d234ca5f01ae', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ed7af923ad494ac5b7dbd3d8403dc33e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:16:10 np0005466012 nova_compute[192063]: 2025-10-02 12:16:10.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:11 np0005466012 nova_compute[192063]: 2025-10-02 12:16:11.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:13 np0005466012 podman[230873]: 2025-10-02 12:16:13.195808529 +0000 UTC m=+0.101699048 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:16:13 np0005466012 podman[230874]: 2025-10-02 12:16:13.196005584 +0000 UTC m=+0.109692387 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:16:14 np0005466012 nova_compute[192063]: 2025-10-02 12:16:14.053 2 DEBUG nova.network.neutron [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Successfully created port: 58ade8f3-49a2-49fd-ad21-2626c34768be _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:16:15 np0005466012 nova_compute[192063]: 2025-10-02 12:16:15.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:16 np0005466012 nova_compute[192063]: 2025-10-02 12:16:16.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:17 np0005466012 podman[230923]: 2025-10-02 12:16:17.183473056 +0000 UTC m=+0.095124556 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:16:17 np0005466012 nova_compute[192063]: 2025-10-02 12:16:17.668 2 DEBUG nova.network.neutron [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Successfully updated port: 58ade8f3-49a2-49fd-ad21-2626c34768be _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:16:17 np0005466012 nova_compute[192063]: 2025-10-02 12:16:17.687 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquiring lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:17 np0005466012 nova_compute[192063]: 2025-10-02 12:16:17.687 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquired lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:17 np0005466012 nova_compute[192063]: 2025-10-02 12:16:17.687 2 DEBUG nova.network.neutron [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:16:17 np0005466012 nova_compute[192063]: 2025-10-02 12:16:17.815 2 DEBUG nova.compute.manager [req-b3ccf8ca-e546-4178-8698-34adfc51c88c req-e603cb12-43e5-4069-9c48-671e5836f380 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received event network-changed-58ade8f3-49a2-49fd-ad21-2626c34768be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:17 np0005466012 nova_compute[192063]: 2025-10-02 12:16:17.815 2 DEBUG nova.compute.manager [req-b3ccf8ca-e546-4178-8698-34adfc51c88c req-e603cb12-43e5-4069-9c48-671e5836f380 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Refreshing instance network info cache due to event network-changed-58ade8f3-49a2-49fd-ad21-2626c34768be. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:16:17 np0005466012 nova_compute[192063]: 2025-10-02 12:16:17.816 2 DEBUG oslo_concurrency.lockutils [req-b3ccf8ca-e546-4178-8698-34adfc51c88c req-e603cb12-43e5-4069-9c48-671e5836f380 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:18 np0005466012 nova_compute[192063]: 2025-10-02 12:16:18.083 2 DEBUG nova.network.neutron [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:16:19 np0005466012 podman[230942]: 2025-10-02 12:16:19.157754181 +0000 UTC m=+0.076353737 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.461 2 DEBUG nova.network.neutron [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Updating instance_info_cache with network_info: [{"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.480 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Releasing lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.481 2 DEBUG nova.compute.manager [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Instance network_info: |[{"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.481 2 DEBUG oslo_concurrency.lockutils [req-b3ccf8ca-e546-4178-8698-34adfc51c88c req-e603cb12-43e5-4069-9c48-671e5836f380 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.482 2 DEBUG nova.network.neutron [req-b3ccf8ca-e546-4178-8698-34adfc51c88c req-e603cb12-43e5-4069-9c48-671e5836f380 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Refreshing network info cache for port 58ade8f3-49a2-49fd-ad21-2626c34768be _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.487 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Start _get_guest_xml network_info=[{"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.495 2 WARNING nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.506 2 DEBUG nova.virt.libvirt.host [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.506 2 DEBUG nova.virt.libvirt.host [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.511 2 DEBUG nova.virt.libvirt.host [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.512 2 DEBUG nova.virt.libvirt.host [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.514 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.515 2 DEBUG nova.virt.hardware [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.515 2 DEBUG nova.virt.hardware [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.516 2 DEBUG nova.virt.hardware [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.516 2 DEBUG nova.virt.hardware [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.517 2 DEBUG nova.virt.hardware [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.517 2 DEBUG nova.virt.hardware [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.518 2 DEBUG nova.virt.hardware [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.518 2 DEBUG nova.virt.hardware [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.519 2 DEBUG nova.virt.hardware [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.519 2 DEBUG nova.virt.hardware [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.520 2 DEBUG nova.virt.hardware [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.527 2 DEBUG nova.virt.libvirt.vif [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-992705125',display_name='tempest-SecurityGroupsTestJSON-server-992705125',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-992705125',id=80,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ed7af923ad494ac5b7dbd3d8403dc33e',ramdisk_id='',reservation_id='r-imbi9e46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-431508526',owner_user_name='tempest-SecurityGroupsTestJSON-4
31508526-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:09Z,user_data=None,user_id='341760d37e2c44209429d234ca5f01ae',uuid=52aaa8ad-df8d-46de-a710-4463776cfe6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.528 2 DEBUG nova.network.os_vif_util [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Converting VIF {"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.529 2 DEBUG nova.network.os_vif_util [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.532 2 DEBUG nova.objects.instance [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lazy-loading 'pci_devices' on Instance uuid 52aaa8ad-df8d-46de-a710-4463776cfe6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.547 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  <uuid>52aaa8ad-df8d-46de-a710-4463776cfe6a</uuid>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  <name>instance-00000050</name>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <nova:name>tempest-SecurityGroupsTestJSON-server-992705125</nova:name>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:16:19</nova:creationTime>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:16:19 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:        <nova:user uuid="341760d37e2c44209429d234ca5f01ae">tempest-SecurityGroupsTestJSON-431508526-project-member</nova:user>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:        <nova:project uuid="ed7af923ad494ac5b7dbd3d8403dc33e">tempest-SecurityGroupsTestJSON-431508526</nova:project>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:        <nova:port uuid="58ade8f3-49a2-49fd-ad21-2626c34768be">
Oct  2 08:16:19 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <entry name="serial">52aaa8ad-df8d-46de-a710-4463776cfe6a</entry>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <entry name="uuid">52aaa8ad-df8d-46de-a710-4463776cfe6a</entry>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk.config"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:db:8a:55"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <target dev="tap58ade8f3-49"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/console.log" append="off"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:16:19 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:16:19 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:16:19 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:16:19 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.549 2 DEBUG nova.compute.manager [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Preparing to wait for external event network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.550 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquiring lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.550 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.551 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.552 2 DEBUG nova.virt.libvirt.vif [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-992705125',display_name='tempest-SecurityGroupsTestJSON-server-992705125',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-992705125',id=80,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ed7af923ad494ac5b7dbd3d8403dc33e',ramdisk_id='',reservation_id='r-imbi9e46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-431508526',owner_user_name='tempest-SecurityGroups
TestJSON-431508526-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:09Z,user_data=None,user_id='341760d37e2c44209429d234ca5f01ae',uuid=52aaa8ad-df8d-46de-a710-4463776cfe6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.552 2 DEBUG nova.network.os_vif_util [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Converting VIF {"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.553 2 DEBUG nova.network.os_vif_util [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.554 2 DEBUG os_vif [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.555 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.556 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.560 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58ade8f3-49, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.561 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap58ade8f3-49, col_values=(('external_ids', {'iface-id': '58ade8f3-49a2-49fd-ad21-2626c34768be', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:8a:55', 'vm-uuid': '52aaa8ad-df8d-46de-a710-4463776cfe6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:19 np0005466012 NetworkManager[51207]: <info>  [1759407379.5635] manager: (tap58ade8f3-49): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.569 2 INFO os_vif [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49')#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.642 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.643 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.643 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] No VIF found with MAC fa:16:3e:db:8a:55, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.644 2 INFO nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Using config drive#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.839 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.839 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.904 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407364.9036238, 53494088-1a72-4178-888a-661da86f801a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.907 2 INFO nova.compute.manager [-] [instance: 53494088-1a72-4178-888a-661da86f801a] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:16:19 np0005466012 nova_compute[192063]: 2025-10-02 12:16:19.925 2 DEBUG nova.compute.manager [None req-4884f378-3001-4fdf-b1e0-8c4b9d1d3c34 - - - - - -] [instance: 53494088-1a72-4178-888a-661da86f801a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:20 np0005466012 nova_compute[192063]: 2025-10-02 12:16:20.168 2 INFO nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Creating config drive at /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk.config#033[00m
Oct  2 08:16:20 np0005466012 nova_compute[192063]: 2025-10-02 12:16:20.175 2 DEBUG oslo_concurrency.processutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6mp9t7t6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:20 np0005466012 nova_compute[192063]: 2025-10-02 12:16:20.316 2 DEBUG oslo_concurrency.processutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6mp9t7t6" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:20 np0005466012 kernel: tap58ade8f3-49: entered promiscuous mode
Oct  2 08:16:20 np0005466012 NetworkManager[51207]: <info>  [1759407380.3717] manager: (tap58ade8f3-49): new Tun device (/org/freedesktop/NetworkManager/Devices/128)
Oct  2 08:16:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:20Z|00282|binding|INFO|Claiming lport 58ade8f3-49a2-49fd-ad21-2626c34768be for this chassis.
Oct  2 08:16:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:20Z|00283|binding|INFO|58ade8f3-49a2-49fd-ad21-2626c34768be: Claiming fa:16:3e:db:8a:55 10.100.0.7
Oct  2 08:16:20 np0005466012 nova_compute[192063]: 2025-10-02 12:16:20.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.383 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:8a:55 10.100.0.7'], port_security=['fa:16:3e:db:8a:55 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '52aaa8ad-df8d-46de-a710-4463776cfe6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ed7af923ad494ac5b7dbd3d8403dc33e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f6f46a30-ca89-45c9-b4fd-d5c78d4ee0ae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08fc185f-7900-4a64-ba36-f229e6cb956d, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=58ade8f3-49a2-49fd-ad21-2626c34768be) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.384 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 58ade8f3-49a2-49fd-ad21-2626c34768be in datapath 5716ac1c-acf7-48a7-8b93-dda3a5af31f6 bound to our chassis#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.385 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5716ac1c-acf7-48a7-8b93-dda3a5af31f6#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.396 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3f24161d-1796-466b-aba3-d0420a56d528]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.398 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5716ac1c-a1 in ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.400 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5716ac1c-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.400 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0365e1bf-3ee7-4ecf-9be3-fbf432d32806]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 systemd-udevd[230984]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.400 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d02a4ffa-b08e-4dbc-ba14-f976b03f34e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 systemd-machined[152114]: New machine qemu-33-instance-00000050.
Oct  2 08:16:20 np0005466012 NetworkManager[51207]: <info>  [1759407380.4123] device (tap58ade8f3-49): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:16:20 np0005466012 NetworkManager[51207]: <info>  [1759407380.4129] device (tap58ade8f3-49): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.412 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b22206-409b-448e-8590-a59e89e0ad35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 nova_compute[192063]: 2025-10-02 12:16:20.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:20Z|00284|binding|INFO|Setting lport 58ade8f3-49a2-49fd-ad21-2626c34768be ovn-installed in OVS
Oct  2 08:16:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:20Z|00285|binding|INFO|Setting lport 58ade8f3-49a2-49fd-ad21-2626c34768be up in Southbound
Oct  2 08:16:20 np0005466012 nova_compute[192063]: 2025-10-02 12:16:20.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.438 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e5529ddc-a919-4245-a018-0d09703fac12]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 systemd[1]: Started Virtual Machine qemu-33-instance-00000050.
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.466 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[d951b2cc-11b8-4600-8156-a486916fa26b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 systemd-udevd[230987]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.473 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d549c1-af72-4361-84e3-1d9be55fb32c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 NetworkManager[51207]: <info>  [1759407380.4755] manager: (tap5716ac1c-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/129)
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.506 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[51da5fde-81b4-479b-b44e-0069e64cf9bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.511 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[cecdb483-c666-412b-8991-96f83cc73dc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 NetworkManager[51207]: <info>  [1759407380.5363] device (tap5716ac1c-a0): carrier: link connected
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.542 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[069af36b-b659-48c5-92a8-2b8fbfd33a7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.561 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d52ad731-d42b-452b-8572-d7fd9e9bd68b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5716ac1c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:f8:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537414, 'reachable_time': 19076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231016, 'error': None, 'target': 'ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.577 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a943fb0a-2d9a-4c5f-ba1a-dc7686c3bc4c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:f83e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537414, 'tstamp': 537414}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231017, 'error': None, 'target': 'ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.593 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9b9f05d8-01a3-419d-b2e0-fabb7393f942]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5716ac1c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:f8:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537414, 'reachable_time': 19076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231018, 'error': None, 'target': 'ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.622 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5c3315f3-e0bf-4be8-8f78-7a5d1e22f2bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.678 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b917222b-615f-4898-aa0b-7e93da7013b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.679 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5716ac1c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.682 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.682 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5716ac1c-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:20 np0005466012 nova_compute[192063]: 2025-10-02 12:16:20.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005466012 NetworkManager[51207]: <info>  [1759407380.6851] manager: (tap5716ac1c-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Oct  2 08:16:20 np0005466012 kernel: tap5716ac1c-a0: entered promiscuous mode
Oct  2 08:16:20 np0005466012 nova_compute[192063]: 2025-10-02 12:16:20.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.688 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5716ac1c-a0, col_values=(('external_ids', {'iface-id': 'cc8e73bf-6cd9-4487-9685-abdace89cf29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:20 np0005466012 nova_compute[192063]: 2025-10-02 12:16:20.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:20Z|00286|binding|INFO|Releasing lport cc8e73bf-6cd9-4487-9685-abdace89cf29 from this chassis (sb_readonly=0)
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.690 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5716ac1c-acf7-48a7-8b93-dda3a5af31f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5716ac1c-acf7-48a7-8b93-dda3a5af31f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.691 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c7db085a-db47-4ae9-ae93-ff3b86adb312]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.692 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-5716ac1c-acf7-48a7-8b93-dda3a5af31f6
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/5716ac1c-acf7-48a7-8b93-dda3a5af31f6.pid.haproxy
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 5716ac1c-acf7-48a7-8b93-dda3a5af31f6
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:16:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:20.692 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'env', 'PROCESS_TAG=haproxy-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5716ac1c-acf7-48a7-8b93-dda3a5af31f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:16:20 np0005466012 nova_compute[192063]: 2025-10-02 12:16:20.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:20 np0005466012 nova_compute[192063]: 2025-10-02 12:16:20.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:20 np0005466012 nova_compute[192063]: 2025-10-02 12:16:20.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:21 np0005466012 podman[231056]: 2025-10-02 12:16:21.028481109 +0000 UTC m=+0.055292577 container create 58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:21 np0005466012 systemd[1]: Started libpod-conmon-58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37.scope.
Oct  2 08:16:21 np0005466012 podman[231056]: 2025-10-02 12:16:20.997209455 +0000 UTC m=+0.024020923 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:16:21 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:16:21 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/151ba586a76bd4b005d8814cc39280fd06dc8e344a8dedcdee40d0e8b7062773/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.119 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407381.1191502, 52aaa8ad-df8d-46de-a710-4463776cfe6a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.120 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] VM Started (Lifecycle Event)#033[00m
Oct  2 08:16:21 np0005466012 podman[231056]: 2025-10-02 12:16:21.124993441 +0000 UTC m=+0.151804929 container init 58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:16:21 np0005466012 podman[231056]: 2025-10-02 12:16:21.130606116 +0000 UTC m=+0.157417574 container start 58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.156 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:21 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231071]: [NOTICE]   (231075) : New worker (231077) forked
Oct  2 08:16:21 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231071]: [NOTICE]   (231075) : Loading success.
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.162 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407381.1200843, 52aaa8ad-df8d-46de-a710-4463776cfe6a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.163 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.194 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.198 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.223 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.252 2 DEBUG nova.compute.manager [req-952d151e-8282-4dea-8dc5-2135c48cf698 req-4c6c7939-0e1e-4c30-be05-2f0a2b6b33a4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received event network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.252 2 DEBUG oslo_concurrency.lockutils [req-952d151e-8282-4dea-8dc5-2135c48cf698 req-4c6c7939-0e1e-4c30-be05-2f0a2b6b33a4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.252 2 DEBUG oslo_concurrency.lockutils [req-952d151e-8282-4dea-8dc5-2135c48cf698 req-4c6c7939-0e1e-4c30-be05-2f0a2b6b33a4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.253 2 DEBUG oslo_concurrency.lockutils [req-952d151e-8282-4dea-8dc5-2135c48cf698 req-4c6c7939-0e1e-4c30-be05-2f0a2b6b33a4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.253 2 DEBUG nova.compute.manager [req-952d151e-8282-4dea-8dc5-2135c48cf698 req-4c6c7939-0e1e-4c30-be05-2f0a2b6b33a4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Processing event network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.253 2 DEBUG nova.compute.manager [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.257 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407381.2577636, 52aaa8ad-df8d-46de-a710-4463776cfe6a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.258 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.259 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.263 2 INFO nova.virt.libvirt.driver [-] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Instance spawned successfully.#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.263 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.282 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.288 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.292 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.292 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.293 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.293 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.293 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.294 2 DEBUG nova.virt.libvirt.driver [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.349 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.407 2 INFO nova.compute.manager [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Took 11.97 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.407 2 DEBUG nova.compute.manager [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.484 2 DEBUG nova.network.neutron [req-b3ccf8ca-e546-4178-8698-34adfc51c88c req-e603cb12-43e5-4069-9c48-671e5836f380 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Updated VIF entry in instance network info cache for port 58ade8f3-49a2-49fd-ad21-2626c34768be. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.484 2 DEBUG nova.network.neutron [req-b3ccf8ca-e546-4178-8698-34adfc51c88c req-e603cb12-43e5-4069-9c48-671e5836f380 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Updating instance_info_cache with network_info: [{"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.519 2 DEBUG oslo_concurrency.lockutils [req-b3ccf8ca-e546-4178-8698-34adfc51c88c req-e603cb12-43e5-4069-9c48-671e5836f380 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.529 2 INFO nova.compute.manager [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Took 12.95 seconds to build instance.#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.560 2 DEBUG oslo_concurrency.lockutils [None req-d39d7d72-d7f6-4cee-98c5-8b973bc8f907 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:21 np0005466012 nova_compute[192063]: 2025-10-02 12:16:21.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:22 np0005466012 nova_compute[192063]: 2025-10-02 12:16:22.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:22 np0005466012 nova_compute[192063]: 2025-10-02 12:16:22.867 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:22 np0005466012 nova_compute[192063]: 2025-10-02 12:16:22.868 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:22 np0005466012 nova_compute[192063]: 2025-10-02 12:16:22.868 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:22 np0005466012 nova_compute[192063]: 2025-10-02 12:16:22.868 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:16:22 np0005466012 nova_compute[192063]: 2025-10-02 12:16:22.938 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.004 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.005 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.071 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.247 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.249 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5565MB free_disk=73.38728332519531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.250 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.250 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.348 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 52aaa8ad-df8d-46de-a710-4463776cfe6a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.349 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.349 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.366 2 DEBUG nova.compute.manager [req-996ad928-f9a3-40d7-adfa-b39781d378d0 req-c8292f11-4103-4a0a-a926-2dae37a29048 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received event network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.367 2 DEBUG oslo_concurrency.lockutils [req-996ad928-f9a3-40d7-adfa-b39781d378d0 req-c8292f11-4103-4a0a-a926-2dae37a29048 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.367 2 DEBUG oslo_concurrency.lockutils [req-996ad928-f9a3-40d7-adfa-b39781d378d0 req-c8292f11-4103-4a0a-a926-2dae37a29048 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.368 2 DEBUG oslo_concurrency.lockutils [req-996ad928-f9a3-40d7-adfa-b39781d378d0 req-c8292f11-4103-4a0a-a926-2dae37a29048 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.368 2 DEBUG nova.compute.manager [req-996ad928-f9a3-40d7-adfa-b39781d378d0 req-c8292f11-4103-4a0a-a926-2dae37a29048 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] No waiting events found dispatching network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.369 2 WARNING nova.compute.manager [req-996ad928-f9a3-40d7-adfa-b39781d378d0 req-c8292f11-4103-4a0a-a926-2dae37a29048 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received unexpected event network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be for instance with vm_state active and task_state None.#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.375 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.400 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.401 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.418 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.448 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.516 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.531 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.573 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:16:23 np0005466012 nova_compute[192063]: 2025-10-02 12:16:23.574 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.323s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:24 np0005466012 nova_compute[192063]: 2025-10-02 12:16:24.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:24 np0005466012 nova_compute[192063]: 2025-10-02 12:16:24.575 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:24 np0005466012 nova_compute[192063]: 2025-10-02 12:16:24.744 2 DEBUG nova.compute.manager [req-c91cd0bf-02e3-4a6a-80db-2dc498a3d996 req-247af80d-9822-4701-b126-8622d5a78324 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received event network-changed-58ade8f3-49a2-49fd-ad21-2626c34768be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:24 np0005466012 nova_compute[192063]: 2025-10-02 12:16:24.744 2 DEBUG nova.compute.manager [req-c91cd0bf-02e3-4a6a-80db-2dc498a3d996 req-247af80d-9822-4701-b126-8622d5a78324 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Refreshing instance network info cache due to event network-changed-58ade8f3-49a2-49fd-ad21-2626c34768be. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:16:24 np0005466012 nova_compute[192063]: 2025-10-02 12:16:24.744 2 DEBUG oslo_concurrency.lockutils [req-c91cd0bf-02e3-4a6a-80db-2dc498a3d996 req-247af80d-9822-4701-b126-8622d5a78324 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:24 np0005466012 nova_compute[192063]: 2025-10-02 12:16:24.745 2 DEBUG oslo_concurrency.lockutils [req-c91cd0bf-02e3-4a6a-80db-2dc498a3d996 req-247af80d-9822-4701-b126-8622d5a78324 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:24 np0005466012 nova_compute[192063]: 2025-10-02 12:16:24.745 2 DEBUG nova.network.neutron [req-c91cd0bf-02e3-4a6a-80db-2dc498a3d996 req-247af80d-9822-4701-b126-8622d5a78324 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Refreshing network info cache for port 58ade8f3-49a2-49fd-ad21-2626c34768be _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:16:24 np0005466012 nova_compute[192063]: 2025-10-02 12:16:24.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:24 np0005466012 nova_compute[192063]: 2025-10-02 12:16:24.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:16:25 np0005466012 podman[231094]: 2025-10-02 12:16:25.168401977 +0000 UTC m=+0.076057760 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Oct  2 08:16:25 np0005466012 podman[231093]: 2025-10-02 12:16:25.194628271 +0000 UTC m=+0.097268905 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:16:26 np0005466012 nova_compute[192063]: 2025-10-02 12:16:26.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:26 np0005466012 nova_compute[192063]: 2025-10-02 12:16:26.895 2 DEBUG oslo_concurrency.lockutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquiring lock "52aaa8ad-df8d-46de-a710-4463776cfe6a" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:26 np0005466012 nova_compute[192063]: 2025-10-02 12:16:26.895 2 DEBUG oslo_concurrency.lockutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:26 np0005466012 nova_compute[192063]: 2025-10-02 12:16:26.895 2 INFO nova.compute.manager [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Rebooting instance#033[00m
Oct  2 08:16:26 np0005466012 nova_compute[192063]: 2025-10-02 12:16:26.908 2 DEBUG oslo_concurrency.lockutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquiring lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:29 np0005466012 nova_compute[192063]: 2025-10-02 12:16:29.129 2 DEBUG nova.network.neutron [req-c91cd0bf-02e3-4a6a-80db-2dc498a3d996 req-247af80d-9822-4701-b126-8622d5a78324 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Updated VIF entry in instance network info cache for port 58ade8f3-49a2-49fd-ad21-2626c34768be. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:16:29 np0005466012 nova_compute[192063]: 2025-10-02 12:16:29.130 2 DEBUG nova.network.neutron [req-c91cd0bf-02e3-4a6a-80db-2dc498a3d996 req-247af80d-9822-4701-b126-8622d5a78324 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Updating instance_info_cache with network_info: [{"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:29 np0005466012 nova_compute[192063]: 2025-10-02 12:16:29.164 2 DEBUG oslo_concurrency.lockutils [req-c91cd0bf-02e3-4a6a-80db-2dc498a3d996 req-247af80d-9822-4701-b126-8622d5a78324 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:29 np0005466012 nova_compute[192063]: 2025-10-02 12:16:29.165 2 DEBUG oslo_concurrency.lockutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquired lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:29 np0005466012 nova_compute[192063]: 2025-10-02 12:16:29.165 2 DEBUG nova.network.neutron [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:16:29 np0005466012 nova_compute[192063]: 2025-10-02 12:16:29.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:30 np0005466012 podman[231133]: 2025-10-02 12:16:30.130469501 +0000 UTC m=+0.051801601 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:16:30 np0005466012 podman[231134]: 2025-10-02 12:16:30.130556293 +0000 UTC m=+0.049046205 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:16:30 np0005466012 nova_compute[192063]: 2025-10-02 12:16:30.904 2 DEBUG nova.network.neutron [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Updating instance_info_cache with network_info: [{"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:30 np0005466012 nova_compute[192063]: 2025-10-02 12:16:30.926 2 DEBUG oslo_concurrency.lockutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Releasing lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:30 np0005466012 nova_compute[192063]: 2025-10-02 12:16:30.936 2 DEBUG nova.compute.manager [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005466012 kernel: tap58ade8f3-49 (unregistering): left promiscuous mode
Oct  2 08:16:31 np0005466012 NetworkManager[51207]: <info>  [1759407391.1625] device (tap58ade8f3-49): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:16:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:31Z|00287|binding|INFO|Releasing lport 58ade8f3-49a2-49fd-ad21-2626c34768be from this chassis (sb_readonly=0)
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:31Z|00288|binding|INFO|Setting lport 58ade8f3-49a2-49fd-ad21-2626c34768be down in Southbound
Oct  2 08:16:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:31Z|00289|binding|INFO|Removing iface tap58ade8f3-49 ovn-installed in OVS
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.187 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:8a:55 10.100.0.7'], port_security=['fa:16:3e:db:8a:55 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '52aaa8ad-df8d-46de-a710-4463776cfe6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ed7af923ad494ac5b7dbd3d8403dc33e', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'bd7b2d3a-1733-49ff-aecc-c24e23ffef02 f6f46a30-ca89-45c9-b4fd-d5c78d4ee0ae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08fc185f-7900-4a64-ba36-f229e6cb956d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=58ade8f3-49a2-49fd-ad21-2626c34768be) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.188 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 58ade8f3-49a2-49fd-ad21-2626c34768be in datapath 5716ac1c-acf7-48a7-8b93-dda3a5af31f6 unbound from our chassis#033[00m
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.189 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5716ac1c-acf7-48a7-8b93-dda3a5af31f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.190 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fbad4ffc-9aa0-4e26-b356-763bc3f76a6e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.192 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6 namespace which is not needed anymore#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005466012 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000050.scope: Deactivated successfully.
Oct  2 08:16:31 np0005466012 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000050.scope: Consumed 10.595s CPU time.
Oct  2 08:16:31 np0005466012 systemd-machined[152114]: Machine qemu-33-instance-00000050 terminated.
Oct  2 08:16:31 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231071]: [NOTICE]   (231075) : haproxy version is 2.8.14-c23fe91
Oct  2 08:16:31 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231071]: [NOTICE]   (231075) : path to executable is /usr/sbin/haproxy
Oct  2 08:16:31 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231071]: [WARNING]  (231075) : Exiting Master process...
Oct  2 08:16:31 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231071]: [WARNING]  (231075) : Exiting Master process...
Oct  2 08:16:31 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231071]: [ALERT]    (231075) : Current worker (231077) exited with code 143 (Terminated)
Oct  2 08:16:31 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231071]: [WARNING]  (231075) : All workers exited. Exiting... (0)
Oct  2 08:16:31 np0005466012 systemd[1]: libpod-58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37.scope: Deactivated successfully.
Oct  2 08:16:31 np0005466012 podman[231201]: 2025-10-02 12:16:31.366003932 +0000 UTC m=+0.059372359 container died 58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:16:31 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37-userdata-shm.mount: Deactivated successfully.
Oct  2 08:16:31 np0005466012 systemd[1]: var-lib-containers-storage-overlay-151ba586a76bd4b005d8814cc39280fd06dc8e344a8dedcdee40d0e8b7062773-merged.mount: Deactivated successfully.
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.410 2 INFO nova.virt.libvirt.driver [-] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Instance destroyed successfully.#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.410 2 DEBUG nova.objects.instance [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lazy-loading 'resources' on Instance uuid 52aaa8ad-df8d-46de-a710-4463776cfe6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:31 np0005466012 podman[231201]: 2025-10-02 12:16:31.411293831 +0000 UTC m=+0.104662258 container cleanup 58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.423 2 DEBUG nova.virt.libvirt.vif [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-992705125',display_name='tempest-SecurityGroupsTestJSON-server-992705125',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-992705125',id=80,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ed7af923ad494ac5b7dbd3d8403dc33e',ramdisk_id='',reservation_id='r-imbi9e46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-431508526',owner_user_name='tempest-SecurityGroupsTestJSON-431508526-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:16:31Z,user_data=None,user_id='341760d37e2c44209429d234ca5f01ae',uuid=52aaa8ad-df8d-46de-a710-4463776cfe6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.423 2 DEBUG nova.network.os_vif_util [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Converting VIF {"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.426 2 DEBUG nova.network.os_vif_util [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.428 2 DEBUG os_vif [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.429 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58ade8f3-49, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005466012 systemd[1]: libpod-conmon-58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37.scope: Deactivated successfully.
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.435 2 INFO os_vif [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49')#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.440 2 DEBUG nova.virt.libvirt.driver [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Start _get_guest_xml network_info=[{"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.443 2 WARNING nova.virt.libvirt.driver [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.447 2 DEBUG nova.virt.libvirt.host [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.448 2 DEBUG nova.virt.libvirt.host [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.451 2 DEBUG nova.virt.libvirt.host [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.451 2 DEBUG nova.virt.libvirt.host [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.452 2 DEBUG nova.virt.libvirt.driver [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.452 2 DEBUG nova.virt.hardware [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.453 2 DEBUG nova.virt.hardware [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.453 2 DEBUG nova.virt.hardware [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.453 2 DEBUG nova.virt.hardware [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.454 2 DEBUG nova.virt.hardware [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.454 2 DEBUG nova.virt.hardware [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.454 2 DEBUG nova.virt.hardware [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.454 2 DEBUG nova.virt.hardware [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.455 2 DEBUG nova.virt.hardware [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.455 2 DEBUG nova.virt.hardware [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.455 2 DEBUG nova.virt.hardware [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.456 2 DEBUG nova.objects.instance [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 52aaa8ad-df8d-46de-a710-4463776cfe6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:31 np0005466012 podman[231248]: 2025-10-02 12:16:31.480309676 +0000 UTC m=+0.044695034 container remove 58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.486 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8d190e66-ed21-4966-8912-a55f36e276cf]: (4, ('Thu Oct  2 12:16:31 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6 (58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37)\n58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37\nThu Oct  2 12:16:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6 (58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37)\n58e79f88939d2b8ec749c0bbe0d0449fe6907aec5c8c5aca8df82d6c13f16f37\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.487 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f46fcd27-9d34-4341-a947-7b5f8fb9b78e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.488 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5716ac1c-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:31 np0005466012 kernel: tap5716ac1c-a0: left promiscuous mode
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.494 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b81ab181-21dd-4f8e-89f1-10ed67bf0a31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.507 2 DEBUG oslo_concurrency.processutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.535 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dee5d9c2-0f82-4169-997c-3451784050d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.536 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6622d5f2-5c1e-4a38-b55f-f2166051f95f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.553 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e318241f-a7d2-49b8-942d-0476a8e1af60]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537407, 'reachable_time': 34182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231266, 'error': None, 'target': 'ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.557 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:16:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:31.557 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4f46d1-b04a-4b33-8226-ad6d757a8eac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:31 np0005466012 systemd[1]: run-netns-ovnmeta\x2d5716ac1c\x2dacf7\x2d48a7\x2d8b93\x2ddda3a5af31f6.mount: Deactivated successfully.
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.571 2 DEBUG oslo_concurrency.processutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk.config --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.572 2 DEBUG oslo_concurrency.lockutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquiring lock "/var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.572 2 DEBUG oslo_concurrency.lockutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "/var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.680 2 DEBUG oslo_concurrency.lockutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "/var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.682 2 DEBUG nova.virt.libvirt.vif [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-992705125',display_name='tempest-SecurityGroupsTestJSON-server-992705125',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-992705125',id=80,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ed7af923ad494ac5b7dbd3d8403dc33e',ramdisk_id='',reservation_id='r-imbi9e46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-431508526',owner_user_name='tempest-SecurityGroupsTestJSON-431508526-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:16:31Z,user_data=None,user_id='341760d37e2c44209429d234ca5f01ae',uuid=52aaa8ad-df8d-46de-a710-4463776cfe6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.682 2 DEBUG nova.network.os_vif_util [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Converting VIF {"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.683 2 DEBUG nova.network.os_vif_util [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.684 2 DEBUG nova.objects.instance [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lazy-loading 'pci_devices' on Instance uuid 52aaa8ad-df8d-46de-a710-4463776cfe6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.701 2 DEBUG nova.virt.libvirt.driver [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  <uuid>52aaa8ad-df8d-46de-a710-4463776cfe6a</uuid>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  <name>instance-00000050</name>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <nova:name>tempest-SecurityGroupsTestJSON-server-992705125</nova:name>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:16:31</nova:creationTime>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:16:31 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:        <nova:user uuid="341760d37e2c44209429d234ca5f01ae">tempest-SecurityGroupsTestJSON-431508526-project-member</nova:user>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:        <nova:project uuid="ed7af923ad494ac5b7dbd3d8403dc33e">tempest-SecurityGroupsTestJSON-431508526</nova:project>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:        <nova:port uuid="58ade8f3-49a2-49fd-ad21-2626c34768be">
Oct  2 08:16:31 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <entry name="serial">52aaa8ad-df8d-46de-a710-4463776cfe6a</entry>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <entry name="uuid">52aaa8ad-df8d-46de-a710-4463776cfe6a</entry>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk.config"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:db:8a:55"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <target dev="tap58ade8f3-49"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/console.log" append="off"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <input type="keyboard" bus="usb"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:16:31 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:16:31 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:16:31 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:16:31 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.702 2 DEBUG oslo_concurrency.processutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.760 2 DEBUG oslo_concurrency.processutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.762 2 DEBUG oslo_concurrency.processutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.824 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.825 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.829 2 DEBUG oslo_concurrency.processutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.830 2 DEBUG nova.objects.instance [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 52aaa8ad-df8d-46de-a710-4463776cfe6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.844 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.845 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.845 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.845 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 52aaa8ad-df8d-46de-a710-4463776cfe6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.853 2 DEBUG oslo_concurrency.processutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.893 2 DEBUG nova.compute.manager [req-c2303166-8a2f-4118-b1be-690f8dea7deb req-3256981c-5d55-403a-8817-d08e75ab2fed 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received event network-vif-unplugged-58ade8f3-49a2-49fd-ad21-2626c34768be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.894 2 DEBUG oslo_concurrency.lockutils [req-c2303166-8a2f-4118-b1be-690f8dea7deb req-3256981c-5d55-403a-8817-d08e75ab2fed 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.894 2 DEBUG oslo_concurrency.lockutils [req-c2303166-8a2f-4118-b1be-690f8dea7deb req-3256981c-5d55-403a-8817-d08e75ab2fed 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.895 2 DEBUG oslo_concurrency.lockutils [req-c2303166-8a2f-4118-b1be-690f8dea7deb req-3256981c-5d55-403a-8817-d08e75ab2fed 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.895 2 DEBUG nova.compute.manager [req-c2303166-8a2f-4118-b1be-690f8dea7deb req-3256981c-5d55-403a-8817-d08e75ab2fed 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] No waiting events found dispatching network-vif-unplugged-58ade8f3-49a2-49fd-ad21-2626c34768be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.895 2 WARNING nova.compute.manager [req-c2303166-8a2f-4118-b1be-690f8dea7deb req-3256981c-5d55-403a-8817-d08e75ab2fed 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received unexpected event network-vif-unplugged-58ade8f3-49a2-49fd-ad21-2626c34768be for instance with vm_state active and task_state reboot_started_hard.#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.934 2 DEBUG oslo_concurrency.processutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.935 2 DEBUG nova.virt.disk.api [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Checking if we can resize image /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:16:31 np0005466012 nova_compute[192063]: 2025-10-02 12:16:31.935 2 DEBUG oslo_concurrency.processutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.012 2 DEBUG oslo_concurrency.processutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.013 2 DEBUG nova.virt.disk.api [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Cannot resize image /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.014 2 DEBUG nova.objects.instance [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lazy-loading 'migration_context' on Instance uuid 52aaa8ad-df8d-46de-a710-4463776cfe6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.040 2 DEBUG nova.virt.libvirt.vif [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-992705125',display_name='tempest-SecurityGroupsTestJSON-server-992705125',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-992705125',id=80,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='ed7af923ad494ac5b7dbd3d8403dc33e',ramdisk_id='',reservation_id='r-imbi9e46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-431508526',owner_user_name='tempest-SecurityGroupsTestJSON-431508526-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:31Z,user_data=None,user_id='341760d37e2c44209429d234ca5f01ae',uuid=52aaa8ad-df8d-46de-a710-4463776cfe6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.041 2 DEBUG nova.network.os_vif_util [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Converting VIF {"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.042 2 DEBUG nova.network.os_vif_util [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.044 2 DEBUG os_vif [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.046 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.046 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.051 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58ade8f3-49, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.052 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap58ade8f3-49, col_values=(('external_ids', {'iface-id': '58ade8f3-49a2-49fd-ad21-2626c34768be', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:8a:55', 'vm-uuid': '52aaa8ad-df8d-46de-a710-4463776cfe6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:32 np0005466012 NetworkManager[51207]: <info>  [1759407392.0544] manager: (tap58ade8f3-49): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.060 2 INFO os_vif [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49')#033[00m
Oct  2 08:16:32 np0005466012 kernel: tap58ade8f3-49: entered promiscuous mode
Oct  2 08:16:32 np0005466012 systemd-udevd[231178]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:32 np0005466012 NetworkManager[51207]: <info>  [1759407392.2537] manager: (tap58ade8f3-49): new Tun device (/org/freedesktop/NetworkManager/Devices/132)
Oct  2 08:16:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:32Z|00290|binding|INFO|Claiming lport 58ade8f3-49a2-49fd-ad21-2626c34768be for this chassis.
Oct  2 08:16:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:32Z|00291|binding|INFO|58ade8f3-49a2-49fd-ad21-2626c34768be: Claiming fa:16:3e:db:8a:55 10.100.0.7
Oct  2 08:16:32 np0005466012 NetworkManager[51207]: <info>  [1759407392.2543] device (tap58ade8f3-49): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:32 np0005466012 NetworkManager[51207]: <info>  [1759407392.2550] device (tap58ade8f3-49): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.261 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:8a:55 10.100.0.7'], port_security=['fa:16:3e:db:8a:55 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '52aaa8ad-df8d-46de-a710-4463776cfe6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ed7af923ad494ac5b7dbd3d8403dc33e', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'bd7b2d3a-1733-49ff-aecc-c24e23ffef02 f6f46a30-ca89-45c9-b4fd-d5c78d4ee0ae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08fc185f-7900-4a64-ba36-f229e6cb956d, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=58ade8f3-49a2-49fd-ad21-2626c34768be) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.264 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 58ade8f3-49a2-49fd-ad21-2626c34768be in datapath 5716ac1c-acf7-48a7-8b93-dda3a5af31f6 bound to our chassis#033[00m
Oct  2 08:16:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:32Z|00292|binding|INFO|Setting lport 58ade8f3-49a2-49fd-ad21-2626c34768be ovn-installed in OVS
Oct  2 08:16:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:32Z|00293|binding|INFO|Setting lport 58ade8f3-49a2-49fd-ad21-2626c34768be up in Southbound
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.266 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5716ac1c-acf7-48a7-8b93-dda3a5af31f6#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.278 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[004bfb78-230e-4a36-becc-a431295d4d95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.279 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5716ac1c-a1 in ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.282 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5716ac1c-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.282 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[55fc9def-24dd-4458-83e0-a4403d3420a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.283 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d77d6516-d7bb-4ac3-a361-8a2ed1bcc13e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 systemd-machined[152114]: New machine qemu-34-instance-00000050.
Oct  2 08:16:32 np0005466012 systemd[1]: Started Virtual Machine qemu-34-instance-00000050.
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.293 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[b46d97bc-d758-45e4-a824-71326ca4d6a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.311 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[44bcec13-21cf-4401-872b-ae3bd7a35616]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.337 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[7f06ab59-594b-42b7-bb9a-da9396ecd9e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.342 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5168af9a-66b3-4b06-a4e1-a9c40480e447]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 NetworkManager[51207]: <info>  [1759407392.3439] manager: (tap5716ac1c-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/133)
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.373 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[4d9f5bb6-6a05-4ad1-87a9-da7d91d050d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.376 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[184b4734-a72d-4b6d-832b-80c24d91ace4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 NetworkManager[51207]: <info>  [1759407392.3962] device (tap5716ac1c-a0): carrier: link connected
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.402 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[8ba06f75-d327-43b6-b490-972a263cf07a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.417 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f6772d55-3157-402d-8b84-b27612d6fb8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5716ac1c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:f8:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538600, 'reachable_time': 16732, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231327, 'error': None, 'target': 'ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.431 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[426b2ba7-408c-4101-96b2-a4d2bfb161c8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:f83e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 538600, 'tstamp': 538600}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231328, 'error': None, 'target': 'ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.445 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a5883bd1-fdcf-4aea-af88-195ec51b1fd0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5716ac1c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:f8:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538600, 'reachable_time': 16732, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231329, 'error': None, 'target': 'ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.481 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf4bebb-8d73-49aa-9a8c-09a5b9ccfd3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.541 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d739fc50-d840-4352-a46a-70f51885ffca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.542 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5716ac1c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.543 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.543 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5716ac1c-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:32 np0005466012 NetworkManager[51207]: <info>  [1759407392.5454] manager: (tap5716ac1c-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Oct  2 08:16:32 np0005466012 kernel: tap5716ac1c-a0: entered promiscuous mode
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.547 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5716ac1c-a0, col_values=(('external_ids', {'iface-id': 'cc8e73bf-6cd9-4487-9685-abdace89cf29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:32Z|00294|binding|INFO|Releasing lport cc8e73bf-6cd9-4487-9685-abdace89cf29 from this chassis (sb_readonly=0)
Oct  2 08:16:32 np0005466012 nova_compute[192063]: 2025-10-02 12:16:32.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.560 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5716ac1c-acf7-48a7-8b93-dda3a5af31f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5716ac1c-acf7-48a7-8b93-dda3a5af31f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.561 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8bcdc2e5-db19-48b7-ada1-841d5d3332f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.562 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-5716ac1c-acf7-48a7-8b93-dda3a5af31f6
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/5716ac1c-acf7-48a7-8b93-dda3a5af31f6.pid.haproxy
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 5716ac1c-acf7-48a7-8b93-dda3a5af31f6
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:16:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:32.564 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'env', 'PROCESS_TAG=haproxy-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5716ac1c-acf7-48a7-8b93-dda3a5af31f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:16:32 np0005466012 podman[231368]: 2025-10-02 12:16:32.980755096 +0000 UTC m=+0.083272099 container create 03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:16:33 np0005466012 podman[231368]: 2025-10-02 12:16:32.924996177 +0000 UTC m=+0.027513240 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:16:33 np0005466012 systemd[1]: Started libpod-conmon-03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e.scope.
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.044 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for 52aaa8ad-df8d-46de-a710-4463776cfe6a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.045 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407393.0442805, 52aaa8ad-df8d-46de-a710-4463776cfe6a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.045 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:16:33 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.054 2 DEBUG nova.compute.manager [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.059 2 INFO nova.virt.libvirt.driver [-] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Instance rebooted successfully.#033[00m
Oct  2 08:16:33 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bcd70765f32ec04825846196f70f722716e718ae39ee57c07ea896934ae76e11/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.060 2 DEBUG nova.compute.manager [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.068 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.075 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:33 np0005466012 podman[231368]: 2025-10-02 12:16:33.07550151 +0000 UTC m=+0.178018533 container init 03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:16:33 np0005466012 podman[231368]: 2025-10-02 12:16:33.081772413 +0000 UTC m=+0.184289416 container start 03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:16:33 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231383]: [NOTICE]   (231387) : New worker (231389) forked
Oct  2 08:16:33 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231383]: [NOTICE]   (231387) : Loading success.
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.117 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.117 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407393.0515003, 52aaa8ad-df8d-46de-a710-4463776cfe6a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.118 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] VM Started (Lifecycle Event)#033[00m
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.143 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.147 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:33 np0005466012 nova_compute[192063]: 2025-10-02 12:16:33.173 2 DEBUG oslo_concurrency.lockutils [None req-440c6427-bfb8-4216-baaf-c387ffa0c39e 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 6.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:34 np0005466012 nova_compute[192063]: 2025-10-02 12:16:34.066 2 DEBUG nova.compute.manager [req-f8cb89ab-ea7f-4b93-ae6b-c32097c385ff req-15321c27-dca2-41e2-bce0-9370313242c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received event network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:34 np0005466012 nova_compute[192063]: 2025-10-02 12:16:34.066 2 DEBUG oslo_concurrency.lockutils [req-f8cb89ab-ea7f-4b93-ae6b-c32097c385ff req-15321c27-dca2-41e2-bce0-9370313242c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:34 np0005466012 nova_compute[192063]: 2025-10-02 12:16:34.066 2 DEBUG oslo_concurrency.lockutils [req-f8cb89ab-ea7f-4b93-ae6b-c32097c385ff req-15321c27-dca2-41e2-bce0-9370313242c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:34 np0005466012 nova_compute[192063]: 2025-10-02 12:16:34.067 2 DEBUG oslo_concurrency.lockutils [req-f8cb89ab-ea7f-4b93-ae6b-c32097c385ff req-15321c27-dca2-41e2-bce0-9370313242c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:34 np0005466012 nova_compute[192063]: 2025-10-02 12:16:34.067 2 DEBUG nova.compute.manager [req-f8cb89ab-ea7f-4b93-ae6b-c32097c385ff req-15321c27-dca2-41e2-bce0-9370313242c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] No waiting events found dispatching network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:34 np0005466012 nova_compute[192063]: 2025-10-02 12:16:34.067 2 WARNING nova.compute.manager [req-f8cb89ab-ea7f-4b93-ae6b-c32097c385ff req-15321c27-dca2-41e2-bce0-9370313242c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received unexpected event network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be for instance with vm_state active and task_state None.#033[00m
Oct  2 08:16:34 np0005466012 nova_compute[192063]: 2025-10-02 12:16:34.067 2 DEBUG nova.compute.manager [req-f8cb89ab-ea7f-4b93-ae6b-c32097c385ff req-15321c27-dca2-41e2-bce0-9370313242c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received event network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:34 np0005466012 nova_compute[192063]: 2025-10-02 12:16:34.067 2 DEBUG oslo_concurrency.lockutils [req-f8cb89ab-ea7f-4b93-ae6b-c32097c385ff req-15321c27-dca2-41e2-bce0-9370313242c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:34 np0005466012 nova_compute[192063]: 2025-10-02 12:16:34.068 2 DEBUG oslo_concurrency.lockutils [req-f8cb89ab-ea7f-4b93-ae6b-c32097c385ff req-15321c27-dca2-41e2-bce0-9370313242c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:34 np0005466012 nova_compute[192063]: 2025-10-02 12:16:34.068 2 DEBUG oslo_concurrency.lockutils [req-f8cb89ab-ea7f-4b93-ae6b-c32097c385ff req-15321c27-dca2-41e2-bce0-9370313242c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:34 np0005466012 nova_compute[192063]: 2025-10-02 12:16:34.068 2 DEBUG nova.compute.manager [req-f8cb89ab-ea7f-4b93-ae6b-c32097c385ff req-15321c27-dca2-41e2-bce0-9370313242c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] No waiting events found dispatching network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:34 np0005466012 nova_compute[192063]: 2025-10-02 12:16:34.068 2 WARNING nova.compute.manager [req-f8cb89ab-ea7f-4b93-ae6b-c32097c385ff req-15321c27-dca2-41e2-bce0-9370313242c2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received unexpected event network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be for instance with vm_state active and task_state None.#033[00m
Oct  2 08:16:35 np0005466012 nova_compute[192063]: 2025-10-02 12:16:35.205 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Updating instance_info_cache with network_info: [{"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:35 np0005466012 nova_compute[192063]: 2025-10-02 12:16:35.225 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:35 np0005466012 nova_compute[192063]: 2025-10-02 12:16:35.225 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:16:36 np0005466012 nova_compute[192063]: 2025-10-02 12:16:36.180 2 DEBUG nova.compute.manager [req-661c0b4c-e296-4a64-9a8b-58d448f68c1a req-cedf0cb4-6064-49ec-91b9-d1f9ecb52cca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received event network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:36 np0005466012 nova_compute[192063]: 2025-10-02 12:16:36.180 2 DEBUG oslo_concurrency.lockutils [req-661c0b4c-e296-4a64-9a8b-58d448f68c1a req-cedf0cb4-6064-49ec-91b9-d1f9ecb52cca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:36 np0005466012 nova_compute[192063]: 2025-10-02 12:16:36.181 2 DEBUG oslo_concurrency.lockutils [req-661c0b4c-e296-4a64-9a8b-58d448f68c1a req-cedf0cb4-6064-49ec-91b9-d1f9ecb52cca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:36 np0005466012 nova_compute[192063]: 2025-10-02 12:16:36.181 2 DEBUG oslo_concurrency.lockutils [req-661c0b4c-e296-4a64-9a8b-58d448f68c1a req-cedf0cb4-6064-49ec-91b9-d1f9ecb52cca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:36 np0005466012 nova_compute[192063]: 2025-10-02 12:16:36.181 2 DEBUG nova.compute.manager [req-661c0b4c-e296-4a64-9a8b-58d448f68c1a req-cedf0cb4-6064-49ec-91b9-d1f9ecb52cca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] No waiting events found dispatching network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:36 np0005466012 nova_compute[192063]: 2025-10-02 12:16:36.182 2 WARNING nova.compute.manager [req-661c0b4c-e296-4a64-9a8b-58d448f68c1a req-cedf0cb4-6064-49ec-91b9-d1f9ecb52cca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received unexpected event network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be for instance with vm_state active and task_state None.#033[00m
Oct  2 08:16:36 np0005466012 nova_compute[192063]: 2025-10-02 12:16:36.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:37 np0005466012 nova_compute[192063]: 2025-10-02 12:16:37.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.275 2 DEBUG nova.compute.manager [req-269909d5-374e-4770-84ef-0ecac1c741aa req-ced2b64d-cd6e-4d48-8319-982e5f180e49 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received event network-changed-58ade8f3-49a2-49fd-ad21-2626c34768be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.276 2 DEBUG nova.compute.manager [req-269909d5-374e-4770-84ef-0ecac1c741aa req-ced2b64d-cd6e-4d48-8319-982e5f180e49 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Refreshing instance network info cache due to event network-changed-58ade8f3-49a2-49fd-ad21-2626c34768be. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.276 2 DEBUG oslo_concurrency.lockutils [req-269909d5-374e-4770-84ef-0ecac1c741aa req-ced2b64d-cd6e-4d48-8319-982e5f180e49 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.276 2 DEBUG oslo_concurrency.lockutils [req-269909d5-374e-4770-84ef-0ecac1c741aa req-ced2b64d-cd6e-4d48-8319-982e5f180e49 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.276 2 DEBUG nova.network.neutron [req-269909d5-374e-4770-84ef-0ecac1c741aa req-ced2b64d-cd6e-4d48-8319-982e5f180e49 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Refreshing network info cache for port 58ade8f3-49a2-49fd-ad21-2626c34768be _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.698 2 DEBUG oslo_concurrency.lockutils [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquiring lock "52aaa8ad-df8d-46de-a710-4463776cfe6a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.698 2 DEBUG oslo_concurrency.lockutils [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.699 2 DEBUG oslo_concurrency.lockutils [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquiring lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.699 2 DEBUG oslo_concurrency.lockutils [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.699 2 DEBUG oslo_concurrency.lockutils [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.763 2 INFO nova.compute.manager [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Terminating instance#033[00m
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.810 2 DEBUG nova.compute.manager [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:16:38 np0005466012 kernel: tap58ade8f3-49 (unregistering): left promiscuous mode
Oct  2 08:16:38 np0005466012 NetworkManager[51207]: <info>  [1759407398.8409] device (tap58ade8f3-49): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:38Z|00295|binding|INFO|Releasing lport 58ade8f3-49a2-49fd-ad21-2626c34768be from this chassis (sb_readonly=0)
Oct  2 08:16:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:38Z|00296|binding|INFO|Setting lport 58ade8f3-49a2-49fd-ad21-2626c34768be down in Southbound
Oct  2 08:16:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:38Z|00297|binding|INFO|Removing iface tap58ade8f3-49 ovn-installed in OVS
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:38 np0005466012 nova_compute[192063]: 2025-10-02 12:16:38.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:38 np0005466012 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000050.scope: Deactivated successfully.
Oct  2 08:16:38 np0005466012 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000050.scope: Consumed 6.745s CPU time.
Oct  2 08:16:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:38.894 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:8a:55 10.100.0.7'], port_security=['fa:16:3e:db:8a:55 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '52aaa8ad-df8d-46de-a710-4463776cfe6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ed7af923ad494ac5b7dbd3d8403dc33e', 'neutron:revision_number': '8', 'neutron:security_group_ids': '40790a8d-32eb-4879-bca6-24ae61169432 bd7b2d3a-1733-49ff-aecc-c24e23ffef02 f6f46a30-ca89-45c9-b4fd-d5c78d4ee0ae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08fc185f-7900-4a64-ba36-f229e6cb956d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=58ade8f3-49a2-49fd-ad21-2626c34768be) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:38.896 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 58ade8f3-49a2-49fd-ad21-2626c34768be in datapath 5716ac1c-acf7-48a7-8b93-dda3a5af31f6 unbound from our chassis#033[00m
Oct  2 08:16:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:38.897 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5716ac1c-acf7-48a7-8b93-dda3a5af31f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:16:38 np0005466012 systemd-machined[152114]: Machine qemu-34-instance-00000050 terminated.
Oct  2 08:16:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:38.898 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2fa5bd5a-3a06-45ab-89e0-17e36730b4c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:38.899 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6 namespace which is not needed anymore#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.063 2 INFO nova.virt.libvirt.driver [-] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Instance destroyed successfully.#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.064 2 DEBUG nova.objects.instance [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lazy-loading 'resources' on Instance uuid 52aaa8ad-df8d-46de-a710-4463776cfe6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.115 2 DEBUG nova.virt.libvirt.vif [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-992705125',display_name='tempest-SecurityGroupsTestJSON-server-992705125',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-992705125',id=80,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ed7af923ad494ac5b7dbd3d8403dc33e',ramdisk_id='',reservation_id='r-imbi9e46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-431508526',owner_user_name='tempest-SecurityGroupsTestJSON-431508526-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:16:33Z,user_data=None,user_id='341760d37e2c44209429d234ca5f01ae',uuid=52aaa8ad-df8d-46de-a710-4463776cfe6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.115 2 DEBUG nova.network.os_vif_util [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Converting VIF {"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.116 2 DEBUG nova.network.os_vif_util [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.116 2 DEBUG os_vif [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.118 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58ade8f3-49, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.124 2 INFO os_vif [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:8a:55,bridge_name='br-int',has_traffic_filtering=True,id=58ade8f3-49a2-49fd-ad21-2626c34768be,network=Network(5716ac1c-acf7-48a7-8b93-dda3a5af31f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58ade8f3-49')#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.124 2 INFO nova.virt.libvirt.driver [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Deleting instance files /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a_del#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.125 2 INFO nova.virt.libvirt.driver [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Deletion of /var/lib/nova/instances/52aaa8ad-df8d-46de-a710-4463776cfe6a_del complete#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.276 2 INFO nova.compute.manager [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.277 2 DEBUG oslo.service.loopingcall [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.277 2 DEBUG nova.compute.manager [-] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:16:39 np0005466012 nova_compute[192063]: 2025-10-02 12:16:39.278 2 DEBUG nova.network.neutron [-] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:16:39 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231383]: [NOTICE]   (231387) : haproxy version is 2.8.14-c23fe91
Oct  2 08:16:39 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231383]: [NOTICE]   (231387) : path to executable is /usr/sbin/haproxy
Oct  2 08:16:39 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231383]: [WARNING]  (231387) : Exiting Master process...
Oct  2 08:16:39 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231383]: [ALERT]    (231387) : Current worker (231389) exited with code 143 (Terminated)
Oct  2 08:16:39 np0005466012 neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6[231383]: [WARNING]  (231387) : All workers exited. Exiting... (0)
Oct  2 08:16:39 np0005466012 systemd[1]: libpod-03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e.scope: Deactivated successfully.
Oct  2 08:16:39 np0005466012 podman[231422]: 2025-10-02 12:16:39.317618862 +0000 UTC m=+0.337641177 container died 03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:16:39 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e-userdata-shm.mount: Deactivated successfully.
Oct  2 08:16:39 np0005466012 systemd[1]: var-lib-containers-storage-overlay-bcd70765f32ec04825846196f70f722716e718ae39ee57c07ea896934ae76e11-merged.mount: Deactivated successfully.
Oct  2 08:16:39 np0005466012 podman[231422]: 2025-10-02 12:16:39.887442235 +0000 UTC m=+0.907464550 container cleanup 03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.031 2 DEBUG nova.network.neutron [req-269909d5-374e-4770-84ef-0ecac1c741aa req-ced2b64d-cd6e-4d48-8319-982e5f180e49 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Updated VIF entry in instance network info cache for port 58ade8f3-49a2-49fd-ad21-2626c34768be. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.032 2 DEBUG nova.network.neutron [req-269909d5-374e-4770-84ef-0ecac1c741aa req-ced2b64d-cd6e-4d48-8319-982e5f180e49 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Updating instance_info_cache with network_info: [{"id": "58ade8f3-49a2-49fd-ad21-2626c34768be", "address": "fa:16:3e:db:8a:55", "network": {"id": "5716ac1c-acf7-48a7-8b93-dda3a5af31f6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1571059342-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed7af923ad494ac5b7dbd3d8403dc33e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58ade8f3-49", "ovs_interfaceid": "58ade8f3-49a2-49fd-ad21-2626c34768be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.055 2 DEBUG oslo_concurrency.lockutils [req-269909d5-374e-4770-84ef-0ecac1c741aa req-ced2b64d-cd6e-4d48-8319-982e5f180e49 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-52aaa8ad-df8d-46de-a710-4463776cfe6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:40 np0005466012 podman[231469]: 2025-10-02 12:16:40.246360137 +0000 UTC m=+0.337846022 container remove 03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:16:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:40.254 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9869014d-a290-4a52-863a-e40290c77a2a]: (4, ('Thu Oct  2 12:16:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6 (03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e)\n03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e\nThu Oct  2 12:16:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6 (03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e)\n03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:40.256 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6d6baa15-f884-49a3-bb0c-d55ce8eab6ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:40.257 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5716ac1c-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:40 np0005466012 kernel: tap5716ac1c-a0: left promiscuous mode
Oct  2 08:16:40 np0005466012 systemd[1]: libpod-conmon-03166d75ad8b0b8ba79c5bf85e2d0edb3d93a594f85673a0c35b0d189e7ffb0e.scope: Deactivated successfully.
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:40.272 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[77f7e53f-87f5-4beb-ab6d-e87cc4222263]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:40.299 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bae89681-b843-4579-85e9-22e518c944de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:40.300 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[503323a6-103d-462f-aca3-d70f37abc695]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:40.317 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[741c3d03-8618-4522-a10e-edce1b805fc0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538594, 'reachable_time': 16726, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231487, 'error': None, 'target': 'ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:40.319 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5716ac1c-acf7-48a7-8b93-dda3a5af31f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:16:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:40.320 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[13e5ffd8-9685-465d-bacc-a924179aa350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:40 np0005466012 systemd[1]: run-netns-ovnmeta\x2d5716ac1c\x2dacf7\x2d48a7\x2d8b93\x2ddda3a5af31f6.mount: Deactivated successfully.
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.487 2 DEBUG nova.compute.manager [req-e6df6192-0386-4eb1-b203-b4257855f899 req-fa7d1241-30e3-4f83-87d9-144e735d2b41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received event network-vif-unplugged-58ade8f3-49a2-49fd-ad21-2626c34768be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.488 2 DEBUG oslo_concurrency.lockutils [req-e6df6192-0386-4eb1-b203-b4257855f899 req-fa7d1241-30e3-4f83-87d9-144e735d2b41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.488 2 DEBUG oslo_concurrency.lockutils [req-e6df6192-0386-4eb1-b203-b4257855f899 req-fa7d1241-30e3-4f83-87d9-144e735d2b41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.488 2 DEBUG oslo_concurrency.lockutils [req-e6df6192-0386-4eb1-b203-b4257855f899 req-fa7d1241-30e3-4f83-87d9-144e735d2b41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.489 2 DEBUG nova.compute.manager [req-e6df6192-0386-4eb1-b203-b4257855f899 req-fa7d1241-30e3-4f83-87d9-144e735d2b41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] No waiting events found dispatching network-vif-unplugged-58ade8f3-49a2-49fd-ad21-2626c34768be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.489 2 DEBUG nova.compute.manager [req-e6df6192-0386-4eb1-b203-b4257855f899 req-fa7d1241-30e3-4f83-87d9-144e735d2b41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received event network-vif-unplugged-58ade8f3-49a2-49fd-ad21-2626c34768be for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.489 2 DEBUG nova.compute.manager [req-e6df6192-0386-4eb1-b203-b4257855f899 req-fa7d1241-30e3-4f83-87d9-144e735d2b41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received event network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.490 2 DEBUG oslo_concurrency.lockutils [req-e6df6192-0386-4eb1-b203-b4257855f899 req-fa7d1241-30e3-4f83-87d9-144e735d2b41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.490 2 DEBUG oslo_concurrency.lockutils [req-e6df6192-0386-4eb1-b203-b4257855f899 req-fa7d1241-30e3-4f83-87d9-144e735d2b41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.490 2 DEBUG oslo_concurrency.lockutils [req-e6df6192-0386-4eb1-b203-b4257855f899 req-fa7d1241-30e3-4f83-87d9-144e735d2b41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.491 2 DEBUG nova.compute.manager [req-e6df6192-0386-4eb1-b203-b4257855f899 req-fa7d1241-30e3-4f83-87d9-144e735d2b41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] No waiting events found dispatching network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.491 2 WARNING nova.compute.manager [req-e6df6192-0386-4eb1-b203-b4257855f899 req-fa7d1241-30e3-4f83-87d9-144e735d2b41 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received unexpected event network-vif-plugged-58ade8f3-49a2-49fd-ad21-2626c34768be for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.727 2 DEBUG nova.network.neutron [-] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.749 2 INFO nova.compute.manager [-] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Took 1.47 seconds to deallocate network for instance.#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.808 2 DEBUG nova.compute.manager [req-5e638256-411d-4cd7-8d3e-0056e1cb256e req-38241a71-2400-4b80-b119-24ecdc76fb48 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Received event network-vif-deleted-58ade8f3-49a2-49fd-ad21-2626c34768be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.824 2 DEBUG oslo_concurrency.lockutils [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.825 2 DEBUG oslo_concurrency.lockutils [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.901 2 DEBUG nova.compute.provider_tree [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.920 2 DEBUG nova.scheduler.client.report [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:40.941 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:16:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:40.943 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.944 2 DEBUG oslo_concurrency.lockutils [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:16:40 np0005466012 nova_compute[192063]: 2025-10-02 12:16:40.978 2 INFO nova.scheduler.client.report [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Deleted allocations for instance 52aaa8ad-df8d-46de-a710-4463776cfe6a
Oct  2 08:16:41 np0005466012 nova_compute[192063]: 2025-10-02 12:16:41.072 2 DEBUG oslo_concurrency.lockutils [None req-65a26f5a-46a4-4a03-bdcd-e9c1e57924e7 341760d37e2c44209429d234ca5f01ae ed7af923ad494ac5b7dbd3d8403dc33e - - default default] Lock "52aaa8ad-df8d-46de-a710-4463776cfe6a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.374s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:16:41 np0005466012 nova_compute[192063]: 2025-10-02 12:16:41.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:16:41 np0005466012 nova_compute[192063]: 2025-10-02 12:16:41.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.250 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquiring lock "c0859f36-89c4-4534-aba5-d5373464c64f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.251 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.266 2 DEBUG nova.compute.manager [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.367 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.367 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.372 2 DEBUG nova.virt.hardware [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.372 2 INFO nova.compute.claims [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Claim successful on node compute-1.ctlplane.example.com
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.512 2 DEBUG nova.compute.provider_tree [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.531 2 DEBUG nova.scheduler.client.report [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.566 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.567 2 DEBUG nova.compute.manager [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.620 2 DEBUG nova.compute.manager [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.620 2 DEBUG nova.network.neutron [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.647 2 INFO nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.677 2 DEBUG nova.compute.manager [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.830 2 DEBUG nova.compute.manager [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.832 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.833 2 INFO nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Creating image(s)
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.834 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquiring lock "/var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.835 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "/var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.836 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "/var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.862 2 DEBUG oslo_concurrency.processutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.887 2 DEBUG nova.policy [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.930 2 DEBUG oslo_concurrency.processutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.931 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.932 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:16:42 np0005466012 nova_compute[192063]: 2025-10-02 12:16:42.956 2 DEBUG oslo_concurrency.processutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.033 2 DEBUG oslo_concurrency.processutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.034 2 DEBUG oslo_concurrency.processutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.207 2 DEBUG oslo_concurrency.processutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk 1073741824" returned: 0 in 0.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.208 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.209 2 DEBUG oslo_concurrency.processutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.260 2 DEBUG oslo_concurrency.processutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.261 2 DEBUG nova.virt.disk.api [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Checking if we can resize image /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.262 2 DEBUG oslo_concurrency.processutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.315 2 DEBUG oslo_concurrency.processutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.316 2 DEBUG nova.virt.disk.api [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Cannot resize image /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.316 2 DEBUG nova.objects.instance [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lazy-loading 'migration_context' on Instance uuid c0859f36-89c4-4534-aba5-d5373464c64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.334 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.335 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Ensure instance console log exists: /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.335 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.335 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.335 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:16:43 np0005466012 nova_compute[192063]: 2025-10-02 12:16:43.876 2 DEBUG nova.network.neutron [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Successfully created port: d0f247a6-f336-4b2f-a423-7b03a80a5228 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:16:44 np0005466012 nova_compute[192063]: 2025-10-02 12:16:44.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:16:44 np0005466012 podman[231503]: 2025-10-02 12:16:44.158746537 +0000 UTC m=+0.066485734 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:16:44 np0005466012 podman[231504]: 2025-10-02 12:16:44.173075103 +0000 UTC m=+0.089162691 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:16:44 np0005466012 nova_compute[192063]: 2025-10-02 12:16:44.838 2 DEBUG nova.network.neutron [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Successfully updated port: d0f247a6-f336-4b2f-a423-7b03a80a5228 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:16:44 np0005466012 nova_compute[192063]: 2025-10-02 12:16:44.901 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquiring lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:16:44 np0005466012 nova_compute[192063]: 2025-10-02 12:16:44.901 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquired lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:16:44 np0005466012 nova_compute[192063]: 2025-10-02 12:16:44.902 2 DEBUG nova.network.neutron [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:16:45 np0005466012 nova_compute[192063]: 2025-10-02 12:16:45.225 2 DEBUG nova.compute.manager [req-eb37f211-2e99-4990-92d5-0f73e7a4a42c req-7ddc4c17-1f28-4a52-9349-bf48796ee0b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-changed-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:16:45 np0005466012 nova_compute[192063]: 2025-10-02 12:16:45.225 2 DEBUG nova.compute.manager [req-eb37f211-2e99-4990-92d5-0f73e7a4a42c req-7ddc4c17-1f28-4a52-9349-bf48796ee0b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Refreshing instance network info cache due to event network-changed-d0f247a6-f336-4b2f-a423-7b03a80a5228. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:16:45 np0005466012 nova_compute[192063]: 2025-10-02 12:16:45.225 2 DEBUG oslo_concurrency.lockutils [req-eb37f211-2e99-4990-92d5-0f73e7a4a42c req-7ddc4c17-1f28-4a52-9349-bf48796ee0b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:16:45 np0005466012 nova_compute[192063]: 2025-10-02 12:16:45.226 2 DEBUG nova.network.neutron [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.381 2 DEBUG nova.network.neutron [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updating instance_info_cache with network_info: [{"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.556 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Releasing lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.556 2 DEBUG nova.compute.manager [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Instance network_info: |[{"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.557 2 DEBUG oslo_concurrency.lockutils [req-eb37f211-2e99-4990-92d5-0f73e7a4a42c req-7ddc4c17-1f28-4a52-9349-bf48796ee0b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.558 2 DEBUG nova.network.neutron [req-eb37f211-2e99-4990-92d5-0f73e7a4a42c req-7ddc4c17-1f28-4a52-9349-bf48796ee0b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Refreshing network info cache for port d0f247a6-f336-4b2f-a423-7b03a80a5228 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.563 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Start _get_guest_xml network_info=[{"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.568 2 WARNING nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.574 2 DEBUG nova.virt.libvirt.host [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.575 2 DEBUG nova.virt.libvirt.host [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.580 2 DEBUG nova.virt.libvirt.host [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.580 2 DEBUG nova.virt.libvirt.host [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.582 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.583 2 DEBUG nova.virt.hardware [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.583 2 DEBUG nova.virt.hardware [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.584 2 DEBUG nova.virt.hardware [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.584 2 DEBUG nova.virt.hardware [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.584 2 DEBUG nova.virt.hardware [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.585 2 DEBUG nova.virt.hardware [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.585 2 DEBUG nova.virt.hardware [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.586 2 DEBUG nova.virt.hardware [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.586 2 DEBUG nova.virt.hardware [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.586 2 DEBUG nova.virt.hardware [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.586 2 DEBUG nova.virt.hardware [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.591 2 DEBUG nova.virt.libvirt.vif [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-354540198',display_name='tempest-ServerRescueTestJSONUnderV235-server-354540198',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-354540198',id=84,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='999a48d299e548dfa3ec622cf07f7017',ramdisk_id='',reservation_id='r-rovb30q3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-125577537',owner_user_name='temp
est-ServerRescueTestJSONUnderV235-125577537-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:42Z,user_data=None,user_id='ebfa5d25510d4f4f8b7c9a6cf0b8c9b1',uuid=c0859f36-89c4-4534-aba5-d5373464c64f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.591 2 DEBUG nova.network.os_vif_util [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Converting VIF {"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.592 2 DEBUG nova.network.os_vif_util [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:a3:d0,bridge_name='br-int',has_traffic_filtering=True,id=d0f247a6-f336-4b2f-a423-7b03a80a5228,network=Network(d3cafb9b-5ea3-48cb-b4f5-616692db21f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0f247a6-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.593 2 DEBUG nova.objects.instance [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lazy-loading 'pci_devices' on Instance uuid c0859f36-89c4-4534-aba5-d5373464c64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.633 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  <uuid>c0859f36-89c4-4534-aba5-d5373464c64f</uuid>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  <name>instance-00000054</name>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-354540198</nova:name>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:16:46</nova:creationTime>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:16:46 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:        <nova:user uuid="ebfa5d25510d4f4f8b7c9a6cf0b8c9b1">tempest-ServerRescueTestJSONUnderV235-125577537-project-member</nova:user>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:        <nova:project uuid="999a48d299e548dfa3ec622cf07f7017">tempest-ServerRescueTestJSONUnderV235-125577537</nova:project>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:        <nova:port uuid="d0f247a6-f336-4b2f-a423-7b03a80a5228">
Oct  2 08:16:46 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <entry name="serial">c0859f36-89c4-4534-aba5-d5373464c64f</entry>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <entry name="uuid">c0859f36-89c4-4534-aba5-d5373464c64f</entry>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.config"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:5a:a3:d0"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <target dev="tapd0f247a6-f3"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/console.log" append="off"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:16:46 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:16:46 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:16:46 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:16:46 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.634 2 DEBUG nova.compute.manager [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Preparing to wait for external event network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.635 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquiring lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.635 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.635 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.636 2 DEBUG nova.virt.libvirt.vif [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-354540198',display_name='tempest-ServerRescueTestJSONUnderV235-server-354540198',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-354540198',id=84,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='999a48d299e548dfa3ec622cf07f7017',ramdisk_id='',reservation_id='r-rovb30q3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-125577537',owner_user_
name='tempest-ServerRescueTestJSONUnderV235-125577537-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:42Z,user_data=None,user_id='ebfa5d25510d4f4f8b7c9a6cf0b8c9b1',uuid=c0859f36-89c4-4534-aba5-d5373464c64f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.637 2 DEBUG nova.network.os_vif_util [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Converting VIF {"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.638 2 DEBUG nova.network.os_vif_util [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:a3:d0,bridge_name='br-int',has_traffic_filtering=True,id=d0f247a6-f336-4b2f-a423-7b03a80a5228,network=Network(d3cafb9b-5ea3-48cb-b4f5-616692db21f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0f247a6-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.638 2 DEBUG os_vif [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:a3:d0,bridge_name='br-int',has_traffic_filtering=True,id=d0f247a6-f336-4b2f-a423-7b03a80a5228,network=Network(d3cafb9b-5ea3-48cb-b4f5-616692db21f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0f247a6-f3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.639 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.640 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.645 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd0f247a6-f3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.646 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd0f247a6-f3, col_values=(('external_ids', {'iface-id': 'd0f247a6-f336-4b2f-a423-7b03a80a5228', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5a:a3:d0', 'vm-uuid': 'c0859f36-89c4-4534-aba5-d5373464c64f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:46 np0005466012 NetworkManager[51207]: <info>  [1759407406.6486] manager: (tapd0f247a6-f3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.654 2 INFO os_vif [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:a3:d0,bridge_name='br-int',has_traffic_filtering=True,id=d0f247a6-f336-4b2f-a423-7b03a80a5228,network=Network(d3cafb9b-5ea3-48cb-b4f5-616692db21f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0f247a6-f3')#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.859 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.859 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.863 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] No VIF found with MAC fa:16:3e:5a:a3:d0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:16:46 np0005466012 nova_compute[192063]: 2025-10-02 12:16:46.864 2 INFO nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Using config drive#033[00m
Oct  2 08:16:47 np0005466012 nova_compute[192063]: 2025-10-02 12:16:47.442 2 INFO nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Creating config drive at /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.config#033[00m
Oct  2 08:16:47 np0005466012 nova_compute[192063]: 2025-10-02 12:16:47.452 2 DEBUG oslo_concurrency.processutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpegs02dxd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:47 np0005466012 nova_compute[192063]: 2025-10-02 12:16:47.578 2 DEBUG oslo_concurrency.processutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpegs02dxd" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:47 np0005466012 kernel: tapd0f247a6-f3: entered promiscuous mode
Oct  2 08:16:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:47Z|00298|binding|INFO|Claiming lport d0f247a6-f336-4b2f-a423-7b03a80a5228 for this chassis.
Oct  2 08:16:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:47Z|00299|binding|INFO|d0f247a6-f336-4b2f-a423-7b03a80a5228: Claiming fa:16:3e:5a:a3:d0 10.100.0.10
Oct  2 08:16:47 np0005466012 nova_compute[192063]: 2025-10-02 12:16:47.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:47 np0005466012 NetworkManager[51207]: <info>  [1759407407.6625] manager: (tapd0f247a6-f3): new Tun device (/org/freedesktop/NetworkManager/Devices/136)
Oct  2 08:16:47 np0005466012 nova_compute[192063]: 2025-10-02 12:16:47.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:47.682 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:a3:d0 10.100.0.10'], port_security=['fa:16:3e:5a:a3:d0 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d3cafb9b-5ea3-48cb-b4f5-616692db21f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '999a48d299e548dfa3ec622cf07f7017', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b61602f4-509b-434a-8fef-8040306ea771', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6778b39e-f647-49e5-a839-dee5291ea3a3, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d0f247a6-f336-4b2f-a423-7b03a80a5228) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:16:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:47.684 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d0f247a6-f336-4b2f-a423-7b03a80a5228 in datapath d3cafb9b-5ea3-48cb-b4f5-616692db21f6 bound to our chassis#033[00m
Oct  2 08:16:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:47.685 103246 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d3cafb9b-5ea3-48cb-b4f5-616692db21f6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:16:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:47.687 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f11c34-0878-499f-a94b-0fac828feaf8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:16:47 np0005466012 systemd-udevd[231584]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:16:47 np0005466012 systemd-machined[152114]: New machine qemu-35-instance-00000054.
Oct  2 08:16:47 np0005466012 systemd[1]: Started Virtual Machine qemu-35-instance-00000054.
Oct  2 08:16:47 np0005466012 nova_compute[192063]: 2025-10-02 12:16:47.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:47 np0005466012 NetworkManager[51207]: <info>  [1759407407.7254] device (tapd0f247a6-f3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:16:47 np0005466012 NetworkManager[51207]: <info>  [1759407407.7262] device (tapd0f247a6-f3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:16:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:47Z|00300|binding|INFO|Setting lport d0f247a6-f336-4b2f-a423-7b03a80a5228 ovn-installed in OVS
Oct  2 08:16:47 np0005466012 podman[231563]: 2025-10-02 12:16:47.725926083 +0000 UTC m=+0.070086794 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:16:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:16:47Z|00301|binding|INFO|Setting lport d0f247a6-f336-4b2f-a423-7b03a80a5228 up in Southbound
Oct  2 08:16:47 np0005466012 nova_compute[192063]: 2025-10-02 12:16:47.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:48 np0005466012 nova_compute[192063]: 2025-10-02 12:16:48.025 2 DEBUG nova.compute.manager [req-9af4a7f9-6d00-4116-b5f3-f507a3928492 req-0abf0640-9934-49bf-b804-4f0cf6df2a01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:48 np0005466012 nova_compute[192063]: 2025-10-02 12:16:48.026 2 DEBUG oslo_concurrency.lockutils [req-9af4a7f9-6d00-4116-b5f3-f507a3928492 req-0abf0640-9934-49bf-b804-4f0cf6df2a01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:48 np0005466012 nova_compute[192063]: 2025-10-02 12:16:48.026 2 DEBUG oslo_concurrency.lockutils [req-9af4a7f9-6d00-4116-b5f3-f507a3928492 req-0abf0640-9934-49bf-b804-4f0cf6df2a01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:48 np0005466012 nova_compute[192063]: 2025-10-02 12:16:48.026 2 DEBUG oslo_concurrency.lockutils [req-9af4a7f9-6d00-4116-b5f3-f507a3928492 req-0abf0640-9934-49bf-b804-4f0cf6df2a01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:48 np0005466012 nova_compute[192063]: 2025-10-02 12:16:48.026 2 DEBUG nova.compute.manager [req-9af4a7f9-6d00-4116-b5f3-f507a3928492 req-0abf0640-9934-49bf-b804-4f0cf6df2a01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Processing event network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:16:48 np0005466012 nova_compute[192063]: 2025-10-02 12:16:48.152 2 DEBUG nova.network.neutron [req-eb37f211-2e99-4990-92d5-0f73e7a4a42c req-7ddc4c17-1f28-4a52-9349-bf48796ee0b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updated VIF entry in instance network info cache for port d0f247a6-f336-4b2f-a423-7b03a80a5228. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:16:48 np0005466012 nova_compute[192063]: 2025-10-02 12:16:48.153 2 DEBUG nova.network.neutron [req-eb37f211-2e99-4990-92d5-0f73e7a4a42c req-7ddc4c17-1f28-4a52-9349-bf48796ee0b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updating instance_info_cache with network_info: [{"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:48 np0005466012 nova_compute[192063]: 2025-10-02 12:16:48.175 2 DEBUG oslo_concurrency.lockutils [req-eb37f211-2e99-4990-92d5-0f73e7a4a42c req-7ddc4c17-1f28-4a52-9349-bf48796ee0b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:50 np0005466012 podman[231602]: 2025-10-02 12:16:50.062601497 +0000 UTC m=+0.097269725 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, 
tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.146 2 DEBUG nova.compute.manager [req-b5583204-b3e0-46f7-bae9-8f16c2a3980e req-00334b89-b381-438d-8061-0dd0aee3c987 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.146 2 DEBUG oslo_concurrency.lockutils [req-b5583204-b3e0-46f7-bae9-8f16c2a3980e req-00334b89-b381-438d-8061-0dd0aee3c987 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.147 2 DEBUG oslo_concurrency.lockutils [req-b5583204-b3e0-46f7-bae9-8f16c2a3980e req-00334b89-b381-438d-8061-0dd0aee3c987 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.147 2 DEBUG oslo_concurrency.lockutils [req-b5583204-b3e0-46f7-bae9-8f16c2a3980e req-00334b89-b381-438d-8061-0dd0aee3c987 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.147 2 DEBUG nova.compute.manager [req-b5583204-b3e0-46f7-bae9-8f16c2a3980e req-00334b89-b381-438d-8061-0dd0aee3c987 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] No waiting events found dispatching network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.147 2 WARNING nova.compute.manager [req-b5583204-b3e0-46f7-bae9-8f16c2a3980e req-00334b89-b381-438d-8061-0dd0aee3c987 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received unexpected event network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.437 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407410.4372165, c0859f36-89c4-4534-aba5-d5373464c64f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.438 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] VM Started (Lifecycle Event)#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.439 2 DEBUG nova.compute.manager [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.449 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.455 2 INFO nova.virt.libvirt.driver [-] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Instance spawned successfully.#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.455 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.457 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.462 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.477 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.478 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.478 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.479 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.480 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.481 2 DEBUG nova.virt.libvirt.driver [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.487 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.487 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407410.437427, c0859f36-89c4-4534-aba5-d5373464c64f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.488 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.513 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.518 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407410.44386, c0859f36-89c4-4534-aba5-d5373464c64f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.518 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.537 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.541 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.564 2 INFO nova.compute.manager [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Took 7.73 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.565 2 DEBUG nova.compute.manager [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.570 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.758 2 INFO nova.compute.manager [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Took 8.42 seconds to build instance.#033[00m
Oct  2 08:16:50 np0005466012 nova_compute[192063]: 2025-10-02 12:16:50.821 2 DEBUG oslo_concurrency.lockutils [None req-66b06632-3aa7-474b-895c-0b3d740e2db8 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.571s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:16:50.946 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:51 np0005466012 nova_compute[192063]: 2025-10-02 12:16:51.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:51 np0005466012 nova_compute[192063]: 2025-10-02 12:16:51.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.259 2 INFO nova.compute.manager [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Rescuing#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.260 2 DEBUG oslo_concurrency.lockutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquiring lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.260 2 DEBUG oslo_concurrency.lockutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquired lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.260 2 DEBUG nova.network.neutron [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.461 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.462 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.483 2 DEBUG nova.compute.manager [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.585 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.585 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.591 2 DEBUG nova.virt.hardware [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.592 2 INFO nova.compute.claims [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.780 2 DEBUG nova.compute.provider_tree [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.798 2 DEBUG nova.scheduler.client.report [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.821 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.822 2 DEBUG nova.compute.manager [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.875 2 DEBUG nova.compute.manager [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.876 2 DEBUG nova.network.neutron [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.896 2 INFO nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:16:53 np0005466012 nova_compute[192063]: 2025-10-02 12:16:53.944 2 DEBUG nova.compute.manager [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.061 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407399.060986, 52aaa8ad-df8d-46de-a710-4463776cfe6a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.062 2 INFO nova.compute.manager [-] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.083 2 DEBUG nova.compute.manager [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.084 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.084 2 INFO nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Creating image(s)#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.085 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "/var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.085 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "/var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.086 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "/var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.099 2 DEBUG nova.policy [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.101 2 DEBUG nova.compute.manager [None req-75e4dd18-9945-438a-9243-c1b13c663620 - - - - - -] [instance: 52aaa8ad-df8d-46de-a710-4463776cfe6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.103 2 DEBUG oslo_concurrency.processutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.159 2 DEBUG oslo_concurrency.processutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.160 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "d7f074efa852dc950deac120296f6eecf48a40d2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.161 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.171 2 DEBUG oslo_concurrency.processutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.245 2 DEBUG oslo_concurrency.processutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.247 2 DEBUG oslo_concurrency.processutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.510 2 DEBUG oslo_concurrency.processutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk 1073741824" returned: 0 in 0.262s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.510 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.511 2 DEBUG oslo_concurrency.processutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.600 2 DEBUG oslo_concurrency.processutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.601 2 DEBUG nova.virt.disk.api [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Checking if we can resize image /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.601 2 DEBUG oslo_concurrency.processutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.667 2 DEBUG oslo_concurrency.processutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.668 2 DEBUG nova.virt.disk.api [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Cannot resize image /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.669 2 DEBUG nova.objects.instance [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lazy-loading 'migration_context' on Instance uuid 6f5eb8d8-6d7c-4666-ace5-49baf3909221 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.691 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.691 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Ensure instance console log exists: /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.692 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.692 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.692 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.770 2 DEBUG nova.network.neutron [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updating instance_info_cache with network_info: [{"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.805 2 DEBUG oslo_concurrency.lockutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Releasing lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:54 np0005466012 nova_compute[192063]: 2025-10-02 12:16:54.874 2 DEBUG nova.network.neutron [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Successfully created port: 5b2477aa-8155-4265-b3e5-0fd41f6b2d83 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:16:55 np0005466012 nova_compute[192063]: 2025-10-02 12:16:55.127 2 DEBUG nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:16:56 np0005466012 podman[231640]: 2025-10-02 12:16:56.150496814 +0000 UTC m=+0.070459465 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:16:56 np0005466012 podman[231641]: 2025-10-02 12:16:56.17245638 +0000 UTC m=+0.086861388 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm, vcs-type=git, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:16:56 np0005466012 nova_compute[192063]: 2025-10-02 12:16:56.196 2 DEBUG nova.network.neutron [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Successfully updated port: 5b2477aa-8155-4265-b3e5-0fd41f6b2d83 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:16:56 np0005466012 nova_compute[192063]: 2025-10-02 12:16:56.220 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "refresh_cache-6f5eb8d8-6d7c-4666-ace5-49baf3909221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:56 np0005466012 nova_compute[192063]: 2025-10-02 12:16:56.220 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquired lock "refresh_cache-6f5eb8d8-6d7c-4666-ace5-49baf3909221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:56 np0005466012 nova_compute[192063]: 2025-10-02 12:16:56.220 2 DEBUG nova.network.neutron [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:16:56 np0005466012 nova_compute[192063]: 2025-10-02 12:16:56.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:56 np0005466012 nova_compute[192063]: 2025-10-02 12:16:56.456 2 DEBUG nova.compute.manager [req-b00212be-71f3-4a78-8640-ab87c6627f90 req-8e80d1fb-9afb-4e83-8d54-d8633242aaa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Received event network-changed-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:16:56 np0005466012 nova_compute[192063]: 2025-10-02 12:16:56.456 2 DEBUG nova.compute.manager [req-b00212be-71f3-4a78-8640-ab87c6627f90 req-8e80d1fb-9afb-4e83-8d54-d8633242aaa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Refreshing instance network info cache due to event network-changed-5b2477aa-8155-4265-b3e5-0fd41f6b2d83. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:16:56 np0005466012 nova_compute[192063]: 2025-10-02 12:16:56.456 2 DEBUG oslo_concurrency.lockutils [req-b00212be-71f3-4a78-8640-ab87c6627f90 req-8e80d1fb-9afb-4e83-8d54-d8633242aaa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-6f5eb8d8-6d7c-4666-ace5-49baf3909221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:16:56 np0005466012 nova_compute[192063]: 2025-10-02 12:16:56.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:57 np0005466012 nova_compute[192063]: 2025-10-02 12:16:57.112 2 DEBUG nova.network.neutron [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.007 2 DEBUG nova.network.neutron [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Updating instance_info_cache with network_info: [{"id": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "address": "fa:16:3e:ff:03:1f", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b2477aa-81", "ovs_interfaceid": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.040 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Releasing lock "refresh_cache-6f5eb8d8-6d7c-4666-ace5-49baf3909221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.040 2 DEBUG nova.compute.manager [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Instance network_info: |[{"id": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "address": "fa:16:3e:ff:03:1f", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b2477aa-81", "ovs_interfaceid": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.041 2 DEBUG oslo_concurrency.lockutils [req-b00212be-71f3-4a78-8640-ab87c6627f90 req-8e80d1fb-9afb-4e83-8d54-d8633242aaa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-6f5eb8d8-6d7c-4666-ace5-49baf3909221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.041 2 DEBUG nova.network.neutron [req-b00212be-71f3-4a78-8640-ab87c6627f90 req-8e80d1fb-9afb-4e83-8d54-d8633242aaa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Refreshing network info cache for port 5b2477aa-8155-4265-b3e5-0fd41f6b2d83 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.044 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Start _get_guest_xml network_info=[{"id": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "address": "fa:16:3e:ff:03:1f", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b2477aa-81", "ovs_interfaceid": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.048 2 WARNING nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.058 2 DEBUG nova.virt.libvirt.host [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.059 2 DEBUG nova.virt.libvirt.host [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.067 2 DEBUG nova.virt.libvirt.host [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.068 2 DEBUG nova.virt.libvirt.host [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.069 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.070 2 DEBUG nova.virt.hardware [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.070 2 DEBUG nova.virt.hardware [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.070 2 DEBUG nova.virt.hardware [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.071 2 DEBUG nova.virt.hardware [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.071 2 DEBUG nova.virt.hardware [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.071 2 DEBUG nova.virt.hardware [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.072 2 DEBUG nova.virt.hardware [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.072 2 DEBUG nova.virt.hardware [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.072 2 DEBUG nova.virt.hardware [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.073 2 DEBUG nova.virt.hardware [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.073 2 DEBUG nova.virt.hardware [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.077 2 DEBUG nova.virt.libvirt.vif [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-375307851',display_name='tempest-ListServerFiltersTestJSON-instance-375307851',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-375307851',id=86,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e0277f0bb0f4a349e2e6d8ddfa24edf',ramdisk_id='',reservation_id='r-t8q007lp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-298715262',owner_user_name='tempest-ListSe
rverFiltersTestJSON-298715262-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:53Z,user_data=None,user_id='001d2d51902d4e299b775131f430a5db',uuid=6f5eb8d8-6d7c-4666-ace5-49baf3909221,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "address": "fa:16:3e:ff:03:1f", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b2477aa-81", "ovs_interfaceid": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.078 2 DEBUG nova.network.os_vif_util [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Converting VIF {"id": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "address": "fa:16:3e:ff:03:1f", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b2477aa-81", "ovs_interfaceid": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.079 2 DEBUG nova.network.os_vif_util [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:03:1f,bridge_name='br-int',has_traffic_filtering=True,id=5b2477aa-8155-4265-b3e5-0fd41f6b2d83,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b2477aa-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.080 2 DEBUG nova.objects.instance [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lazy-loading 'pci_devices' on Instance uuid 6f5eb8d8-6d7c-4666-ace5-49baf3909221 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.102 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  <uuid>6f5eb8d8-6d7c-4666-ace5-49baf3909221</uuid>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  <name>instance-00000056</name>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-375307851</nova:name>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:16:59</nova:creationTime>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:16:59 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:        <nova:user uuid="001d2d51902d4e299b775131f430a5db">tempest-ListServerFiltersTestJSON-298715262-project-member</nova:user>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:        <nova:project uuid="6e0277f0bb0f4a349e2e6d8ddfa24edf">tempest-ListServerFiltersTestJSON-298715262</nova:project>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="062d9f80-76b6-42ce-bee7-0fb82a008353"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:        <nova:port uuid="5b2477aa-8155-4265-b3e5-0fd41f6b2d83">
Oct  2 08:16:59 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <entry name="serial">6f5eb8d8-6d7c-4666-ace5-49baf3909221</entry>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <entry name="uuid">6f5eb8d8-6d7c-4666-ace5-49baf3909221</entry>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.config"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:ff:03:1f"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <target dev="tap5b2477aa-81"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/console.log" append="off"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:16:59 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:16:59 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:16:59 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:16:59 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.103 2 DEBUG nova.compute.manager [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Preparing to wait for external event network-vif-plugged-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.103 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.104 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.104 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.105 2 DEBUG nova.virt.libvirt.vif [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-375307851',display_name='tempest-ListServerFiltersTestJSON-instance-375307851',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-375307851',id=86,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e0277f0bb0f4a349e2e6d8ddfa24edf',ramdisk_id='',reservation_id='r-t8q007lp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-298715262',owner_user_name='tempest-ListServerFiltersTestJSON-298715262-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:53Z,user_data=None,user_id='001d2d51902d4e299b775131f430a5db',uuid=6f5eb8d8-6d7c-4666-ace5-49baf3909221,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "address": "fa:16:3e:ff:03:1f", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b2477aa-81", "ovs_interfaceid": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.106 2 DEBUG nova.network.os_vif_util [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Converting VIF {"id": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "address": "fa:16:3e:ff:03:1f", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b2477aa-81", "ovs_interfaceid": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.108 2 DEBUG nova.network.os_vif_util [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:03:1f,bridge_name='br-int',has_traffic_filtering=True,id=5b2477aa-8155-4265-b3e5-0fd41f6b2d83,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b2477aa-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.109 2 DEBUG os_vif [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:03:1f,bridge_name='br-int',has_traffic_filtering=True,id=5b2477aa-8155-4265-b3e5-0fd41f6b2d83,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b2477aa-81') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.111 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.111 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.116 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5b2477aa-81, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.116 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5b2477aa-81, col_values=(('external_ids', {'iface-id': '5b2477aa-8155-4265-b3e5-0fd41f6b2d83', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:03:1f', 'vm-uuid': '6f5eb8d8-6d7c-4666-ace5-49baf3909221'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:59 np0005466012 NetworkManager[51207]: <info>  [1759407419.1206] manager: (tap5b2477aa-81): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.125 2 INFO os_vif [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:03:1f,bridge_name='br-int',has_traffic_filtering=True,id=5b2477aa-8155-4265-b3e5-0fd41f6b2d83,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b2477aa-81')#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.198 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.199 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.199 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] No VIF found with MAC fa:16:3e:ff:03:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:16:59 np0005466012 nova_compute[192063]: 2025-10-02 12:16:59.200 2 INFO nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Using config drive#033[00m
Oct  2 08:17:00 np0005466012 nova_compute[192063]: 2025-10-02 12:17:00.659 2 INFO nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Creating config drive at /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.config#033[00m
Oct  2 08:17:00 np0005466012 nova_compute[192063]: 2025-10-02 12:17:00.664 2 DEBUG oslo_concurrency.processutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa6fo6b_2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:00 np0005466012 nova_compute[192063]: 2025-10-02 12:17:00.792 2 DEBUG oslo_concurrency.processutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa6fo6b_2" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:00 np0005466012 kernel: tap5b2477aa-81: entered promiscuous mode
Oct  2 08:17:00 np0005466012 NetworkManager[51207]: <info>  [1759407420.8715] manager: (tap5b2477aa-81): new Tun device (/org/freedesktop/NetworkManager/Devices/138)
Oct  2 08:17:00 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:00Z|00302|binding|INFO|Claiming lport 5b2477aa-8155-4265-b3e5-0fd41f6b2d83 for this chassis.
Oct  2 08:17:00 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:00Z|00303|binding|INFO|5b2477aa-8155-4265-b3e5-0fd41f6b2d83: Claiming fa:16:3e:ff:03:1f 10.100.0.9
Oct  2 08:17:00 np0005466012 nova_compute[192063]: 2025-10-02 12:17:00.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:00 np0005466012 systemd-udevd[231717]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:17:00 np0005466012 nova_compute[192063]: 2025-10-02 12:17:00.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:00 np0005466012 NetworkManager[51207]: <info>  [1759407420.9116] device (tap5b2477aa-81): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:17:00 np0005466012 NetworkManager[51207]: <info>  [1759407420.9126] device (tap5b2477aa-81): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:17:00 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:00Z|00304|binding|INFO|Setting lport 5b2477aa-8155-4265-b3e5-0fd41f6b2d83 ovn-installed in OVS
Oct  2 08:17:00 np0005466012 nova_compute[192063]: 2025-10-02 12:17:00.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:00 np0005466012 systemd-machined[152114]: New machine qemu-36-instance-00000056.
Oct  2 08:17:00 np0005466012 podman[231694]: 2025-10-02 12:17:00.928245662 +0000 UTC m=+0.063378950 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:17:00 np0005466012 systemd[1]: Started Virtual Machine qemu-36-instance-00000056.
Oct  2 08:17:00 np0005466012 podman[231693]: 2025-10-02 12:17:00.960529142 +0000 UTC m=+0.095738422 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  2 08:17:01 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:01Z|00305|binding|INFO|Setting lport 5b2477aa-8155-4265-b3e5-0fd41f6b2d83 up in Southbound
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.268 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:03:1f 10.100.0.9'], port_security=['fa:16:3e:ff:03:1f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b4e0bc42-3cfd-4f42-a319-553606576b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a043239b-039e-45fa-8277-43e361a8bae7, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=5b2477aa-8155-4265-b3e5-0fd41f6b2d83) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.270 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 5b2477aa-8155-4265-b3e5-0fd41f6b2d83 in datapath bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 bound to our chassis#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.273 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.287 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[92b44fa2-8356-441d-b05f-90eaad4430d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.288 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbd543a6a-b1 in ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.290 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbd543a6a-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.290 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[eb593c86-9a3c-47ca-b894-d4dae82f8b9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.291 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[869a7147-8bfa-449b-8c6f-103d809a8092]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.305 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[8e36bf60-e18a-4b00-a555-ae1528e915b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 nova_compute[192063]: 2025-10-02 12:17:01.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.353 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a83e828f-14ac-4438-ba2f-9d5935a46186]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.387 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[e31a8b44-aef3-4f63-b8a1-166c76e5526c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 NetworkManager[51207]: <info>  [1759407421.3939] manager: (tapbd543a6a-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/139)
Oct  2 08:17:01 np0005466012 systemd-udevd[231733]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.393 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0c429da6-50c7-408c-b222-862a0828e8f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.427 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[fdfbda16-831e-4413-842b-d4873641469f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.430 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[d78497f2-b9b8-4c16-ab6d-578fe23f1aa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 NetworkManager[51207]: <info>  [1759407421.4592] device (tapbd543a6a-b0): carrier: link connected
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.469 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ed0faa77-3672-4694-a1c6-6bfcaf056114]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.495 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[df213f94-0aef-4082-b29d-8d996f442e17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd543a6a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:71:7a:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541507, 'reachable_time': 43043, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231776, 'error': None, 'target': 'ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.537 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2613478d-45b9-4612-a1cc-743b17b6ef4f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe71:7a4a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 541507, 'tstamp': 541507}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231778, 'error': None, 'target': 'ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.557 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7407f880-7bf4-479a-ade4-96d4e6b43a8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd543a6a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:71:7a:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541507, 'reachable_time': 43043, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231784, 'error': None, 'target': 'ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.592 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c905d8d5-6c19-48f8-bd4d-6a41e2146d8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.677 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c6cfdf34-c65e-4d70-a631-b8a819dacec3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.679 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd543a6a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.679 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.680 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd543a6a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:01 np0005466012 nova_compute[192063]: 2025-10-02 12:17:01.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:01 np0005466012 kernel: tapbd543a6a-b0: entered promiscuous mode
Oct  2 08:17:01 np0005466012 NetworkManager[51207]: <info>  [1759407421.6825] manager: (tapbd543a6a-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.685 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbd543a6a-b0, col_values=(('external_ids', {'iface-id': '1bd1cb43-f90b-4e8c-92cc-e89ec36a0b0f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:01 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:01Z|00306|binding|INFO|Releasing lport 1bd1cb43-f90b-4e8c-92cc-e89ec36a0b0f from this chassis (sb_readonly=0)
Oct  2 08:17:01 np0005466012 nova_compute[192063]: 2025-10-02 12:17:01.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.703 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.704 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fd5842bf-ba16-4dfc-8c0a-6c66843d9fed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.705 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5.pid.haproxy
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:17:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:01.707 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'env', 'PROCESS_TAG=haproxy-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.111 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407422.1104681, 6f5eb8d8-6d7c-4666-ace5-49baf3909221 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.112 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] VM Started (Lifecycle Event)#033[00m
Oct  2 08:17:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:02.128 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:02.128 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:02.129 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.165 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.168 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407422.1114233, 6f5eb8d8-6d7c-4666-ace5-49baf3909221 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.168 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.201 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.204 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.248 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:17:02 np0005466012 podman[231822]: 2025-10-02 12:17:02.527774775 +0000 UTC m=+0.032713433 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.637 2 DEBUG nova.compute.manager [req-963bf78c-09a2-49fa-be4c-3ba128c3d34f req-b285702e-b565-4235-911b-65d8022da519 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Received event network-vif-plugged-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.638 2 DEBUG oslo_concurrency.lockutils [req-963bf78c-09a2-49fa-be4c-3ba128c3d34f req-b285702e-b565-4235-911b-65d8022da519 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.638 2 DEBUG oslo_concurrency.lockutils [req-963bf78c-09a2-49fa-be4c-3ba128c3d34f req-b285702e-b565-4235-911b-65d8022da519 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.638 2 DEBUG oslo_concurrency.lockutils [req-963bf78c-09a2-49fa-be4c-3ba128c3d34f req-b285702e-b565-4235-911b-65d8022da519 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.639 2 DEBUG nova.compute.manager [req-963bf78c-09a2-49fa-be4c-3ba128c3d34f req-b285702e-b565-4235-911b-65d8022da519 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Processing event network-vif-plugged-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.639 2 DEBUG nova.compute.manager [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.642 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407422.6424842, 6f5eb8d8-6d7c-4666-ace5-49baf3909221 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.643 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.644 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.646 2 INFO nova.virt.libvirt.driver [-] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Instance spawned successfully.#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.647 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.687 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.693 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.693 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.694 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.694 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.695 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.695 2 DEBUG nova.virt.libvirt.driver [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.700 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.741 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.834 2 INFO nova.compute.manager [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Took 8.75 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.834 2 DEBUG nova.compute.manager [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.961 2 INFO nova.compute.manager [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Took 9.42 seconds to build instance.#033[00m
Oct  2 08:17:02 np0005466012 podman[231822]: 2025-10-02 12:17:02.964480815 +0000 UTC m=+0.469419453 container create 2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:17:02 np0005466012 nova_compute[192063]: 2025-10-02 12:17:02.989 2 DEBUG oslo_concurrency.lockutils [None req-23ef9032-efdb-4bc7-a518-30ba117c373e 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.527s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:03 np0005466012 systemd[1]: Started libpod-conmon-2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26.scope.
Oct  2 08:17:03 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:17:03 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a35145200f21a059117e8589ed6ea42797937a38c3a5cfa8dd8ba7bae7729ce1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:17:03 np0005466012 nova_compute[192063]: 2025-10-02 12:17:03.132 2 DEBUG nova.network.neutron [req-b00212be-71f3-4a78-8640-ab87c6627f90 req-8e80d1fb-9afb-4e83-8d54-d8633242aaa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Updated VIF entry in instance network info cache for port 5b2477aa-8155-4265-b3e5-0fd41f6b2d83. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:17:03 np0005466012 nova_compute[192063]: 2025-10-02 12:17:03.133 2 DEBUG nova.network.neutron [req-b00212be-71f3-4a78-8640-ab87c6627f90 req-8e80d1fb-9afb-4e83-8d54-d8633242aaa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Updating instance_info_cache with network_info: [{"id": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "address": "fa:16:3e:ff:03:1f", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b2477aa-81", "ovs_interfaceid": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:03 np0005466012 podman[231822]: 2025-10-02 12:17:03.158202401 +0000 UTC m=+0.663141039 container init 2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:17:03 np0005466012 nova_compute[192063]: 2025-10-02 12:17:03.161 2 DEBUG oslo_concurrency.lockutils [req-b00212be-71f3-4a78-8640-ab87c6627f90 req-8e80d1fb-9afb-4e83-8d54-d8633242aaa0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-6f5eb8d8-6d7c-4666-ace5-49baf3909221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:03 np0005466012 podman[231822]: 2025-10-02 12:17:03.16395297 +0000 UTC m=+0.668891608 container start 2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:17:03 np0005466012 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[231857]: [NOTICE]   (231861) : New worker (231863) forked
Oct  2 08:17:03 np0005466012 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[231857]: [NOTICE]   (231861) : Loading success.
Oct  2 08:17:04 np0005466012 nova_compute[192063]: 2025-10-02 12:17:04.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:05 np0005466012 nova_compute[192063]: 2025-10-02 12:17:05.173 2 DEBUG nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:17:06 np0005466012 nova_compute[192063]: 2025-10-02 12:17:06.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:07 np0005466012 nova_compute[192063]: 2025-10-02 12:17:07.292 2 DEBUG nova.compute.manager [req-0bc77d18-1de7-4c3a-9a0c-ba8519a01cdb req-313a8623-9c3d-4981-a7ae-41766bb29059 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Received event network-vif-plugged-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:07 np0005466012 nova_compute[192063]: 2025-10-02 12:17:07.292 2 DEBUG oslo_concurrency.lockutils [req-0bc77d18-1de7-4c3a-9a0c-ba8519a01cdb req-313a8623-9c3d-4981-a7ae-41766bb29059 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:07 np0005466012 nova_compute[192063]: 2025-10-02 12:17:07.293 2 DEBUG oslo_concurrency.lockutils [req-0bc77d18-1de7-4c3a-9a0c-ba8519a01cdb req-313a8623-9c3d-4981-a7ae-41766bb29059 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:07 np0005466012 nova_compute[192063]: 2025-10-02 12:17:07.293 2 DEBUG oslo_concurrency.lockutils [req-0bc77d18-1de7-4c3a-9a0c-ba8519a01cdb req-313a8623-9c3d-4981-a7ae-41766bb29059 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:07 np0005466012 nova_compute[192063]: 2025-10-02 12:17:07.293 2 DEBUG nova.compute.manager [req-0bc77d18-1de7-4c3a-9a0c-ba8519a01cdb req-313a8623-9c3d-4981-a7ae-41766bb29059 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] No waiting events found dispatching network-vif-plugged-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:07 np0005466012 nova_compute[192063]: 2025-10-02 12:17:07.294 2 WARNING nova.compute.manager [req-0bc77d18-1de7-4c3a-9a0c-ba8519a01cdb req-313a8623-9c3d-4981-a7ae-41766bb29059 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Received unexpected event network-vif-plugged-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:17:07 np0005466012 kernel: tapd0f247a6-f3 (unregistering): left promiscuous mode
Oct  2 08:17:07 np0005466012 NetworkManager[51207]: <info>  [1759407427.8238] device (tapd0f247a6-f3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:17:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:07Z|00307|binding|INFO|Releasing lport d0f247a6-f336-4b2f-a423-7b03a80a5228 from this chassis (sb_readonly=0)
Oct  2 08:17:07 np0005466012 nova_compute[192063]: 2025-10-02 12:17:07.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:07Z|00308|binding|INFO|Setting lport d0f247a6-f336-4b2f-a423-7b03a80a5228 down in Southbound
Oct  2 08:17:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:07Z|00309|binding|INFO|Removing iface tapd0f247a6-f3 ovn-installed in OVS
Oct  2 08:17:07 np0005466012 nova_compute[192063]: 2025-10-02 12:17:07.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:07 np0005466012 nova_compute[192063]: 2025-10-02 12:17:07.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:07 np0005466012 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000054.scope: Deactivated successfully.
Oct  2 08:17:07 np0005466012 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000054.scope: Consumed 15.333s CPU time.
Oct  2 08:17:07 np0005466012 systemd-machined[152114]: Machine qemu-35-instance-00000054 terminated.
Oct  2 08:17:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:08.151 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:a3:d0 10.100.0.10'], port_security=['fa:16:3e:5a:a3:d0 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d3cafb9b-5ea3-48cb-b4f5-616692db21f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '999a48d299e548dfa3ec622cf07f7017', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b61602f4-509b-434a-8fef-8040306ea771', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6778b39e-f647-49e5-a839-dee5291ea3a3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d0f247a6-f336-4b2f-a423-7b03a80a5228) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:08.154 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d0f247a6-f336-4b2f-a423-7b03a80a5228 in datapath d3cafb9b-5ea3-48cb-b4f5-616692db21f6 unbound from our chassis#033[00m
Oct  2 08:17:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:08.155 103246 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d3cafb9b-5ea3-48cb-b4f5-616692db21f6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:17:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:08.157 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[89350417-c979-411e-a37e-4ac21af563f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.188 2 INFO nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.193 2 INFO nova.virt.libvirt.driver [-] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Instance destroyed successfully.#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.194 2 DEBUG nova.objects.instance [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lazy-loading 'numa_topology' on Instance uuid c0859f36-89c4-4534-aba5-d5373464c64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.232 2 INFO nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Attempting rescue#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.233 2 DEBUG nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.236 2 DEBUG nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.237 2 INFO nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Creating image(s)#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.238 2 DEBUG oslo_concurrency.lockutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquiring lock "/var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.238 2 DEBUG oslo_concurrency.lockutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "/var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.238 2 DEBUG oslo_concurrency.lockutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "/var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.239 2 DEBUG nova.objects.instance [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lazy-loading 'trusted_certs' on Instance uuid c0859f36-89c4-4534-aba5-d5373464c64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.264 2 DEBUG oslo_concurrency.lockutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.265 2 DEBUG oslo_concurrency.lockutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.274 2 DEBUG oslo_concurrency.processutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.356 2 DEBUG oslo_concurrency.processutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.357 2 DEBUG oslo_concurrency.processutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.396 2 DEBUG oslo_concurrency.processutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.rescue" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.398 2 DEBUG oslo_concurrency.lockutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.398 2 DEBUG nova.objects.instance [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lazy-loading 'migration_context' on Instance uuid c0859f36-89c4-4534-aba5-d5373464c64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.428 2 DEBUG nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.429 2 DEBUG nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Start _get_guest_xml network_info=[{"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "vif_mac": "fa:16:3e:5a:a3:d0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.429 2 DEBUG nova.objects.instance [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lazy-loading 'resources' on Instance uuid c0859f36-89c4-4534-aba5-d5373464c64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.539 2 WARNING nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.544 2 DEBUG nova.virt.libvirt.host [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.545 2 DEBUG nova.virt.libvirt.host [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.548 2 DEBUG nova.virt.libvirt.host [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.548 2 DEBUG nova.virt.libvirt.host [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.550 2 DEBUG nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.550 2 DEBUG nova.virt.hardware [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.551 2 DEBUG nova.virt.hardware [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.551 2 DEBUG nova.virt.hardware [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.551 2 DEBUG nova.virt.hardware [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.551 2 DEBUG nova.virt.hardware [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.552 2 DEBUG nova.virt.hardware [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.552 2 DEBUG nova.virt.hardware [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.552 2 DEBUG nova.virt.hardware [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.552 2 DEBUG nova.virt.hardware [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.553 2 DEBUG nova.virt.hardware [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.553 2 DEBUG nova.virt.hardware [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.553 2 DEBUG nova.objects.instance [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lazy-loading 'vcpu_model' on Instance uuid c0859f36-89c4-4534-aba5-d5373464c64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.596 2 DEBUG nova.compute.manager [req-a65d0209-6964-4fdd-87e3-17070654adeb req-6ef25a52-dca4-427f-b389-d5b802ddb001 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-vif-unplugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.597 2 DEBUG oslo_concurrency.lockutils [req-a65d0209-6964-4fdd-87e3-17070654adeb req-6ef25a52-dca4-427f-b389-d5b802ddb001 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.597 2 DEBUG oslo_concurrency.lockutils [req-a65d0209-6964-4fdd-87e3-17070654adeb req-6ef25a52-dca4-427f-b389-d5b802ddb001 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.597 2 DEBUG oslo_concurrency.lockutils [req-a65d0209-6964-4fdd-87e3-17070654adeb req-6ef25a52-dca4-427f-b389-d5b802ddb001 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.598 2 DEBUG nova.compute.manager [req-a65d0209-6964-4fdd-87e3-17070654adeb req-6ef25a52-dca4-427f-b389-d5b802ddb001 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] No waiting events found dispatching network-vif-unplugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.598 2 WARNING nova.compute.manager [req-a65d0209-6964-4fdd-87e3-17070654adeb req-6ef25a52-dca4-427f-b389-d5b802ddb001 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received unexpected event network-vif-unplugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.605 2 DEBUG nova.virt.libvirt.vif [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-354540198',display_name='tempest-ServerRescueTestJSONUnderV235-server-354540198',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-354540198',id=84,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:16:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='999a48d299e548dfa3ec622cf07f7017',ramdisk_id='',reservation_id='r-rovb30q3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-125577537',owner_user_name='tempest-ServerRescueTestJSONUnderV235-125577537-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:16:50Z,user_data=None,user_id='ebfa5d25510d4f4f8b7c9a6cf0b8c9b1',uuid=c0859f36-89c4-4534-aba5-d5373464c64f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "vif_mac": "fa:16:3e:5a:a3:d0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.606 2 DEBUG nova.network.os_vif_util [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Converting VIF {"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "vif_mac": "fa:16:3e:5a:a3:d0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.607 2 DEBUG nova.network.os_vif_util [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5a:a3:d0,bridge_name='br-int',has_traffic_filtering=True,id=d0f247a6-f336-4b2f-a423-7b03a80a5228,network=Network(d3cafb9b-5ea3-48cb-b4f5-616692db21f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0f247a6-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.607 2 DEBUG nova.objects.instance [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lazy-loading 'pci_devices' on Instance uuid c0859f36-89c4-4534-aba5-d5373464c64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.634 2 DEBUG nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  <uuid>c0859f36-89c4-4534-aba5-d5373464c64f</uuid>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  <name>instance-00000054</name>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-354540198</nova:name>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:17:08</nova:creationTime>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:        <nova:user uuid="ebfa5d25510d4f4f8b7c9a6cf0b8c9b1">tempest-ServerRescueTestJSONUnderV235-125577537-project-member</nova:user>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:        <nova:project uuid="999a48d299e548dfa3ec622cf07f7017">tempest-ServerRescueTestJSONUnderV235-125577537</nova:project>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:        <nova:port uuid="d0f247a6-f336-4b2f-a423-7b03a80a5228">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <entry name="serial">c0859f36-89c4-4534-aba5-d5373464c64f</entry>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <entry name="uuid">c0859f36-89c4-4534-aba5-d5373464c64f</entry>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.rescue"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <target dev="vdb" bus="virtio"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.config.rescue"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:5a:a3:d0"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <target dev="tapd0f247a6-f3"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/console.log" append="off"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:17:08 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:17:08 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:17:08 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:17:08 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.649 2 INFO nova.virt.libvirt.driver [-] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Instance destroyed successfully.#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.736 2 DEBUG nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.736 2 DEBUG nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.737 2 DEBUG nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.737 2 DEBUG nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] No VIF found with MAC fa:16:3e:5a:a3:d0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.737 2 INFO nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Using config drive#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.798 2 DEBUG nova.objects.instance [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lazy-loading 'ec2_ids' on Instance uuid c0859f36-89c4-4534-aba5-d5373464c64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:08 np0005466012 nova_compute[192063]: 2025-10-02 12:17:08.864 2 DEBUG nova.objects.instance [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lazy-loading 'keypairs' on Instance uuid c0859f36-89c4-4534-aba5-d5373464c64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:09 np0005466012 nova_compute[192063]: 2025-10-02 12:17:09.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:09 np0005466012 nova_compute[192063]: 2025-10-02 12:17:09.387 2 INFO nova.virt.libvirt.driver [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Creating config drive at /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.config.rescue#033[00m
Oct  2 08:17:09 np0005466012 nova_compute[192063]: 2025-10-02 12:17:09.391 2 DEBUG oslo_concurrency.processutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz0ltoaws execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:09 np0005466012 nova_compute[192063]: 2025-10-02 12:17:09.518 2 DEBUG oslo_concurrency.processutils [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz0ltoaws" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:09 np0005466012 kernel: tapd0f247a6-f3: entered promiscuous mode
Oct  2 08:17:09 np0005466012 systemd-udevd[231875]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:17:09 np0005466012 nova_compute[192063]: 2025-10-02 12:17:09.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:09Z|00310|binding|INFO|Claiming lport d0f247a6-f336-4b2f-a423-7b03a80a5228 for this chassis.
Oct  2 08:17:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:09Z|00311|binding|INFO|d0f247a6-f336-4b2f-a423-7b03a80a5228: Claiming fa:16:3e:5a:a3:d0 10.100.0.10
Oct  2 08:17:09 np0005466012 NetworkManager[51207]: <info>  [1759407429.6145] manager: (tapd0f247a6-f3): new Tun device (/org/freedesktop/NetworkManager/Devices/141)
Oct  2 08:17:09 np0005466012 NetworkManager[51207]: <info>  [1759407429.6159] device (tapd0f247a6-f3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:17:09 np0005466012 NetworkManager[51207]: <info>  [1759407429.6174] device (tapd0f247a6-f3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:17:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:09Z|00312|binding|INFO|Setting lport d0f247a6-f336-4b2f-a423-7b03a80a5228 ovn-installed in OVS
Oct  2 08:17:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:09Z|00313|binding|INFO|Setting lport d0f247a6-f336-4b2f-a423-7b03a80a5228 up in Southbound
Oct  2 08:17:09 np0005466012 nova_compute[192063]: 2025-10-02 12:17:09.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:09.621 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:a3:d0 10.100.0.10'], port_security=['fa:16:3e:5a:a3:d0 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d3cafb9b-5ea3-48cb-b4f5-616692db21f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '999a48d299e548dfa3ec622cf07f7017', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'b61602f4-509b-434a-8fef-8040306ea771', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6778b39e-f647-49e5-a839-dee5291ea3a3, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d0f247a6-f336-4b2f-a423-7b03a80a5228) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:09.624 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d0f247a6-f336-4b2f-a423-7b03a80a5228 in datapath d3cafb9b-5ea3-48cb-b4f5-616692db21f6 bound to our chassis#033[00m
Oct  2 08:17:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:09.625 103246 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d3cafb9b-5ea3-48cb-b4f5-616692db21f6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:17:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:09.626 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dfe7a0ee-e518-4eef-aba6-21a59154c63b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:09 np0005466012 nova_compute[192063]: 2025-10-02 12:17:09.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:09 np0005466012 systemd-machined[152114]: New machine qemu-37-instance-00000054.
Oct  2 08:17:09 np0005466012 systemd[1]: Started Virtual Machine qemu-37-instance-00000054.
Oct  2 08:17:10 np0005466012 nova_compute[192063]: 2025-10-02 12:17:10.914 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for c0859f36-89c4-4534-aba5-d5373464c64f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:17:10 np0005466012 nova_compute[192063]: 2025-10-02 12:17:10.915 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407430.9138896, c0859f36-89c4-4534-aba5-d5373464c64f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:17:10 np0005466012 nova_compute[192063]: 2025-10-02 12:17:10.915 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:17:10 np0005466012 nova_compute[192063]: 2025-10-02 12:17:10.954 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:10 np0005466012 nova_compute[192063]: 2025-10-02 12:17:10.958 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.113 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.113 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407430.9175546, c0859f36-89c4-4534-aba5-d5373464c64f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.113 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] VM Started (Lifecycle Event)#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.116 2 DEBUG nova.compute.manager [None req-93ac9775-53fc-4789-bc49-88fc7536f0ee ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.199 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.204 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.666 2 DEBUG nova.compute.manager [req-672489d5-b2f5-4cfc-be43-34d106cbf405 req-fbfd54b5-e610-4e55-9151-5ce1fe88c96e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.667 2 DEBUG oslo_concurrency.lockutils [req-672489d5-b2f5-4cfc-be43-34d106cbf405 req-fbfd54b5-e610-4e55-9151-5ce1fe88c96e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.668 2 DEBUG oslo_concurrency.lockutils [req-672489d5-b2f5-4cfc-be43-34d106cbf405 req-fbfd54b5-e610-4e55-9151-5ce1fe88c96e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.668 2 DEBUG oslo_concurrency.lockutils [req-672489d5-b2f5-4cfc-be43-34d106cbf405 req-fbfd54b5-e610-4e55-9151-5ce1fe88c96e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.669 2 DEBUG nova.compute.manager [req-672489d5-b2f5-4cfc-be43-34d106cbf405 req-fbfd54b5-e610-4e55-9151-5ce1fe88c96e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] No waiting events found dispatching network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.669 2 WARNING nova.compute.manager [req-672489d5-b2f5-4cfc-be43-34d106cbf405 req-fbfd54b5-e610-4e55-9151-5ce1fe88c96e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received unexpected event network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.670 2 DEBUG nova.compute.manager [req-672489d5-b2f5-4cfc-be43-34d106cbf405 req-fbfd54b5-e610-4e55-9151-5ce1fe88c96e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.670 2 DEBUG oslo_concurrency.lockutils [req-672489d5-b2f5-4cfc-be43-34d106cbf405 req-fbfd54b5-e610-4e55-9151-5ce1fe88c96e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.671 2 DEBUG oslo_concurrency.lockutils [req-672489d5-b2f5-4cfc-be43-34d106cbf405 req-fbfd54b5-e610-4e55-9151-5ce1fe88c96e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.671 2 DEBUG oslo_concurrency.lockutils [req-672489d5-b2f5-4cfc-be43-34d106cbf405 req-fbfd54b5-e610-4e55-9151-5ce1fe88c96e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.672 2 DEBUG nova.compute.manager [req-672489d5-b2f5-4cfc-be43-34d106cbf405 req-fbfd54b5-e610-4e55-9151-5ce1fe88c96e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] No waiting events found dispatching network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:11 np0005466012 nova_compute[192063]: 2025-10-02 12:17:11.672 2 WARNING nova.compute.manager [req-672489d5-b2f5-4cfc-be43-34d106cbf405 req-fbfd54b5-e610-4e55-9151-5ce1fe88c96e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received unexpected event network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:17:13 np0005466012 nova_compute[192063]: 2025-10-02 12:17:13.885 2 DEBUG nova.compute.manager [req-5641d86a-99bb-4b49-9f88-fbdc040e430b req-44fcdfd9-f2af-44cf-bb7c-42aefa258c28 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:13 np0005466012 nova_compute[192063]: 2025-10-02 12:17:13.885 2 DEBUG oslo_concurrency.lockutils [req-5641d86a-99bb-4b49-9f88-fbdc040e430b req-44fcdfd9-f2af-44cf-bb7c-42aefa258c28 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:13 np0005466012 nova_compute[192063]: 2025-10-02 12:17:13.885 2 DEBUG oslo_concurrency.lockutils [req-5641d86a-99bb-4b49-9f88-fbdc040e430b req-44fcdfd9-f2af-44cf-bb7c-42aefa258c28 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:13 np0005466012 nova_compute[192063]: 2025-10-02 12:17:13.886 2 DEBUG oslo_concurrency.lockutils [req-5641d86a-99bb-4b49-9f88-fbdc040e430b req-44fcdfd9-f2af-44cf-bb7c-42aefa258c28 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:13 np0005466012 nova_compute[192063]: 2025-10-02 12:17:13.886 2 DEBUG nova.compute.manager [req-5641d86a-99bb-4b49-9f88-fbdc040e430b req-44fcdfd9-f2af-44cf-bb7c-42aefa258c28 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] No waiting events found dispatching network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:13 np0005466012 nova_compute[192063]: 2025-10-02 12:17:13.886 2 WARNING nova.compute.manager [req-5641d86a-99bb-4b49-9f88-fbdc040e430b req-44fcdfd9-f2af-44cf-bb7c-42aefa258c28 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received unexpected event network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:17:14 np0005466012 nova_compute[192063]: 2025-10-02 12:17:14.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:15 np0005466012 podman[231936]: 2025-10-02 12:17:15.148713243 +0000 UTC m=+0.058657069 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:17:15 np0005466012 podman[231937]: 2025-10-02 12:17:15.199634598 +0000 UTC m=+0.102824409 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:17:16 np0005466012 nova_compute[192063]: 2025-10-02 12:17:16.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:16 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:16Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:03:1f 10.100.0.9
Oct  2 08:17:16 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:16Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:03:1f 10.100.0.9
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.922 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000054', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '999a48d299e548dfa3ec622cf07f7017', 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'hostId': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.924 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000056', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'user_id': '001d2d51902d4e299b775131f430a5db', 'hostId': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.927 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c0859f36-89c4-4534-aba5-d5373464c64f / tapd0f247a6-f3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.927 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.930 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 6f5eb8d8-6d7c-4666-ace5-49baf3909221 / tap5b2477aa-81 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.931 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3175306-6abf-4183-a3a8-35c4aad8ee59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'instance-00000054-c0859f36-89c4-4534-aba5-d5373464c64f-tapd0f247a6-f3', 'timestamp': '2025-10-02T12:17:16.925277', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'tapd0f247a6-f3', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:a3:d0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd0f247a6-f3'}, 'message_id': 'bc70128c-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.601636373, 'message_signature': '291df23c04b671137ed6c6442f9bbb0ef6e2bb649b90b9fcfb88762dcf18bdec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000056-6f5eb8d8-6d7c-4666-ace5-49baf3909221-tap5b2477aa-81', 'timestamp': '2025-10-02T12:17:16.925277', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'tap5b2477aa-81', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:03:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b2477aa-81'}, 'message_id': 'bc7088fc-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.604754799, 'message_signature': '5cb13756ed9353d85ac5891988b4801c64c1d95047145f979dc273ce958d55d1'}]}, 'timestamp': '2025-10-02 12:17:16.931423', '_unique_id': '0449109d6c89435a8e2a6efe7b5d2970'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.932 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.933 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.960 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.960 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance c0859f36-89c4-4534-aba5-d5373464c64f: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.978 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/memory.usage volume: 40.421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee748d73-a9b6-48fd-a9d6-2e61098ff7e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.421875, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'timestamp': '2025-10-02T12:17:16.933827', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'bc77b8ca-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.654276706, 'message_signature': 'c3d8a75dea899507db43d9bc005d92650e0c35e3e8b2495bd328cf9a4086e572'}]}, 'timestamp': '2025-10-02 12:17:16.978567', '_unique_id': '631a2b6587d24c3384d1856338ed4bf5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.979 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.980 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.980 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.980 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98de0f09-d35c-4e26-a3b8-5768ae7958e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'instance-00000054-c0859f36-89c4-4534-aba5-d5373464c64f-tapd0f247a6-f3', 'timestamp': '2025-10-02T12:17:16.980505', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'tapd0f247a6-f3', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:a3:d0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd0f247a6-f3'}, 'message_id': 'bc781162-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.601636373, 'message_signature': '602e5a575551e37c3125cab4c08919f8185616367cefa6fa4d89dcd59f5360e1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000056-6f5eb8d8-6d7c-4666-ace5-49baf3909221-tap5b2477aa-81', 'timestamp': '2025-10-02T12:17:16.980505', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'tap5b2477aa-81', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:03:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b2477aa-81'}, 'message_id': 'bc781bee-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.604754799, 'message_signature': '2142190560cabfd4f720270d5e851ff3054ba30cfd103e8af5f14a5c69ef4ac2'}]}, 'timestamp': '2025-10-02 12:17:16.981042', '_unique_id': '13d256b276354cb8b583983e1463cde8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.981 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.982 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.982 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/cpu volume: 5840000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/cpu volume: 12550000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21a01dd5-2193-4971-8f8d-f60762a27ddd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5840000000, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'timestamp': '2025-10-02T12:17:16.982836', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'bc786c7a-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.636407482, 'message_signature': 'e5a61cbee91ac6505d8b07524004073a061a60c8faade339581ea2d6c4e11c24'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12550000000, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'timestamp': '2025-10-02T12:17:16.982836', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'bc7876ca-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.654276706, 'message_signature': 'bae22ed96b8990c38febd3e319fb3e94b7b3c5fdd3a14efaab35d78977350e10'}]}, 'timestamp': '2025-10-02 12:17:16.983363', '_unique_id': 'a475960fefdb4144b312a68d9ca77f54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.983 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:16.984 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.009 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.capacity volume: 117440512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.010 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.010 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.021 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.021 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '830fd0ce-b495-4332-b769-2c734d71174c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 117440512, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vda', 'timestamp': '2025-10-02T12:17:16.984899', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc7c8a6c-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.661281809, 'message_signature': '90b9d7423805f3d7363da3f8cfedbfbc80ae73cf095d1c94a3394ecdca0f515d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 
'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vdb', 'timestamp': '2025-10-02T12:17:16.984899', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'bc7c9d9a-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.661281809, 'message_signature': '975aeac581c5852df31e41b519c7cc61817a9d3d055ac0be0531310aa3e5b1b0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-sda', 'timestamp': '2025-10-02T12:17:16.984899', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc7ca88a-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.661281809, 'message_signature': '1f72d2728e905bbbc1071fc3b133dd411c008e6d38fe4bfc4ce507c29161697d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-vda', 'timestamp': '2025-10-02T12:17:16.984899', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc7e5662-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.687220424, 'message_signature': 'cb3909696f7c3d3f91094714c17587c4567a4b0aaf354de0eb9df9577d78b3a7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 
'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-sda', 'timestamp': '2025-10-02T12:17:16.984899', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc7e62d8-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.687220424, 'message_signature': '9b17148fded8232fc134a306f1f3b9fd2a5080eabd7cf0bee2138ae69e7b68ff'}]}, 'timestamp': '2025-10-02 12:17:17.022187', '_unique_id': 'd7fd3d8282a04f95a5ed2e272295e80a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.023 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.024 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.024 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.024 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3370bfbf-0971-451f-8bab-b18da7078937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'instance-00000054-c0859f36-89c4-4534-aba5-d5373464c64f-tapd0f247a6-f3', 'timestamp': '2025-10-02T12:17:17.024440', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'tapd0f247a6-f3', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:a3:d0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd0f247a6-f3'}, 'message_id': 'bc7ec76e-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.601636373, 'message_signature': '5cbabe053bf19b1b365aa169bdb3a2a03660af8176f0bcbe1b3cf846bdba5707'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000056-6f5eb8d8-6d7c-4666-ace5-49baf3909221-tap5b2477aa-81', 'timestamp': '2025-10-02T12:17:17.024440', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'tap5b2477aa-81', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:03:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b2477aa-81'}, 'message_id': 'bc7ed420-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.604754799, 'message_signature': 'c2b56d9059da674b77f7473237b850a6cfa4a0b6162e8e110f67989add16f695'}]}, 'timestamp': '2025-10-02 12:17:17.025083', '_unique_id': 'f0a5b657cf474da29e179378a9ba430f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.025 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.026 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.109 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.110 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.110 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.132 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.read.bytes volume: 30517760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.133 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f0f90ce-c7c1-4985-83a3-6875c4878bc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vda', 'timestamp': '2025-10-02T12:17:17.026597', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc8bd4f4-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': '5dffd6a0afbf173003c6a292cdd115bb1f342f701fe8246aefaa19017a5d6a4a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 
'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vdb', 'timestamp': '2025-10-02T12:17:17.026597', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'bc8be96c-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': '47b4e3c5081e36e508ca88857371eb7b595b6c0344ef8e9d8e5dadc15292cc2c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-sda', 'timestamp': '2025-10-02T12:17:17.026597', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc8bf696-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': '4e601685b5883b640d2b86dbd13a5b2f253e11624d04b5697d474c981d2ca2b0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30517760, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-vda', 'timestamp': '2025-10-02T12:17:17.026597', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc8f53b8-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.787626895, 'message_signature': 'f308219ec5da6572fb7904af7d95ff2357415e54e9824ff83a6da1df37abb39a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': 
None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-sda', 'timestamp': '2025-10-02T12:17:17.026597', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc8f6312-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.787626895, 'message_signature': 'f64bb890bb86da6e02feaca8a65870e2fff117b360df8c2f038d3aa356fe2705'}]}, 'timestamp': '2025-10-02 12:17:17.133643', '_unique_id': 'f1345848cd7e42af9d97440090914fbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.135 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.136 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.136 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.137 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-354540198>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-375307851>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-354540198>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-375307851>]
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.137 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.137 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/network.incoming.bytes volume: 1694 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5dd9837c-96fb-451b-a7ca-2260d12f293c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'instance-00000054-c0859f36-89c4-4534-aba5-d5373464c64f-tapd0f247a6-f3', 'timestamp': '2025-10-02T12:17:17.137447', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'tapd0f247a6-f3', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:a3:d0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd0f247a6-f3'}, 'message_id': 'bc9008da-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.601636373, 'message_signature': 'dc5532259763690b5b95b224161ed5e8d2fd8aed72beafd9f973637f354b6af3'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1694, 
'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000056-6f5eb8d8-6d7c-4666-ace5-49baf3909221-tap5b2477aa-81', 'timestamp': '2025-10-02T12:17:17.137447', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'tap5b2477aa-81', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:03:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b2477aa-81'}, 'message_id': 'bc9017b2-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.604754799, 'message_signature': '5edc80b2777bd48e48919310a18531ce45d8e425f2ab78b46fa378a7ad32b2f1'}]}, 'timestamp': '2025-10-02 12:17:17.138260', '_unique_id': '497004c5dcba49ff92fb63342d2a2ef3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.139 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.140 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.140 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.140 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/network.outgoing.bytes volume: 992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67d44be4-dce8-40b9-a629-aada7b82493e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'instance-00000054-c0859f36-89c4-4534-aba5-d5373464c64f-tapd0f247a6-f3', 'timestamp': '2025-10-02T12:17:17.140407', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'tapd0f247a6-f3', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:a3:d0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd0f247a6-f3'}, 'message_id': 'bc9079e6-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.601636373, 'message_signature': '29280ea0b06f3d78f85c1be5021614f34a1869f08ee86c48bf11edf60c60fd75'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 992, 'user_id': 
'001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000056-6f5eb8d8-6d7c-4666-ace5-49baf3909221-tap5b2477aa-81', 'timestamp': '2025-10-02T12:17:17.140407', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'tap5b2477aa-81', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:03:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b2477aa-81'}, 'message_id': 'bc908a80-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.604754799, 'message_signature': '36ec7e939c2942a6a9f2c6b1589fb959b01814c0c7d3703bb746467596f504ea'}]}, 'timestamp': '2025-10-02 12:17:17.141201', '_unique_id': '17db9a17c74d414b99ffea89fbe7ddea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.141 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.143 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.143 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c1579ad-94b3-4642-9910-86c0c9ae6b94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'instance-00000054-c0859f36-89c4-4534-aba5-d5373464c64f-tapd0f247a6-f3', 'timestamp': '2025-10-02T12:17:17.143303', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'tapd0f247a6-f3', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:a3:d0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd0f247a6-f3'}, 'message_id': 'bc90eade-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.601636373, 'message_signature': '06829755f7f5071f14b14269eebedb709c453f1539ea97f0153af8286a8d7b04'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000056-6f5eb8d8-6d7c-4666-ace5-49baf3909221-tap5b2477aa-81', 'timestamp': '2025-10-02T12:17:17.143303', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'tap5b2477aa-81', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:03:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b2477aa-81'}, 'message_id': 'bc90fc04-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.604754799, 'message_signature': '29fb71d8815965ca1e4b945108dfe86d1378fded58f05cac4c318cab9af2c16d'}]}, 'timestamp': '2025-10-02 12:17:17.144107', '_unique_id': '6204cc904b324c0bb88a47b88833dad0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.144 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.146 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.146 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.147 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.147 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.read.requests volume: 1091 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.147 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd2c7c80-d32b-4417-bdeb-5eee2f58e7b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vda', 'timestamp': '2025-10-02T12:17:17.146154', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc915a3c-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': '27141d8d42c138085354edfd698b18683f3968f694a0d3017d2ad854915106ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 
'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vdb', 'timestamp': '2025-10-02T12:17:17.146154', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'bc916c98-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': 'b26b0e36f58dc06391820415b8e2f14ad3b93bf9e4f1a57987f9644d590343f1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-sda', 'timestamp': '2025-10-02T12:17:17.146154', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc917d14-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': '056de854bf98c9fcbd449e9acd2b8b65960503f15cf5d0d6128ea3439b25bdbb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1091, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-vda', 'timestamp': '2025-10-02T12:17:17.146154', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc918b10-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.787626895, 'message_signature': 'c4042ea3c959ca3958205cf2bbd4a061921398de9274df8285625f0e04b6b208'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 
'001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-sda', 'timestamp': '2025-10-02T12:17:17.146154', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc9198f8-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.787626895, 'message_signature': 'd48791b772549eecb5344b70af9efcf10924ea614d64e56f1dee4fd2f1947ebc'}]}, 'timestamp': '2025-10-02 12:17:17.148112', '_unique_id': 'c51816692a7e4c87bf7834d4cfbe4fbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.149 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.150 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.150 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-354540198>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-375307851>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-354540198>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-375307851>]
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.151 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.151 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.151 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.152 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.152 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '213c24e7-6cb3-4851-bf62-8d94ef90d0e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vda', 'timestamp': '2025-10-02T12:17:17.151083', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc921a58-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.661281809, 'message_signature': 'd85c0592b347e4d9666c0d6768c5bb2a2a824e5a9736c421ed4462f7fbd61b2c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 
'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vdb', 'timestamp': '2025-10-02T12:17:17.151083', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'bc922c1e-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.661281809, 'message_signature': '6826eab8f514cd28c2b7e1167996d247aa837cdc2fa872beea74eaacf234507d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-sda', 'timestamp': '2025-10-02T12:17:17.151083', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc9239de-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.661281809, 'message_signature': 'e12198a9a6c0b0fd8b7ddc2385bfac730f515a049121e328de5b20127d54d793'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-vda', 'timestamp': '2025-10-02T12:17:17.151083', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc924adc-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.687220424, 'message_signature': 'eb4ade8baa6232b530b89917818c616426fec044077a19235b4aab84c8afbdb9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 
'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-sda', 'timestamp': '2025-10-02T12:17:17.151083', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc9257ca-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.687220424, 'message_signature': 'd637b76be5a34c18b74b01e5958a50c7dad2be34bdcdb8b391881669c755aecb'}]}, 'timestamp': '2025-10-02 12:17:17.152993', '_unique_id': '7cd7302ee6bd46d3a4940a17127af017'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.153 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.155 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.155 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-354540198>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-375307851>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-354540198>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-375307851>]
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.155 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.156 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23e1511a-7c74-4fb1-993f-15185dde9236', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'instance-00000054-c0859f36-89c4-4534-aba5-d5373464c64f-tapd0f247a6-f3', 'timestamp': '2025-10-02T12:17:17.155676', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'tapd0f247a6-f3', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:a3:d0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd0f247a6-f3'}, 'message_id': 'bc92cfca-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.601636373, 'message_signature': 'd1424d08df644ab44fbd76a6a8a4ad6682cc6ce1db4f48bf303e25f56a0bcfbb'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 
'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000056-6f5eb8d8-6d7c-4666-ace5-49baf3909221-tap5b2477aa-81', 'timestamp': '2025-10-02T12:17:17.155676', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'tap5b2477aa-81', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:03:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b2477aa-81'}, 'message_id': 'bc92ddd0-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.604754799, 'message_signature': '7cb5fecf6cb0ce5af5c9d1cfe6e3493b7f82452a1250ff4ba49adc8a89237255'}]}, 'timestamp': '2025-10-02 12:17:17.156437', '_unique_id': '872c1358bfff405797d873ab3258d0b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.157 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.158 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.158 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-354540198>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-375307851>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-354540198>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-375307851>]
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.159 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.read.latency volume: 536483232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.159 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.159 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.read.latency volume: 3785434 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.160 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.read.latency volume: 2098338198 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.160 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.read.latency volume: 299020226 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd48c4caf-5f74-4dc8-b146-022d3f5dce84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 536483232, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vda', 'timestamp': '2025-10-02T12:17:17.159043', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc935116-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': 'a53262c08c09234ea28be858ef46ea2561980bd866c0db934b4c8d761d3612e8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': 
None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vdb', 'timestamp': '2025-10-02T12:17:17.159043', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'bc935ecc-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': '5456c3532bc19c7858ab6ef99e1cd3f0a24beb223e0263f92bcdbd45fc61b7ab'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3785434, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-sda', 'timestamp': '2025-10-02T12:17:17.159043', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc936bf6-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': 'ad325802eab45a3a79274d419cad8383b83663b57035f829da261fc01c06c247'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2098338198, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-vda', 'timestamp': '2025-10-02T12:17:17.159043', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc9377fe-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.787626895, 'message_signature': '97490eeda6763da49f5ef01099e4f7233437d8c8c72bae5c1651f704bc70e110'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 299020226, 'user_id': 
'001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-sda', 'timestamp': '2025-10-02T12:17:17.159043', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc9383e8-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.787626895, 'message_signature': '0ab8b689bd06b1af38e1a6c5bc61b93e39ebd96505a9459fd18213476e046910'}]}, 'timestamp': '2025-10-02 12:17:17.160677', '_unique_id': '5fe100ea68e44635b735e1304a717f4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.161 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.162 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.162 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.163 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.163 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.163 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.write.latency volume: 3509664146 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.164 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '042b1229-f14c-4a48-9c8c-2c3395ea5688', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vda', 'timestamp': '2025-10-02T12:17:17.162848', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc93e5cc-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': '5c332bc44cf38f512e319f16870079aaa6ecd5faf894be4ecf2a7ffc09257507'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 
'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vdb', 'timestamp': '2025-10-02T12:17:17.162848', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'bc93f27e-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': '2a7ec400cecc30a0b83370a975c2123ff6339e13a95ce0b294f164af73e5e17d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-sda', 'timestamp': '2025-10-02T12:17:17.162848', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc940124-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': 'e951a14703754f7d52c9d4c83c97d0aba5232bfd6c2517c765f40194e7c65ca2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3509664146, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-vda', 'timestamp': '2025-10-02T12:17:17.162848', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc940da4-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.787626895, 'message_signature': '6f5e04ec4622b14580fd10ea16baf3a5224ccc190c48ea4523199d9a4d950ade'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 
'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-sda', 'timestamp': '2025-10-02T12:17:17.162848', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc941970-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.787626895, 'message_signature': '742604eb91237c518b7e0609b0e4f164e2998dc0f995f214b41ca8c10969abd5'}]}, 'timestamp': '2025-10-02 12:17:17.164504', '_unique_id': '447c26022f9449d4ac1ae826956bea1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.165 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.166 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.167 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cbe12ff-cc81-4677-97e5-869ccdbc32bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'instance-00000054-c0859f36-89c4-4534-aba5-d5373464c64f-tapd0f247a6-f3', 'timestamp': '2025-10-02T12:17:17.166788', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'tapd0f247a6-f3', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:a3:d0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd0f247a6-f3'}, 'message_id': 'bc948040-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.601636373, 'message_signature': '04d96436b6b29e211c85dd8a4b3283110ac3a4590046181db23c6e49d08617c8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000056-6f5eb8d8-6d7c-4666-ace5-49baf3909221-tap5b2477aa-81', 'timestamp': '2025-10-02T12:17:17.166788', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'tap5b2477aa-81', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:03:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b2477aa-81'}, 'message_id': 'bc948dc4-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.604754799, 'message_signature': 'f43695cddfa7f17ed8cdf1c20b679fd6e1c6b231fac84570c358691d713774fe'}]}, 'timestamp': '2025-10-02 12:17:17.167494', '_unique_id': 'd8bb19780bd24153914a6a2ded4669bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.168 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.169 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.169 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.170 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c6044f8-2f01-434b-979c-dce414f2df38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'instance-00000054-c0859f36-89c4-4534-aba5-d5373464c64f-tapd0f247a6-f3', 'timestamp': '2025-10-02T12:17:17.169603', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'tapd0f247a6-f3', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:a3:d0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd0f247a6-f3'}, 'message_id': 'bc94ef1c-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.601636373, 'message_signature': '342fb2db7339c2a4333cc91536a374122a96cf6ebcd1665e7dd480997f84de91'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000056-6f5eb8d8-6d7c-4666-ace5-49baf3909221-tap5b2477aa-81', 'timestamp': '2025-10-02T12:17:17.169603', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'tap5b2477aa-81', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:03:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b2477aa-81'}, 'message_id': 'bc94fd22-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.604754799, 'message_signature': '2d1d84f9155a29efa363564e6b9c1dfcf387e2b7c60c65b2e74fbd849c40351f'}]}, 'timestamp': '2025-10-02 12:17:17.170345', '_unique_id': '669cf095a9204bdd86b0272c49bbc2db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.171 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.172 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.172 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.172 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/network.outgoing.packets volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '094f40d9-3bc1-4206-858a-b3bb04365ef5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'instance-00000054-c0859f36-89c4-4534-aba5-d5373464c64f-tapd0f247a6-f3', 'timestamp': '2025-10-02T12:17:17.172232', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'tapd0f247a6-f3', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:a3:d0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd0f247a6-f3'}, 'message_id': 'bc955362-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.601636373, 'message_signature': '550e49befb83b253f740eddd2c3c721e0c79134a2a932493c7fc44b811ceb6ee'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 6, 
'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': 'instance-00000056-6f5eb8d8-6d7c-4666-ace5-49baf3909221-tap5b2477aa-81', 'timestamp': '2025-10-02T12:17:17.172232', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'tap5b2477aa-81', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:03:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5b2477aa-81'}, 'message_id': 'bc95673a-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.604754799, 'message_signature': '784a34138b14ac7b2998ff53b35e1ad8736ace30f4dd29dac8d31f1a6d7982f1'}]}, 'timestamp': '2025-10-02 12:17:17.173106', '_unique_id': '14d6529dabc44521990088491a30d8e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.173 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.174 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.174 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.usage volume: 196616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.175 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.175 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.176 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.176 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c2c49ac-7fdc-4093-ad6e-27b5f8d99f6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196616, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vda', 'timestamp': '2025-10-02T12:17:17.174963', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc95be24-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.661281809, 'message_signature': 'c947f5c17f30f2089d414ef460482479ca7fd692566fc23d5e23a4b781b3d8d6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 
'c0859f36-89c4-4534-aba5-d5373464c64f-vdb', 'timestamp': '2025-10-02T12:17:17.174963', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'bc95cb3a-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.661281809, 'message_signature': '51b8e199fd5962f4451edbb6560eac0a84f21cf7ded2191bc18db551e5f4ec7b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-sda', 'timestamp': '2025-10-02T12:17:17.174963', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc95d9c2-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.661281809, 'message_signature': 'e50e3ad751aba4ff5030ba65126617556282348ba50b79ba6e741f66007f2f63'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-vda', 'timestamp': '2025-10-02T12:17:17.174963', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc95e64c-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.687220424, 'message_signature': 'e6a5e427444a85c6671797d2040e8aed792b523374b90abc41bc6457472be508'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': 
'6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-sda', 'timestamp': '2025-10-02T12:17:17.174963', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc95f272-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.687220424, 'message_signature': '1bab80626b5d2badde23ea882625da46e3ed332474f1496709de10c7ebea2d50'}]}, 'timestamp': '2025-10-02 12:17:17.176651', '_unique_id': '9bc685b0fb104b81b927228ed0efa4f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.178 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.179 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.179 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.179 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.180 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.180 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.write.requests volume: 305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.181 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e32d933-403f-4d14-8a9c-8a4cd7d62e68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vda', 'timestamp': '2025-10-02T12:17:17.179376', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc966b9e-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': '48b0f9d9d84856bde6b39d8bae8b4a6b8b4043c21dcf734a842854a2c83250dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 
'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vdb', 'timestamp': '2025-10-02T12:17:17.179376', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'bc967af8-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': 'b0d203f12120c9d8635ab2114726821d30fd88e5bfc813cfeb93329e37dd9237'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-sda', 'timestamp': '2025-10-02T12:17:17.179376', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc96875a-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': 'd9f0622802edce7564d6bf12cc32786f5eb84d3f7191d41299a4b83a5693a659'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 305, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-vda', 'timestamp': '2025-10-02T12:17:17.179376', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc969394-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.787626895, 'message_signature': '13c41d6e73856ae504ead47377e319f81dbf2bacb1702d74396aebc653505064'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 
'001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-sda', 'timestamp': '2025-10-02T12:17:17.179376', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc96c3dc-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.787626895, 'message_signature': 'f23e3c99d3a17a6dcf8b0e34cf3b4d5fdf3aacf73221d48fc1e1d4924f770ffa'}]}, 'timestamp': '2025-10-02 12:17:17.182019', '_unique_id': '95bdce1bb2034e2aa956b4e219d705f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.182 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.183 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.184 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.184 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.185 12 DEBUG ceilometer.compute.pollsters [-] c0859f36-89c4-4534-aba5-d5373464c64f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.185 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.write.bytes volume: 72712192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.186 12 DEBUG ceilometer.compute.pollsters [-] 6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c96d8023-c704-436d-83f1-9162c8820b4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vda', 'timestamp': '2025-10-02T12:17:17.184019', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc97200c-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': '994a3d6a13b60fe513c3c960b023a7380d2cebde91f9277ed215f0f1372a279e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 
'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-vdb', 'timestamp': '2025-10-02T12:17:17.184019', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'bc972cf0-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': 'c9b621ce34f91b30373f6c0d7488993084bb3b939a80d051cb72a148e58c8073'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ebfa5d25510d4f4f8b7c9a6cf0b8c9b1', 'user_name': None, 'project_id': '999a48d299e548dfa3ec622cf07f7017', 'project_name': None, 'resource_id': 'c0859f36-89c4-4534-aba5-d5373464c64f-sda', 'timestamp': '2025-10-02T12:17:17.184019', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-354540198', 'name': 'instance-00000054', 'instance_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'instance_type': 'm1.nano', 'host': '900c6ec388e784c7dcb88032c41e0bcfdd2b0a5d12d71f93a2e23eef', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc975842-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.702975079, 'message_signature': 'ca77dc8aff403ce94bce1b8d95d571ef4019171b73bc6959a32332c6ea7116d1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72712192, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-vda', 'timestamp': '2025-10-02T12:17:17.184019', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc97671a-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.787626895, 'message_signature': '5372314ff94103fe1040c87246404411b853f85b83dc199d68270c26e7ce7188'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '001d2d51902d4e299b775131f430a5db', 'user_name': 
None, 'project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'project_name': None, 'resource_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221-sda', 'timestamp': '2025-10-02T12:17:17.184019', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-375307851', 'name': 'instance-00000056', 'instance_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'instance_type': 'm1.nano', 'host': '9195e2634b424931cce7f0319c6ff032bae340eec96d7f718c88c645', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '062d9f80-76b6-42ce-bee7-0fb82a008353'}, 'image_ref': '062d9f80-76b6-42ce-bee7-0fb82a008353', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc977372-9f89-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5430.787626895, 'message_signature': '0f84ebd7bc43725254cc784296d259e0efbc84da514346faad99399458982d25'}]}, 'timestamp': '2025-10-02 12:17:17.186507', '_unique_id': 'f64cedc1361f41e6ad1c2c55f85133f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:17:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:17:17.187 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:17:18 np0005466012 podman[231999]: 2025-10-02 12:17:18.158266663 +0000 UTC m=+0.066419074 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct  2 08:17:19 np0005466012 nova_compute[192063]: 2025-10-02 12:17:19.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:19 np0005466012 nova_compute[192063]: 2025-10-02 12:17:19.428 2 DEBUG nova.compute.manager [req-e1096a68-7334-4da3-abe3-d09d8d676ac6 req-ea7810ac-d03b-47fd-bd36-3a9e14b1a0bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-changed-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:19 np0005466012 nova_compute[192063]: 2025-10-02 12:17:19.428 2 DEBUG nova.compute.manager [req-e1096a68-7334-4da3-abe3-d09d8d676ac6 req-ea7810ac-d03b-47fd-bd36-3a9e14b1a0bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Refreshing instance network info cache due to event network-changed-d0f247a6-f336-4b2f-a423-7b03a80a5228. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:17:19 np0005466012 nova_compute[192063]: 2025-10-02 12:17:19.429 2 DEBUG oslo_concurrency.lockutils [req-e1096a68-7334-4da3-abe3-d09d8d676ac6 req-ea7810ac-d03b-47fd-bd36-3a9e14b1a0bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:19 np0005466012 nova_compute[192063]: 2025-10-02 12:17:19.429 2 DEBUG oslo_concurrency.lockutils [req-e1096a68-7334-4da3-abe3-d09d8d676ac6 req-ea7810ac-d03b-47fd-bd36-3a9e14b1a0bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:19 np0005466012 nova_compute[192063]: 2025-10-02 12:17:19.430 2 DEBUG nova.network.neutron [req-e1096a68-7334-4da3-abe3-d09d8d676ac6 req-ea7810ac-d03b-47fd-bd36-3a9e14b1a0bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Refreshing network info cache for port d0f247a6-f336-4b2f-a423-7b03a80a5228 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:17:19 np0005466012 nova_compute[192063]: 2025-10-02 12:17:19.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:21 np0005466012 podman[232017]: 2025-10-02 12:17:21.173952272 +0000 UTC m=+0.076724109 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Oct  2 08:17:21 np0005466012 nova_compute[192063]: 2025-10-02 12:17:21.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:21 np0005466012 nova_compute[192063]: 2025-10-02 12:17:21.651 2 DEBUG nova.compute.manager [req-292ba355-4ae0-430c-8b97-eb8f958555a6 req-a349ef56-cf02-4ee2-9aa4-2c6b2377dbfb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-changed-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:21 np0005466012 nova_compute[192063]: 2025-10-02 12:17:21.651 2 DEBUG nova.compute.manager [req-292ba355-4ae0-430c-8b97-eb8f958555a6 req-a349ef56-cf02-4ee2-9aa4-2c6b2377dbfb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Refreshing instance network info cache due to event network-changed-d0f247a6-f336-4b2f-a423-7b03a80a5228. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:17:21 np0005466012 nova_compute[192063]: 2025-10-02 12:17:21.651 2 DEBUG oslo_concurrency.lockutils [req-292ba355-4ae0-430c-8b97-eb8f958555a6 req-a349ef56-cf02-4ee2-9aa4-2c6b2377dbfb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:21 np0005466012 nova_compute[192063]: 2025-10-02 12:17:21.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:21 np0005466012 nova_compute[192063]: 2025-10-02 12:17:21.820 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:21 np0005466012 nova_compute[192063]: 2025-10-02 12:17:21.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:22 np0005466012 nova_compute[192063]: 2025-10-02 12:17:22.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:23 np0005466012 nova_compute[192063]: 2025-10-02 12:17:23.686 2 DEBUG nova.network.neutron [req-e1096a68-7334-4da3-abe3-d09d8d676ac6 req-ea7810ac-d03b-47fd-bd36-3a9e14b1a0bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updated VIF entry in instance network info cache for port d0f247a6-f336-4b2f-a423-7b03a80a5228. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:17:23 np0005466012 nova_compute[192063]: 2025-10-02 12:17:23.687 2 DEBUG nova.network.neutron [req-e1096a68-7334-4da3-abe3-d09d8d676ac6 req-ea7810ac-d03b-47fd-bd36-3a9e14b1a0bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updating instance_info_cache with network_info: [{"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:23 np0005466012 nova_compute[192063]: 2025-10-02 12:17:23.730 2 DEBUG oslo_concurrency.lockutils [req-e1096a68-7334-4da3-abe3-d09d8d676ac6 req-ea7810ac-d03b-47fd-bd36-3a9e14b1a0bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:23 np0005466012 nova_compute[192063]: 2025-10-02 12:17:23.731 2 DEBUG oslo_concurrency.lockutils [req-292ba355-4ae0-430c-8b97-eb8f958555a6 req-a349ef56-cf02-4ee2-9aa4-2c6b2377dbfb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:23 np0005466012 nova_compute[192063]: 2025-10-02 12:17:23.732 2 DEBUG nova.network.neutron [req-292ba355-4ae0-430c-8b97-eb8f958555a6 req-a349ef56-cf02-4ee2-9aa4-2c6b2377dbfb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Refreshing network info cache for port d0f247a6-f336-4b2f-a423-7b03a80a5228 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:17:23 np0005466012 nova_compute[192063]: 2025-10-02 12:17:23.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:23 np0005466012 nova_compute[192063]: 2025-10-02 12:17:23.848 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:23 np0005466012 nova_compute[192063]: 2025-10-02 12:17:23.849 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:23 np0005466012 nova_compute[192063]: 2025-10-02 12:17:23.849 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:23 np0005466012 nova_compute[192063]: 2025-10-02 12:17:23.850 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:17:23 np0005466012 nova_compute[192063]: 2025-10-02 12:17:23.958 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.023 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.rescue --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.024 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.085 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk.rescue --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.086 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.142 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.143 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.196 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.202 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.260 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.261 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.320 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.467 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.468 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5376MB free_disk=73.33003234863281GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.468 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.469 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.592 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance c0859f36-89c4-4534-aba5-d5373464c64f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.593 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 6f5eb8d8-6d7c-4666-ace5-49baf3909221 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.593 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.593 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.734 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.788 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.834 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:17:24 np0005466012 nova_compute[192063]: 2025-10-02 12:17:24.835 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:25 np0005466012 nova_compute[192063]: 2025-10-02 12:17:25.835 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:25 np0005466012 nova_compute[192063]: 2025-10-02 12:17:25.836 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:25 np0005466012 nova_compute[192063]: 2025-10-02 12:17:25.836 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:17:26 np0005466012 nova_compute[192063]: 2025-10-02 12:17:26.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:26 np0005466012 nova_compute[192063]: 2025-10-02 12:17:26.441 2 DEBUG nova.network.neutron [req-292ba355-4ae0-430c-8b97-eb8f958555a6 req-a349ef56-cf02-4ee2-9aa4-2c6b2377dbfb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updated VIF entry in instance network info cache for port d0f247a6-f336-4b2f-a423-7b03a80a5228. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:17:26 np0005466012 nova_compute[192063]: 2025-10-02 12:17:26.441 2 DEBUG nova.network.neutron [req-292ba355-4ae0-430c-8b97-eb8f958555a6 req-a349ef56-cf02-4ee2-9aa4-2c6b2377dbfb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updating instance_info_cache with network_info: [{"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:26 np0005466012 nova_compute[192063]: 2025-10-02 12:17:26.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:26 np0005466012 nova_compute[192063]: 2025-10-02 12:17:26.481 2 DEBUG oslo_concurrency.lockutils [req-292ba355-4ae0-430c-8b97-eb8f958555a6 req-a349ef56-cf02-4ee2-9aa4-2c6b2377dbfb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:26.482 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:26.483 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:17:27 np0005466012 podman[232069]: 2025-10-02 12:17:27.141077487 +0000 UTC m=+0.057152159 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_id=edpm, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git)
Oct  2 08:17:27 np0005466012 podman[232068]: 2025-10-02 12:17:27.148149362 +0000 UTC m=+0.065990542 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:17:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:28.485 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:28 np0005466012 nova_compute[192063]: 2025-10-02 12:17:28.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:28 np0005466012 NetworkManager[51207]: <info>  [1759407448.8503] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Oct  2 08:17:28 np0005466012 NetworkManager[51207]: <info>  [1759407448.8509] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Oct  2 08:17:28 np0005466012 nova_compute[192063]: 2025-10-02 12:17:28.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:28Z|00314|binding|INFO|Releasing lport 1bd1cb43-f90b-4e8c-92cc-e89ec36a0b0f from this chassis (sb_readonly=0)
Oct  2 08:17:28 np0005466012 nova_compute[192063]: 2025-10-02 12:17:28.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:29 np0005466012 nova_compute[192063]: 2025-10-02 12:17:29.103 2 DEBUG nova.compute.manager [req-69394710-26bf-4250-865c-117d2d48859d req-371a4bac-d96e-4511-a8ee-5f2051e22410 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-changed-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:29 np0005466012 nova_compute[192063]: 2025-10-02 12:17:29.104 2 DEBUG nova.compute.manager [req-69394710-26bf-4250-865c-117d2d48859d req-371a4bac-d96e-4511-a8ee-5f2051e22410 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Refreshing instance network info cache due to event network-changed-d0f247a6-f336-4b2f-a423-7b03a80a5228. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:17:29 np0005466012 nova_compute[192063]: 2025-10-02 12:17:29.104 2 DEBUG oslo_concurrency.lockutils [req-69394710-26bf-4250-865c-117d2d48859d req-371a4bac-d96e-4511-a8ee-5f2051e22410 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:29 np0005466012 nova_compute[192063]: 2025-10-02 12:17:29.104 2 DEBUG oslo_concurrency.lockutils [req-69394710-26bf-4250-865c-117d2d48859d req-371a4bac-d96e-4511-a8ee-5f2051e22410 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:29 np0005466012 nova_compute[192063]: 2025-10-02 12:17:29.104 2 DEBUG nova.network.neutron [req-69394710-26bf-4250-865c-117d2d48859d req-371a4bac-d96e-4511-a8ee-5f2051e22410 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Refreshing network info cache for port d0f247a6-f336-4b2f-a423-7b03a80a5228 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:17:29 np0005466012 nova_compute[192063]: 2025-10-02 12:17:29.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:31 np0005466012 podman[232108]: 2025-10-02 12:17:31.154777513 +0000 UTC m=+0.061670583 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:17:31 np0005466012 podman[232107]: 2025-10-02 12:17:31.158212108 +0000 UTC m=+0.068149402 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:17:31 np0005466012 nova_compute[192063]: 2025-10-02 12:17:31.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:31 np0005466012 nova_compute[192063]: 2025-10-02 12:17:31.938 2 DEBUG nova.network.neutron [req-69394710-26bf-4250-865c-117d2d48859d req-371a4bac-d96e-4511-a8ee-5f2051e22410 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updated VIF entry in instance network info cache for port d0f247a6-f336-4b2f-a423-7b03a80a5228. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:17:31 np0005466012 nova_compute[192063]: 2025-10-02 12:17:31.939 2 DEBUG nova.network.neutron [req-69394710-26bf-4250-865c-117d2d48859d req-371a4bac-d96e-4511-a8ee-5f2051e22410 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updating instance_info_cache with network_info: [{"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:33 np0005466012 nova_compute[192063]: 2025-10-02 12:17:33.471 2 DEBUG oslo_concurrency.lockutils [req-69394710-26bf-4250-865c-117d2d48859d req-371a4bac-d96e-4511-a8ee-5f2051e22410 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:33 np0005466012 nova_compute[192063]: 2025-10-02 12:17:33.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:17:33 np0005466012 nova_compute[192063]: 2025-10-02 12:17:33.824 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:17:33 np0005466012 nova_compute[192063]: 2025-10-02 12:17:33.824 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:17:34 np0005466012 nova_compute[192063]: 2025-10-02 12:17:34.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:34 np0005466012 nova_compute[192063]: 2025-10-02 12:17:34.142 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:34 np0005466012 nova_compute[192063]: 2025-10-02 12:17:34.142 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:34 np0005466012 nova_compute[192063]: 2025-10-02 12:17:34.143 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:17:34 np0005466012 nova_compute[192063]: 2025-10-02 12:17:34.143 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid c0859f36-89c4-4534-aba5-d5373464c64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:34 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:34Z|00315|binding|INFO|Releasing lport 1bd1cb43-f90b-4e8c-92cc-e89ec36a0b0f from this chassis (sb_readonly=0)
Oct  2 08:17:34 np0005466012 nova_compute[192063]: 2025-10-02 12:17:34.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:36 np0005466012 nova_compute[192063]: 2025-10-02 12:17:36.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:37 np0005466012 nova_compute[192063]: 2025-10-02 12:17:37.061 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updating instance_info_cache with network_info: [{"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:38 np0005466012 nova_compute[192063]: 2025-10-02 12:17:38.648 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:38 np0005466012 nova_compute[192063]: 2025-10-02 12:17:38.649 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:17:39 np0005466012 nova_compute[192063]: 2025-10-02 12:17:39.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:40 np0005466012 nova_compute[192063]: 2025-10-02 12:17:40.264 2 DEBUG nova.compute.manager [req-6412322b-6138-4a53-b178-81548c89eb7d req-8c867884-2ad7-4f3a-bc20-d628575a0942 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-changed-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:40 np0005466012 nova_compute[192063]: 2025-10-02 12:17:40.265 2 DEBUG nova.compute.manager [req-6412322b-6138-4a53-b178-81548c89eb7d req-8c867884-2ad7-4f3a-bc20-d628575a0942 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Refreshing instance network info cache due to event network-changed-d0f247a6-f336-4b2f-a423-7b03a80a5228. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:17:40 np0005466012 nova_compute[192063]: 2025-10-02 12:17:40.265 2 DEBUG oslo_concurrency.lockutils [req-6412322b-6138-4a53-b178-81548c89eb7d req-8c867884-2ad7-4f3a-bc20-d628575a0942 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:17:40 np0005466012 nova_compute[192063]: 2025-10-02 12:17:40.266 2 DEBUG oslo_concurrency.lockutils [req-6412322b-6138-4a53-b178-81548c89eb7d req-8c867884-2ad7-4f3a-bc20-d628575a0942 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:17:40 np0005466012 nova_compute[192063]: 2025-10-02 12:17:40.266 2 DEBUG nova.network.neutron [req-6412322b-6138-4a53-b178-81548c89eb7d req-8c867884-2ad7-4f3a-bc20-d628575a0942 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Refreshing network info cache for port d0f247a6-f336-4b2f-a423-7b03a80a5228 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:17:41 np0005466012 nova_compute[192063]: 2025-10-02 12:17:41.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:44 np0005466012 nova_compute[192063]: 2025-10-02 12:17:44.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:45 np0005466012 nova_compute[192063]: 2025-10-02 12:17:45.225 2 DEBUG nova.network.neutron [req-6412322b-6138-4a53-b178-81548c89eb7d req-8c867884-2ad7-4f3a-bc20-d628575a0942 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updated VIF entry in instance network info cache for port d0f247a6-f336-4b2f-a423-7b03a80a5228. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:17:45 np0005466012 nova_compute[192063]: 2025-10-02 12:17:45.226 2 DEBUG nova.network.neutron [req-6412322b-6138-4a53-b178-81548c89eb7d req-8c867884-2ad7-4f3a-bc20-d628575a0942 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updating instance_info_cache with network_info: [{"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:45 np0005466012 nova_compute[192063]: 2025-10-02 12:17:45.262 2 DEBUG oslo_concurrency.lockutils [req-6412322b-6138-4a53-b178-81548c89eb7d req-8c867884-2ad7-4f3a-bc20-d628575a0942 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-c0859f36-89c4-4534-aba5-d5373464c64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:17:46 np0005466012 podman[232151]: 2025-10-02 12:17:46.187493816 +0000 UTC m=+0.086308062 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:17:46 np0005466012 podman[232152]: 2025-10-02 12:17:46.240597241 +0000 UTC m=+0.137247027 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251001)
Oct  2 08:17:46 np0005466012 nova_compute[192063]: 2025-10-02 12:17:46.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:48 np0005466012 nova_compute[192063]: 2025-10-02 12:17:48.728 2 DEBUG oslo_concurrency.lockutils [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquiring lock "c0859f36-89c4-4534-aba5-d5373464c64f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:48 np0005466012 nova_compute[192063]: 2025-10-02 12:17:48.729 2 DEBUG oslo_concurrency.lockutils [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:48 np0005466012 nova_compute[192063]: 2025-10-02 12:17:48.729 2 DEBUG oslo_concurrency.lockutils [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquiring lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:48 np0005466012 nova_compute[192063]: 2025-10-02 12:17:48.729 2 DEBUG oslo_concurrency.lockutils [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:48 np0005466012 nova_compute[192063]: 2025-10-02 12:17:48.730 2 DEBUG oslo_concurrency.lockutils [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:48 np0005466012 nova_compute[192063]: 2025-10-02 12:17:48.745 2 INFO nova.compute.manager [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Terminating instance#033[00m
Oct  2 08:17:48 np0005466012 nova_compute[192063]: 2025-10-02 12:17:48.761 2 DEBUG nova.compute.manager [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:17:48 np0005466012 kernel: tapd0f247a6-f3 (unregistering): left promiscuous mode
Oct  2 08:17:48 np0005466012 NetworkManager[51207]: <info>  [1759407468.7998] device (tapd0f247a6-f3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:17:48 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:48Z|00316|binding|INFO|Releasing lport d0f247a6-f336-4b2f-a423-7b03a80a5228 from this chassis (sb_readonly=0)
Oct  2 08:17:48 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:48Z|00317|binding|INFO|Setting lport d0f247a6-f336-4b2f-a423-7b03a80a5228 down in Southbound
Oct  2 08:17:48 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:48Z|00318|binding|INFO|Removing iface tapd0f247a6-f3 ovn-installed in OVS
Oct  2 08:17:48 np0005466012 nova_compute[192063]: 2025-10-02 12:17:48.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:48 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:48.831 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:a3:d0 10.100.0.10'], port_security=['fa:16:3e:5a:a3:d0 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c0859f36-89c4-4534-aba5-d5373464c64f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d3cafb9b-5ea3-48cb-b4f5-616692db21f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '999a48d299e548dfa3ec622cf07f7017', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'b61602f4-509b-434a-8fef-8040306ea771', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6778b39e-f647-49e5-a839-dee5291ea3a3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d0f247a6-f336-4b2f-a423-7b03a80a5228) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:48 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:48.833 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d0f247a6-f336-4b2f-a423-7b03a80a5228 in datapath d3cafb9b-5ea3-48cb-b4f5-616692db21f6 unbound from our chassis#033[00m
Oct  2 08:17:48 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:48.834 103246 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d3cafb9b-5ea3-48cb-b4f5-616692db21f6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  2 08:17:48 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:48.835 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[092fa3c4-1db3-46b5-882d-25693ead0cd0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:48 np0005466012 nova_compute[192063]: 2025-10-02 12:17:48.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:48 np0005466012 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000054.scope: Deactivated successfully.
Oct  2 08:17:48 np0005466012 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000054.scope: Consumed 14.369s CPU time.
Oct  2 08:17:48 np0005466012 systemd-machined[152114]: Machine qemu-37-instance-00000054 terminated.
Oct  2 08:17:48 np0005466012 podman[232202]: 2025-10-02 12:17:48.930502412 +0000 UTC m=+0.085133161 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.069 2 INFO nova.virt.libvirt.driver [-] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Instance destroyed successfully.#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.070 2 DEBUG nova.objects.instance [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lazy-loading 'resources' on Instance uuid c0859f36-89c4-4534-aba5-d5373464c64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.096 2 DEBUG nova.virt.libvirt.vif [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-354540198',display_name='tempest-ServerRescueTestJSONUnderV235-server-354540198',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-354540198',id=84,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:17:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='999a48d299e548dfa3ec622cf07f7017',ramdisk_id='',reservation_id='r-rovb30q3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-125577537',owner_user_name='tempest-ServerRescueTestJSONUnderV235-125577537-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:17:11Z,user_data=None,user_id='ebfa5d25510d4f4f8b7c9a6cf0b8c9b1',uuid=c0859f36-89c4-4534-aba5-d5373464c64f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.097 2 DEBUG nova.network.os_vif_util [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Converting VIF {"id": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "address": "fa:16:3e:5a:a3:d0", "network": {"id": "d3cafb9b-5ea3-48cb-b4f5-616692db21f6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1899834581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "999a48d299e548dfa3ec622cf07f7017", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0f247a6-f3", "ovs_interfaceid": "d0f247a6-f336-4b2f-a423-7b03a80a5228", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.098 2 DEBUG nova.network.os_vif_util [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5a:a3:d0,bridge_name='br-int',has_traffic_filtering=True,id=d0f247a6-f336-4b2f-a423-7b03a80a5228,network=Network(d3cafb9b-5ea3-48cb-b4f5-616692db21f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0f247a6-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.099 2 DEBUG os_vif [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:a3:d0,bridge_name='br-int',has_traffic_filtering=True,id=d0f247a6-f336-4b2f-a423-7b03a80a5228,network=Network(d3cafb9b-5ea3-48cb-b4f5-616692db21f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0f247a6-f3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.101 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd0f247a6-f3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.109 2 INFO os_vif [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:a3:d0,bridge_name='br-int',has_traffic_filtering=True,id=d0f247a6-f336-4b2f-a423-7b03a80a5228,network=Network(d3cafb9b-5ea3-48cb-b4f5-616692db21f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0f247a6-f3')#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.111 2 INFO nova.virt.libvirt.driver [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Deleting instance files /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f_del#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.112 2 INFO nova.virt.libvirt.driver [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Deletion of /var/lib/nova/instances/c0859f36-89c4-4534-aba5-d5373464c64f_del complete#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.264 2 DEBUG nova.compute.manager [req-dd79ca4c-5885-4df1-bae1-c0ac746285ef req-650dfd7d-6e81-4753-8d70-7cc87f382e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-vif-unplugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.265 2 DEBUG oslo_concurrency.lockutils [req-dd79ca4c-5885-4df1-bae1-c0ac746285ef req-650dfd7d-6e81-4753-8d70-7cc87f382e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.266 2 DEBUG oslo_concurrency.lockutils [req-dd79ca4c-5885-4df1-bae1-c0ac746285ef req-650dfd7d-6e81-4753-8d70-7cc87f382e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.266 2 DEBUG oslo_concurrency.lockutils [req-dd79ca4c-5885-4df1-bae1-c0ac746285ef req-650dfd7d-6e81-4753-8d70-7cc87f382e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.267 2 DEBUG nova.compute.manager [req-dd79ca4c-5885-4df1-bae1-c0ac746285ef req-650dfd7d-6e81-4753-8d70-7cc87f382e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] No waiting events found dispatching network-vif-unplugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.267 2 DEBUG nova.compute.manager [req-dd79ca4c-5885-4df1-bae1-c0ac746285ef req-650dfd7d-6e81-4753-8d70-7cc87f382e3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-vif-unplugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.287 2 INFO nova.compute.manager [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Took 0.53 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.288 2 DEBUG oslo.service.loopingcall [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.288 2 DEBUG nova.compute.manager [-] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:17:49 np0005466012 nova_compute[192063]: 2025-10-02 12:17:49.289 2 DEBUG nova.network.neutron [-] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:17:50 np0005466012 nova_compute[192063]: 2025-10-02 12:17:50.519 2 DEBUG nova.network.neutron [-] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:50 np0005466012 nova_compute[192063]: 2025-10-02 12:17:50.544 2 INFO nova.compute.manager [-] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Took 1.26 seconds to deallocate network for instance.#033[00m
Oct  2 08:17:50 np0005466012 nova_compute[192063]: 2025-10-02 12:17:50.692 2 DEBUG oslo_concurrency.lockutils [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:50 np0005466012 nova_compute[192063]: 2025-10-02 12:17:50.693 2 DEBUG oslo_concurrency.lockutils [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:50 np0005466012 nova_compute[192063]: 2025-10-02 12:17:50.781 2 DEBUG nova.compute.manager [req-08168f51-9a37-4283-9df5-7a584cb92e03 req-bf87ce2f-eb72-4d55-ac37-5df8455f1d36 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-vif-deleted-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:50 np0005466012 nova_compute[192063]: 2025-10-02 12:17:50.819 2 DEBUG nova.compute.provider_tree [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:17:50 np0005466012 nova_compute[192063]: 2025-10-02 12:17:50.852 2 DEBUG nova.scheduler.client.report [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:17:50 np0005466012 nova_compute[192063]: 2025-10-02 12:17:50.910 2 DEBUG oslo_concurrency.lockutils [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:50 np0005466012 nova_compute[192063]: 2025-10-02 12:17:50.952 2 INFO nova.scheduler.client.report [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Deleted allocations for instance c0859f36-89c4-4534-aba5-d5373464c64f#033[00m
Oct  2 08:17:51 np0005466012 nova_compute[192063]: 2025-10-02 12:17:51.056 2 DEBUG oslo_concurrency.lockutils [None req-f2bbc096-18a6-4065-ba8d-028d9fba58f3 ebfa5d25510d4f4f8b7c9a6cf0b8c9b1 999a48d299e548dfa3ec622cf07f7017 - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:51 np0005466012 nova_compute[192063]: 2025-10-02 12:17:51.376 2 DEBUG nova.compute.manager [req-538093c9-77f0-4511-8923-55689a2b3ceb req-13a1802c-9cb8-4874-abb9-5405f5a25e51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received event network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:51 np0005466012 nova_compute[192063]: 2025-10-02 12:17:51.376 2 DEBUG oslo_concurrency.lockutils [req-538093c9-77f0-4511-8923-55689a2b3ceb req-13a1802c-9cb8-4874-abb9-5405f5a25e51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:51 np0005466012 nova_compute[192063]: 2025-10-02 12:17:51.376 2 DEBUG oslo_concurrency.lockutils [req-538093c9-77f0-4511-8923-55689a2b3ceb req-13a1802c-9cb8-4874-abb9-5405f5a25e51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:51 np0005466012 nova_compute[192063]: 2025-10-02 12:17:51.377 2 DEBUG oslo_concurrency.lockutils [req-538093c9-77f0-4511-8923-55689a2b3ceb req-13a1802c-9cb8-4874-abb9-5405f5a25e51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "c0859f36-89c4-4534-aba5-d5373464c64f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:51 np0005466012 nova_compute[192063]: 2025-10-02 12:17:51.377 2 DEBUG nova.compute.manager [req-538093c9-77f0-4511-8923-55689a2b3ceb req-13a1802c-9cb8-4874-abb9-5405f5a25e51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] No waiting events found dispatching network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:51 np0005466012 nova_compute[192063]: 2025-10-02 12:17:51.377 2 WARNING nova.compute.manager [req-538093c9-77f0-4511-8923-55689a2b3ceb req-13a1802c-9cb8-4874-abb9-5405f5a25e51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Received unexpected event network-vif-plugged-d0f247a6-f336-4b2f-a423-7b03a80a5228 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:17:51 np0005466012 nova_compute[192063]: 2025-10-02 12:17:51.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.168 2 DEBUG oslo_concurrency.lockutils [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.168 2 DEBUG oslo_concurrency.lockutils [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.169 2 DEBUG oslo_concurrency.lockutils [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.169 2 DEBUG oslo_concurrency.lockutils [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.169 2 DEBUG oslo_concurrency.lockutils [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:52 np0005466012 podman[232248]: 2025-10-02 12:17:52.193491663 +0000 UTC m=+0.083842294 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.271 2 INFO nova.compute.manager [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Terminating instance#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.290 2 DEBUG nova.compute.manager [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:17:52 np0005466012 kernel: tap5b2477aa-81 (unregistering): left promiscuous mode
Oct  2 08:17:52 np0005466012 NetworkManager[51207]: <info>  [1759407472.3298] device (tap5b2477aa-81): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:17:52 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:52Z|00319|binding|INFO|Releasing lport 5b2477aa-8155-4265-b3e5-0fd41f6b2d83 from this chassis (sb_readonly=0)
Oct  2 08:17:52 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:52Z|00320|binding|INFO|Setting lport 5b2477aa-8155-4265-b3e5-0fd41f6b2d83 down in Southbound
Oct  2 08:17:52 np0005466012 ovn_controller[94284]: 2025-10-02T12:17:52Z|00321|binding|INFO|Removing iface tap5b2477aa-81 ovn-installed in OVS
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.430 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:03:1f 10.100.0.9'], port_security=['fa:16:3e:ff:03:1f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6f5eb8d8-6d7c-4666-ace5-49baf3909221', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e0277f0bb0f4a349e2e6d8ddfa24edf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4e0bc42-3cfd-4f42-a319-553606576b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a043239b-039e-45fa-8277-43e361a8bae7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=5b2477aa-8155-4265-b3e5-0fd41f6b2d83) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.432 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 5b2477aa-8155-4265-b3e5-0fd41f6b2d83 in datapath bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 unbound from our chassis#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.433 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.434 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a5f9100b-61ad-4e92-b3f8-aac6d9c3027e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.435 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 namespace which is not needed anymore#033[00m
Oct  2 08:17:52 np0005466012 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000056.scope: Deactivated successfully.
Oct  2 08:17:52 np0005466012 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000056.scope: Consumed 16.357s CPU time.
Oct  2 08:17:52 np0005466012 systemd-machined[152114]: Machine qemu-36-instance-00000056 terminated.
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.567 2 INFO nova.virt.libvirt.driver [-] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Instance destroyed successfully.#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.568 2 DEBUG nova.objects.instance [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lazy-loading 'resources' on Instance uuid 6f5eb8d8-6d7c-4666-ace5-49baf3909221 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:17:52 np0005466012 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[231857]: [NOTICE]   (231861) : haproxy version is 2.8.14-c23fe91
Oct  2 08:17:52 np0005466012 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[231857]: [NOTICE]   (231861) : path to executable is /usr/sbin/haproxy
Oct  2 08:17:52 np0005466012 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[231857]: [WARNING]  (231861) : Exiting Master process...
Oct  2 08:17:52 np0005466012 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[231857]: [ALERT]    (231861) : Current worker (231863) exited with code 143 (Terminated)
Oct  2 08:17:52 np0005466012 neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5[231857]: [WARNING]  (231861) : All workers exited. Exiting... (0)
Oct  2 08:17:52 np0005466012 systemd[1]: libpod-2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26.scope: Deactivated successfully.
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.586 2 DEBUG nova.virt.libvirt.vif [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-375307851',display_name='tempest-ListServerFiltersTestJSON-instance-375307851',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-375307851',id=86,image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:17:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e0277f0bb0f4a349e2e6d8ddfa24edf',ramdisk_id='',reservation_id='r-t8q007lp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='062d9f80-76b6-42ce-bee7-0fb82a008353',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-298715262',owner_user_name='tempest-ListServerFiltersTestJSON-298715262-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:17:02Z,user_data=None,user_id='001d2d51902d4e299b775131f430a5db',uuid=6f5eb8d8-6d7c-4666-ace5-49baf3909221,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "address": "fa:16:3e:ff:03:1f", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b2477aa-81", "ovs_interfaceid": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.586 2 DEBUG nova.network.os_vif_util [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Converting VIF {"id": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "address": "fa:16:3e:ff:03:1f", "network": {"id": "bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-542543245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e0277f0bb0f4a349e2e6d8ddfa24edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b2477aa-81", "ovs_interfaceid": "5b2477aa-8155-4265-b3e5-0fd41f6b2d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.587 2 DEBUG nova.network.os_vif_util [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:03:1f,bridge_name='br-int',has_traffic_filtering=True,id=5b2477aa-8155-4265-b3e5-0fd41f6b2d83,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b2477aa-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.587 2 DEBUG os_vif [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:03:1f,bridge_name='br-int',has_traffic_filtering=True,id=5b2477aa-8155-4265-b3e5-0fd41f6b2d83,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b2477aa-81') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.588 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b2477aa-81, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:52 np0005466012 podman[232293]: 2025-10-02 12:17:52.592100882 +0000 UTC m=+0.056807019 container died 2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.594 2 INFO os_vif [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:03:1f,bridge_name='br-int',has_traffic_filtering=True,id=5b2477aa-8155-4265-b3e5-0fd41f6b2d83,network=Network(bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b2477aa-81')#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.594 2 INFO nova.virt.libvirt.driver [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Deleting instance files /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221_del#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.595 2 INFO nova.virt.libvirt.driver [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Deletion of /var/lib/nova/instances/6f5eb8d8-6d7c-4666-ace5-49baf3909221_del complete#033[00m
Oct  2 08:17:52 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26-userdata-shm.mount: Deactivated successfully.
Oct  2 08:17:52 np0005466012 systemd[1]: var-lib-containers-storage-overlay-a35145200f21a059117e8589ed6ea42797937a38c3a5cfa8dd8ba7bae7729ce1-merged.mount: Deactivated successfully.
Oct  2 08:17:52 np0005466012 podman[232293]: 2025-10-02 12:17:52.632325932 +0000 UTC m=+0.097032079 container cleanup 2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:17:52 np0005466012 systemd[1]: libpod-conmon-2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26.scope: Deactivated successfully.
Oct  2 08:17:52 np0005466012 podman[232331]: 2025-10-02 12:17:52.705172952 +0000 UTC m=+0.048544270 container remove 2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.712 2 INFO nova.compute.manager [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.713 2 DEBUG oslo.service.loopingcall [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.712 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[60447403-ee68-49fc-9470-b7cbd21a4584]: (4, ('Thu Oct  2 12:17:52 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 (2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26)\n2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26\nThu Oct  2 12:17:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 (2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26)\n2cdc7fbdfcc1ef6b85245c69ddc8408be7e29652cd137b3d1f7685802d346e26\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.713 2 DEBUG nova.compute.manager [-] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.714 2 DEBUG nova.network.neutron [-] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.714 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e1d307-32c1-4f5f-b2fe-5a1fb8d9cca9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.715 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd543a6a-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:52 np0005466012 kernel: tapbd543a6a-b0: left promiscuous mode
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:52 np0005466012 nova_compute[192063]: 2025-10-02 12:17:52.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.745 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e401361c-41c4-4ab8-a94e-e9832f052a6d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.776 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[83acb82f-70d6-4172-afa1-1e4c908b12bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.778 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[84559891-3ae7-4eb6-bf1d-bb576a089101]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.795 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e286819d-187c-487b-bdf3-1beab1fc72eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541498, 'reachable_time': 27824, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232346, 'error': None, 'target': 'ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.799 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bd543a6a-bba1-4bd5-9cbf-fc87bf95cbe5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:17:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:17:52.799 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[46297573-1712-4bb6-862b-3ce8b6ee4226]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:17:52 np0005466012 systemd[1]: run-netns-ovnmeta\x2dbd543a6a\x2dbba1\x2d4bd5\x2d9cbf\x2dfc87bf95cbe5.mount: Deactivated successfully.
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.631 2 DEBUG nova.compute.manager [req-d11601a3-40d9-49b7-9867-92804a60f56c req-3b328b15-9f52-4243-93d0-27da5dab5f6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Received event network-vif-unplugged-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.631 2 DEBUG oslo_concurrency.lockutils [req-d11601a3-40d9-49b7-9867-92804a60f56c req-3b328b15-9f52-4243-93d0-27da5dab5f6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.632 2 DEBUG oslo_concurrency.lockutils [req-d11601a3-40d9-49b7-9867-92804a60f56c req-3b328b15-9f52-4243-93d0-27da5dab5f6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.632 2 DEBUG oslo_concurrency.lockutils [req-d11601a3-40d9-49b7-9867-92804a60f56c req-3b328b15-9f52-4243-93d0-27da5dab5f6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.632 2 DEBUG nova.compute.manager [req-d11601a3-40d9-49b7-9867-92804a60f56c req-3b328b15-9f52-4243-93d0-27da5dab5f6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] No waiting events found dispatching network-vif-unplugged-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.633 2 DEBUG nova.compute.manager [req-d11601a3-40d9-49b7-9867-92804a60f56c req-3b328b15-9f52-4243-93d0-27da5dab5f6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Received event network-vif-unplugged-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.633 2 DEBUG nova.compute.manager [req-d11601a3-40d9-49b7-9867-92804a60f56c req-3b328b15-9f52-4243-93d0-27da5dab5f6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Received event network-vif-plugged-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.633 2 DEBUG oslo_concurrency.lockutils [req-d11601a3-40d9-49b7-9867-92804a60f56c req-3b328b15-9f52-4243-93d0-27da5dab5f6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.634 2 DEBUG oslo_concurrency.lockutils [req-d11601a3-40d9-49b7-9867-92804a60f56c req-3b328b15-9f52-4243-93d0-27da5dab5f6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.634 2 DEBUG oslo_concurrency.lockutils [req-d11601a3-40d9-49b7-9867-92804a60f56c req-3b328b15-9f52-4243-93d0-27da5dab5f6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.634 2 DEBUG nova.compute.manager [req-d11601a3-40d9-49b7-9867-92804a60f56c req-3b328b15-9f52-4243-93d0-27da5dab5f6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] No waiting events found dispatching network-vif-plugged-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.635 2 WARNING nova.compute.manager [req-d11601a3-40d9-49b7-9867-92804a60f56c req-3b328b15-9f52-4243-93d0-27da5dab5f6d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Received unexpected event network-vif-plugged-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.709 2 DEBUG nova.network.neutron [-] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.743 2 INFO nova.compute.manager [-] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Took 1.03 seconds to deallocate network for instance.#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.791 2 DEBUG nova.compute.manager [req-ebd73ce5-28c3-4989-95de-f45c10a1309e req-f02e2326-1cdb-4556-85c0-758f6ebeb146 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Received event network-vif-deleted-5b2477aa-8155-4265-b3e5-0fd41f6b2d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.873 2 DEBUG oslo_concurrency.lockutils [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.874 2 DEBUG oslo_concurrency.lockutils [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.933 2 DEBUG nova.compute.provider_tree [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:17:53 np0005466012 nova_compute[192063]: 2025-10-02 12:17:53.958 2 DEBUG nova.scheduler.client.report [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:17:54 np0005466012 nova_compute[192063]: 2025-10-02 12:17:54.007 2 DEBUG oslo_concurrency.lockutils [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:54 np0005466012 nova_compute[192063]: 2025-10-02 12:17:54.056 2 INFO nova.scheduler.client.report [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Deleted allocations for instance 6f5eb8d8-6d7c-4666-ace5-49baf3909221#033[00m
Oct  2 08:17:54 np0005466012 nova_compute[192063]: 2025-10-02 12:17:54.214 2 DEBUG oslo_concurrency.lockutils [None req-290cffe7-5c22-445b-a3e6-ec5d1e90a76a 001d2d51902d4e299b775131f430a5db 6e0277f0bb0f4a349e2e6d8ddfa24edf - - default default] Lock "6f5eb8d8-6d7c-4666-ace5-49baf3909221" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:17:54 np0005466012 nova_compute[192063]: 2025-10-02 12:17:54.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:56 np0005466012 nova_compute[192063]: 2025-10-02 12:17:56.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:56 np0005466012 nova_compute[192063]: 2025-10-02 12:17:56.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:56 np0005466012 nova_compute[192063]: 2025-10-02 12:17:56.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:57 np0005466012 nova_compute[192063]: 2025-10-02 12:17:57.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:17:58 np0005466012 podman[232349]: 2025-10-02 12:17:58.176542509 +0000 UTC m=+0.080616256 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, version=9.6, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  2 08:17:58 np0005466012 podman[232348]: 2025-10-02 12:17:58.188451767 +0000 UTC m=+0.088023509 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:18:01 np0005466012 nova_compute[192063]: 2025-10-02 12:18:01.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:02.128 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:02.129 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:02.129 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:02 np0005466012 podman[232388]: 2025-10-02 12:18:02.151432314 +0000 UTC m=+0.070024123 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:18:02 np0005466012 podman[232387]: 2025-10-02 12:18:02.151435094 +0000 UTC m=+0.063766981 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2)
Oct  2 08:18:02 np0005466012 nova_compute[192063]: 2025-10-02 12:18:02.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:04 np0005466012 nova_compute[192063]: 2025-10-02 12:18:04.067 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407469.065792, c0859f36-89c4-4534-aba5-d5373464c64f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:04 np0005466012 nova_compute[192063]: 2025-10-02 12:18:04.068 2 INFO nova.compute.manager [-] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:18:04 np0005466012 nova_compute[192063]: 2025-10-02 12:18:04.094 2 DEBUG nova.compute.manager [None req-359436d7-2c62-43ed-8625-da04e2bfef54 - - - - - -] [instance: c0859f36-89c4-4534-aba5-d5373464c64f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:06 np0005466012 nova_compute[192063]: 2025-10-02 12:18:06.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:07 np0005466012 nova_compute[192063]: 2025-10-02 12:18:07.565 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407472.5644042, 6f5eb8d8-6d7c-4666-ace5-49baf3909221 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:07 np0005466012 nova_compute[192063]: 2025-10-02 12:18:07.566 2 INFO nova.compute.manager [-] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:18:07 np0005466012 nova_compute[192063]: 2025-10-02 12:18:07.586 2 DEBUG nova.compute.manager [None req-20b801e4-3aac-4545-bb54-c3446f785ab6 - - - - - -] [instance: 6f5eb8d8-6d7c-4666-ace5-49baf3909221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:07 np0005466012 nova_compute[192063]: 2025-10-02 12:18:07.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:11 np0005466012 nova_compute[192063]: 2025-10-02 12:18:11.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:12 np0005466012 nova_compute[192063]: 2025-10-02 12:18:12.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:16 np0005466012 nova_compute[192063]: 2025-10-02 12:18:16.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:17 np0005466012 podman[232431]: 2025-10-02 12:18:17.143408649 +0000 UTC m=+0.056106222 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:18:17 np0005466012 podman[232432]: 2025-10-02 12:18:17.228973776 +0000 UTC m=+0.140638120 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:18:17 np0005466012 nova_compute[192063]: 2025-10-02 12:18:17.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:18 np0005466012 nova_compute[192063]: 2025-10-02 12:18:18.951 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquiring lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:18 np0005466012 nova_compute[192063]: 2025-10-02 12:18:18.952 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:18 np0005466012 nova_compute[192063]: 2025-10-02 12:18:18.968 2 DEBUG nova.compute.manager [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.084 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.085 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.093 2 DEBUG nova.virt.hardware [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.093 2 INFO nova.compute.claims [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:18:19 np0005466012 podman[232481]: 2025-10-02 12:18:19.148474128 +0000 UTC m=+0.061966151 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.249 2 DEBUG nova.compute.provider_tree [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.268 2 DEBUG nova.scheduler.client.report [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.293 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.293 2 DEBUG nova.compute.manager [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.306 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.307 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.328 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.381 2 DEBUG nova.compute.manager [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.382 2 DEBUG nova.network.neutron [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.423 2 INFO nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.462 2 DEBUG nova.compute.manager [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.471 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.471 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.480 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.480 2 INFO nova.compute.claims [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.750 2 DEBUG nova.compute.manager [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.752 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.753 2 INFO nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Creating image(s)#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.754 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquiring lock "/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.755 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.756 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.790 2 DEBUG oslo_concurrency.processutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.817 2 DEBUG nova.policy [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8c91fa3e559044609ddabc81368d7546', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fa03c570c52a4c2a9445090389d03c6d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.833 2 DEBUG nova.compute.provider_tree [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.861 2 DEBUG oslo_concurrency.processutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.863 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.864 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.888 2 DEBUG oslo_concurrency.processutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.914 2 DEBUG nova.scheduler.client.report [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.946 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.475s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.947 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.958 2 DEBUG oslo_concurrency.processutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:19 np0005466012 nova_compute[192063]: 2025-10-02 12:18:19.959 2 DEBUG oslo_concurrency.processutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.011 2 DEBUG oslo_concurrency.processutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.013 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.014 2 DEBUG oslo_concurrency.processutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.046 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.047 2 DEBUG nova.network.neutron [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.091 2 DEBUG oslo_concurrency.processutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.092 2 DEBUG nova.virt.disk.api [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Checking if we can resize image /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.093 2 DEBUG oslo_concurrency.processutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.164 2 DEBUG oslo_concurrency.processutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.166 2 DEBUG nova.virt.disk.api [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Cannot resize image /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.167 2 DEBUG nova.objects.instance [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lazy-loading 'migration_context' on Instance uuid fbfcbe40-57a4-4e81-a4a2-bc9c241749fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.174 2 INFO nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.421 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.421 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Ensure instance console log exists: /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.422 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.422 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.422 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.434 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.445 2 DEBUG nova.policy [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '836c60c20a0f48dd994c9d659781fc06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '49c6a5f4c4c84d7ba686d98befbc981a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:18:20 np0005466012 nova_compute[192063]: 2025-10-02 12:18:20.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.159 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.161 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.162 2 INFO nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Creating image(s)#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.163 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "/var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.164 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "/var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.165 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "/var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.196 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.293 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.295 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.296 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.320 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.362 2 DEBUG nova.network.neutron [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Successfully created port: de78e7b6-f8b9-40fb-bc85-0a8257f52c55 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.399 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.400 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.437 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.439 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.439 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.499 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.500 2 DEBUG nova.virt.disk.api [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Checking if we can resize image /var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.501 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.560 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.562 2 DEBUG nova.virt.disk.api [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Cannot resize image /var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.562 2 DEBUG nova.objects.instance [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lazy-loading 'migration_context' on Instance uuid cf2c9cb7-1ce8-47ea-baa3-5aed0229e155 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.575 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.576 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Ensure instance console log exists: /var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.576 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.577 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:21 np0005466012 nova_compute[192063]: 2025-10-02 12:18:21.577 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:22 np0005466012 nova_compute[192063]: 2025-10-02 12:18:22.099 2 DEBUG nova.network.neutron [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Successfully created port: e5fcdd20-03b8-47df-8838-af34f97abda2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:18:22 np0005466012 nova_compute[192063]: 2025-10-02 12:18:22.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:22 np0005466012 nova_compute[192063]: 2025-10-02 12:18:22.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:23 np0005466012 podman[232530]: 2025-10-02 12:18:23.161877483 +0000 UTC m=+0.071282909 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.356 2 DEBUG nova.network.neutron [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Successfully updated port: de78e7b6-f8b9-40fb-bc85-0a8257f52c55 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.413 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquiring lock "refresh_cache-fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.413 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquired lock "refresh_cache-fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.414 2 DEBUG nova.network.neutron [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.795 2 DEBUG nova.network.neutron [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Successfully updated port: e5fcdd20-03b8-47df-8838-af34f97abda2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.819 2 DEBUG nova.network.neutron [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.894 2 DEBUG nova.compute.manager [req-e24f2cee-994b-41ce-b1b2-45098c603b38 req-95f2263c-b0f0-42ac-b5ea-ae2532f10547 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received event network-changed-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.895 2 DEBUG nova.compute.manager [req-e24f2cee-994b-41ce-b1b2-45098c603b38 req-95f2263c-b0f0-42ac-b5ea-ae2532f10547 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Refreshing instance network info cache due to event network-changed-de78e7b6-f8b9-40fb-bc85-0a8257f52c55. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.895 2 DEBUG oslo_concurrency.lockutils [req-e24f2cee-994b-41ce-b1b2-45098c603b38 req-95f2263c-b0f0-42ac-b5ea-ae2532f10547 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.902 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "refresh_cache-cf2c9cb7-1ce8-47ea-baa3-5aed0229e155" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.903 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquired lock "refresh_cache-cf2c9cb7-1ce8-47ea-baa3-5aed0229e155" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.903 2 DEBUG nova.network.neutron [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:18:23 np0005466012 nova_compute[192063]: 2025-10-02 12:18:23.908 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:24 np0005466012 nova_compute[192063]: 2025-10-02 12:18:24.153 2 DEBUG nova.network.neutron [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:18:24 np0005466012 nova_compute[192063]: 2025-10-02 12:18:24.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:24 np0005466012 nova_compute[192063]: 2025-10-02 12:18:24.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:24 np0005466012 nova_compute[192063]: 2025-10-02 12:18:24.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.338 2 DEBUG nova.compute.manager [req-8d31a323-560d-4ef6-a3d9-d98725c7f878 req-5803fab9-0506-4642-8809-fd9036cacc3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Received event network-changed-e5fcdd20-03b8-47df-8838-af34f97abda2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.339 2 DEBUG nova.compute.manager [req-8d31a323-560d-4ef6-a3d9-d98725c7f878 req-5803fab9-0506-4642-8809-fd9036cacc3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Refreshing instance network info cache due to event network-changed-e5fcdd20-03b8-47df-8838-af34f97abda2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.340 2 DEBUG oslo_concurrency.lockutils [req-8d31a323-560d-4ef6-a3d9-d98725c7f878 req-5803fab9-0506-4642-8809-fd9036cacc3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-cf2c9cb7-1ce8-47ea-baa3-5aed0229e155" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.656 2 DEBUG nova.network.neutron [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Updating instance_info_cache with network_info: [{"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.676 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Releasing lock "refresh_cache-fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.676 2 DEBUG nova.compute.manager [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Instance network_info: |[{"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.676 2 DEBUG oslo_concurrency.lockutils [req-e24f2cee-994b-41ce-b1b2-45098c603b38 req-95f2263c-b0f0-42ac-b5ea-ae2532f10547 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.677 2 DEBUG nova.network.neutron [req-e24f2cee-994b-41ce-b1b2-45098c603b38 req-95f2263c-b0f0-42ac-b5ea-ae2532f10547 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Refreshing network info cache for port de78e7b6-f8b9-40fb-bc85-0a8257f52c55 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.679 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Start _get_guest_xml network_info=[{"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.683 2 WARNING nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.688 2 DEBUG nova.virt.libvirt.host [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.688 2 DEBUG nova.virt.libvirt.host [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.691 2 DEBUG nova.virt.libvirt.host [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.692 2 DEBUG nova.virt.libvirt.host [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.693 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.693 2 DEBUG nova.virt.hardware [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.694 2 DEBUG nova.virt.hardware [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.694 2 DEBUG nova.virt.hardware [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.694 2 DEBUG nova.virt.hardware [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.694 2 DEBUG nova.virt.hardware [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.694 2 DEBUG nova.virt.hardware [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.694 2 DEBUG nova.virt.hardware [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.695 2 DEBUG nova.virt.hardware [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.695 2 DEBUG nova.virt.hardware [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.695 2 DEBUG nova.virt.hardware [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.695 2 DEBUG nova.virt.hardware [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.698 2 DEBUG nova.virt.libvirt.vif [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1845813479',display_name='tempest-ServerRescueNegativeTestJSON-server-1845813479',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1845813479',id=91,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fa03c570c52a4c2a9445090389d03c6d',ramdisk_id='',reservation_id='r-n02x5bvb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1968496116',owner_user_name='temp
est-ServerRescueNegativeTestJSON-1968496116-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:19Z,user_data=None,user_id='8c91fa3e559044609ddabc81368d7546',uuid=fbfcbe40-57a4-4e81-a4a2-bc9c241749fc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.699 2 DEBUG nova.network.os_vif_util [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Converting VIF {"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.699 2 DEBUG nova.network.os_vif_util [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:12:9b,bridge_name='br-int',has_traffic_filtering=True,id=de78e7b6-f8b9-40fb-bc85-0a8257f52c55,network=Network(e895cece-6b67-405e-b05d-5b86ddbf8385),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde78e7b6-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.701 2 DEBUG nova.objects.instance [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lazy-loading 'pci_devices' on Instance uuid fbfcbe40-57a4-4e81-a4a2-bc9c241749fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.713 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  <uuid>fbfcbe40-57a4-4e81-a4a2-bc9c241749fc</uuid>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  <name>instance-0000005b</name>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1845813479</nova:name>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:18:25</nova:creationTime>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:18:25 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:        <nova:user uuid="8c91fa3e559044609ddabc81368d7546">tempest-ServerRescueNegativeTestJSON-1968496116-project-member</nova:user>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:        <nova:project uuid="fa03c570c52a4c2a9445090389d03c6d">tempest-ServerRescueNegativeTestJSON-1968496116</nova:project>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:        <nova:port uuid="de78e7b6-f8b9-40fb-bc85-0a8257f52c55">
Oct  2 08:18:25 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <entry name="serial">fbfcbe40-57a4-4e81-a4a2-bc9c241749fc</entry>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <entry name="uuid">fbfcbe40-57a4-4e81-a4a2-bc9c241749fc</entry>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.config"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:49:12:9b"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <target dev="tapde78e7b6-f8"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/console.log" append="off"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:18:25 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:18:25 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:18:25 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:18:25 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.714 2 DEBUG nova.compute.manager [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Preparing to wait for external event network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.715 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquiring lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.715 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.715 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.716 2 DEBUG nova.virt.libvirt.vif [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1845813479',display_name='tempest-ServerRescueNegativeTestJSON-server-1845813479',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1845813479',id=91,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fa03c570c52a4c2a9445090389d03c6d',ramdisk_id='',reservation_id='r-n02x5bvb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1968496116',owner_user_
name='tempest-ServerRescueNegativeTestJSON-1968496116-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:19Z,user_data=None,user_id='8c91fa3e559044609ddabc81368d7546',uuid=fbfcbe40-57a4-4e81-a4a2-bc9c241749fc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.716 2 DEBUG nova.network.os_vif_util [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Converting VIF {"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.716 2 DEBUG nova.network.os_vif_util [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:12:9b,bridge_name='br-int',has_traffic_filtering=True,id=de78e7b6-f8b9-40fb-bc85-0a8257f52c55,network=Network(e895cece-6b67-405e-b05d-5b86ddbf8385),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde78e7b6-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.717 2 DEBUG os_vif [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:12:9b,bridge_name='br-int',has_traffic_filtering=True,id=de78e7b6-f8b9-40fb-bc85-0a8257f52c55,network=Network(e895cece-6b67-405e-b05d-5b86ddbf8385),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde78e7b6-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.718 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.718 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.721 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde78e7b6-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.721 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapde78e7b6-f8, col_values=(('external_ids', {'iface-id': 'de78e7b6-f8b9-40fb-bc85-0a8257f52c55', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:49:12:9b', 'vm-uuid': 'fbfcbe40-57a4-4e81-a4a2-bc9c241749fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:25 np0005466012 NetworkManager[51207]: <info>  [1759407505.7238] manager: (tapde78e7b6-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.730 2 INFO os_vif [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:12:9b,bridge_name='br-int',has_traffic_filtering=True,id=de78e7b6-f8b9-40fb-bc85-0a8257f52c55,network=Network(e895cece-6b67-405e-b05d-5b86ddbf8385),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde78e7b6-f8')#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.788 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.790 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.790 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] No VIF found with MAC fa:16:3e:49:12:9b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.791 2 INFO nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Using config drive#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.841 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.841 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.841 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.842 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.895 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.987 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:25 np0005466012 nova_compute[192063]: 2025-10-02 12:18:25.988 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.055 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.056 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-0000005b, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.config'#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.207 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.208 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5734MB free_disk=73.38771438598633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.208 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.209 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.225 2 DEBUG nova.network.neutron [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Updating instance_info_cache with network_info: [{"id": "e5fcdd20-03b8-47df-8838-af34f97abda2", "address": "fa:16:3e:60:da:29", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5fcdd20-03", "ovs_interfaceid": "e5fcdd20-03b8-47df-8838-af34f97abda2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.262 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Releasing lock "refresh_cache-cf2c9cb7-1ce8-47ea-baa3-5aed0229e155" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.262 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Instance network_info: |[{"id": "e5fcdd20-03b8-47df-8838-af34f97abda2", "address": "fa:16:3e:60:da:29", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5fcdd20-03", "ovs_interfaceid": "e5fcdd20-03b8-47df-8838-af34f97abda2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.262 2 DEBUG oslo_concurrency.lockutils [req-8d31a323-560d-4ef6-a3d9-d98725c7f878 req-5803fab9-0506-4642-8809-fd9036cacc3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-cf2c9cb7-1ce8-47ea-baa3-5aed0229e155" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.263 2 DEBUG nova.network.neutron [req-8d31a323-560d-4ef6-a3d9-d98725c7f878 req-5803fab9-0506-4642-8809-fd9036cacc3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Refreshing network info cache for port e5fcdd20-03b8-47df-8838-af34f97abda2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.265 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Start _get_guest_xml network_info=[{"id": "e5fcdd20-03b8-47df-8838-af34f97abda2", "address": "fa:16:3e:60:da:29", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5fcdd20-03", "ovs_interfaceid": "e5fcdd20-03b8-47df-8838-af34f97abda2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.269 2 WARNING nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.393 2 DEBUG nova.virt.libvirt.host [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.394 2 DEBUG nova.virt.libvirt.host [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.400 2 DEBUG nova.virt.libvirt.host [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.401 2 DEBUG nova.virt.libvirt.host [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.401 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.402 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.402 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.402 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.402 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.403 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.403 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.403 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.403 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.403 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.404 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.404 2 DEBUG nova.virt.hardware [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.407 2 DEBUG nova.virt.libvirt.vif [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-636246405',display_name='tempest-ListServersNegativeTestJSON-server-636246405-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-636246405-1',id=92,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='49c6a5f4c4c84d7ba686d98befbc981a',ramdisk_id='',reservation_id='r-pj0lxy21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1724341867',owner_user_name='tempest
-ListServersNegativeTestJSON-1724341867-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:20Z,user_data=None,user_id='836c60c20a0f48dd994c9d659781fc06',uuid=cf2c9cb7-1ce8-47ea-baa3-5aed0229e155,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e5fcdd20-03b8-47df-8838-af34f97abda2", "address": "fa:16:3e:60:da:29", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5fcdd20-03", "ovs_interfaceid": "e5fcdd20-03b8-47df-8838-af34f97abda2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.407 2 DEBUG nova.network.os_vif_util [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Converting VIF {"id": "e5fcdd20-03b8-47df-8838-af34f97abda2", "address": "fa:16:3e:60:da:29", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5fcdd20-03", "ovs_interfaceid": "e5fcdd20-03b8-47df-8838-af34f97abda2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.407 2 DEBUG nova.network.os_vif_util [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:da:29,bridge_name='br-int',has_traffic_filtering=True,id=e5fcdd20-03b8-47df-8838-af34f97abda2,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5fcdd20-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.408 2 DEBUG nova.objects.instance [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lazy-loading 'pci_devices' on Instance uuid cf2c9cb7-1ce8-47ea-baa3-5aed0229e155 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.426 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  <uuid>cf2c9cb7-1ce8-47ea-baa3-5aed0229e155</uuid>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  <name>instance-0000005c</name>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <nova:name>tempest-ListServersNegativeTestJSON-server-636246405-1</nova:name>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:18:26</nova:creationTime>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:18:26 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:        <nova:user uuid="836c60c20a0f48dd994c9d659781fc06">tempest-ListServersNegativeTestJSON-1724341867-project-member</nova:user>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:        <nova:project uuid="49c6a5f4c4c84d7ba686d98befbc981a">tempest-ListServersNegativeTestJSON-1724341867</nova:project>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:        <nova:port uuid="e5fcdd20-03b8-47df-8838-af34f97abda2">
Oct  2 08:18:26 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <entry name="serial">cf2c9cb7-1ce8-47ea-baa3-5aed0229e155</entry>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <entry name="uuid">cf2c9cb7-1ce8-47ea-baa3-5aed0229e155</entry>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk.config"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:60:da:29"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <target dev="tape5fcdd20-03"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/console.log" append="off"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:18:26 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:18:26 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:18:26 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:18:26 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.427 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Preparing to wait for external event network-vif-plugged-e5fcdd20-03b8-47df-8838-af34f97abda2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.427 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.427 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.427 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.428 2 DEBUG nova.virt.libvirt.vif [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-636246405',display_name='tempest-ListServersNegativeTestJSON-server-636246405-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-636246405-1',id=92,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='49c6a5f4c4c84d7ba686d98befbc981a',ramdisk_id='',reservation_id='r-pj0lxy21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1724341867',owner_user_name='tempest-ListServersNegativeTestJSON-1724341867-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:20Z,user_data=None,user_id='836c60c20a0f48dd994c9d659781fc06',uuid=cf2c9cb7-1ce8-47ea-baa3-5aed0229e155,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e5fcdd20-03b8-47df-8838-af34f97abda2", "address": "fa:16:3e:60:da:29", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5fcdd20-03", "ovs_interfaceid": "e5fcdd20-03b8-47df-8838-af34f97abda2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.428 2 DEBUG nova.network.os_vif_util [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Converting VIF {"id": "e5fcdd20-03b8-47df-8838-af34f97abda2", "address": "fa:16:3e:60:da:29", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5fcdd20-03", "ovs_interfaceid": "e5fcdd20-03b8-47df-8838-af34f97abda2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.429 2 DEBUG nova.network.os_vif_util [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:da:29,bridge_name='br-int',has_traffic_filtering=True,id=e5fcdd20-03b8-47df-8838-af34f97abda2,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5fcdd20-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.429 2 DEBUG os_vif [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:da:29,bridge_name='br-int',has_traffic_filtering=True,id=e5fcdd20-03b8-47df-8838-af34f97abda2,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5fcdd20-03') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.430 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.430 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.433 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape5fcdd20-03, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.434 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape5fcdd20-03, col_values=(('external_ids', {'iface-id': 'e5fcdd20-03b8-47df-8838-af34f97abda2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:da:29', 'vm-uuid': 'cf2c9cb7-1ce8-47ea-baa3-5aed0229e155'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:26 np0005466012 NetworkManager[51207]: <info>  [1759407506.4364] manager: (tape5fcdd20-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.448 2 INFO os_vif [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:da:29,bridge_name='br-int',has_traffic_filtering=True,id=e5fcdd20-03b8-47df-8838-af34f97abda2,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5fcdd20-03')#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.457 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance fbfcbe40-57a4-4e81-a4a2-bc9c241749fc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.458 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance cf2c9cb7-1ce8-47ea-baa3-5aed0229e155 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.458 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.458 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.498 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.499 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.499 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] No VIF found with MAC fa:16:3e:60:da:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.499 2 INFO nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Using config drive#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.547 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.561 2 INFO nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Creating config drive at /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.config#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.567 2 DEBUG oslo_concurrency.processutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaq8b3p3a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.598 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.637 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.638 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.429s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.703 2 DEBUG oslo_concurrency.processutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaq8b3p3a" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:26 np0005466012 NetworkManager[51207]: <info>  [1759407506.7950] manager: (tapde78e7b6-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/146)
Oct  2 08:18:26 np0005466012 kernel: tapde78e7b6-f8: entered promiscuous mode
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:26 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:26Z|00322|binding|INFO|Claiming lport de78e7b6-f8b9-40fb-bc85-0a8257f52c55 for this chassis.
Oct  2 08:18:26 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:26Z|00323|binding|INFO|de78e7b6-f8b9-40fb-bc85-0a8257f52c55: Claiming fa:16:3e:49:12:9b 10.100.0.12
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.826 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:12:9b 10.100.0.12'], port_security=['fa:16:3e:49:12:9b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fbfcbe40-57a4-4e81-a4a2-bc9c241749fc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e895cece-6b67-405e-b05d-5b86ddbf8385', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa03c570c52a4c2a9445090389d03c6d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '86713f8f-e4ad-44d5-8c6e-92e3b3c5f67c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=42f687d5-26a0-4ae5-91cd-f49120fff442, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=de78e7b6-f8b9-40fb-bc85-0a8257f52c55) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.829 103246 INFO neutron.agent.ovn.metadata.agent [-] Port de78e7b6-f8b9-40fb-bc85-0a8257f52c55 in datapath e895cece-6b67-405e-b05d-5b86ddbf8385 bound to our chassis#033[00m
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.831 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e895cece-6b67-405e-b05d-5b86ddbf8385#033[00m
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.848 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf42917-8af3-4ad1-be99-5a30dc58b582]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.849 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape895cece-61 in ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.854 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape895cece-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.854 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3acad1be-5cbe-4a44-84c5-95e6f9d1d286]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.855 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[79c05508-3728-4e08-b5e1-ae08beda74cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466012 systemd-udevd[232581]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.868 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[59982036-94bb-46ed-8133-dbd1e377c958]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466012 systemd-machined[152114]: New machine qemu-38-instance-0000005b.
Oct  2 08:18:26 np0005466012 NetworkManager[51207]: <info>  [1759407506.8826] device (tapde78e7b6-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:18:26 np0005466012 NetworkManager[51207]: <info>  [1759407506.8850] device (tapde78e7b6-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.892 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3dce2284-8d56-44f9-807d-cf2679d20f2f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466012 systemd[1]: Started Virtual Machine qemu-38-instance-0000005b.
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:26 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:26Z|00324|binding|INFO|Setting lport de78e7b6-f8b9-40fb-bc85-0a8257f52c55 ovn-installed in OVS
Oct  2 08:18:26 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:26Z|00325|binding|INFO|Setting lport de78e7b6-f8b9-40fb-bc85-0a8257f52c55 up in Southbound
Oct  2 08:18:26 np0005466012 nova_compute[192063]: 2025-10-02 12:18:26.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.942 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2a7c8d-b56a-40ee-8767-186fde26f22e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.949 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[08d11c58-dc27-489e-85b3-bd9ffc3f54fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466012 NetworkManager[51207]: <info>  [1759407506.9504] manager: (tape895cece-60): new Veth device (/org/freedesktop/NetworkManager/Devices/147)
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.988 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[7502b2d3-f346-4c12-8323-358bd1494d86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:26.992 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5929a8-4407-4e6e-9608-1b98016b98c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:27 np0005466012 NetworkManager[51207]: <info>  [1759407507.0225] device (tape895cece-60): carrier: link connected
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.032 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[637bc27d-314f-4934-aa96-000a0e06550c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.060 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a6c1df79-34fb-49f5-9423-7116a149d9e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape895cece-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:96:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550063, 'reachable_time': 22372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232618, 'error': None, 'target': 'ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.087 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f02104d8-570b-43dc-84f1-56356c7835d5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe03:9629'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550063, 'tstamp': 550063}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232619, 'error': None, 'target': 'ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.119 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6407ede6-7b1b-4c9b-a4cc-20444a3fb557]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape895cece-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:96:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550063, 'reachable_time': 22372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232620, 'error': None, 'target': 'ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.170 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0c250e7a-1093-443f-ab74-64305c28971c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.268 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dc8be2b5-d511-4c0e-bcd9-624204c1dc5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.270 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape895cece-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.270 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.270 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape895cece-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:27 np0005466012 nova_compute[192063]: 2025-10-02 12:18:27.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:27 np0005466012 NetworkManager[51207]: <info>  [1759407507.2732] manager: (tape895cece-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Oct  2 08:18:27 np0005466012 kernel: tape895cece-60: entered promiscuous mode
Oct  2 08:18:27 np0005466012 nova_compute[192063]: 2025-10-02 12:18:27.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.280 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape895cece-60, col_values=(('external_ids', {'iface-id': '893d58a9-c253-4923-8cf4-03927d247550'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:27Z|00326|binding|INFO|Releasing lport 893d58a9-c253-4923-8cf4-03927d247550 from this chassis (sb_readonly=0)
Oct  2 08:18:27 np0005466012 nova_compute[192063]: 2025-10-02 12:18:27.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.311 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e895cece-6b67-405e-b05d-5b86ddbf8385.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e895cece-6b67-405e-b05d-5b86ddbf8385.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:18:27 np0005466012 nova_compute[192063]: 2025-10-02 12:18:27.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.313 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6115896a-d270-4bd9-8672-524ff6d0b82e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.314 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-e895cece-6b67-405e-b05d-5b86ddbf8385
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/e895cece-6b67-405e-b05d-5b86ddbf8385.pid.haproxy
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID e895cece-6b67-405e-b05d-5b86ddbf8385
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.315 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385', 'env', 'PROCESS_TAG=haproxy-e895cece-6b67-405e-b05d-5b86ddbf8385', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e895cece-6b67-405e-b05d-5b86ddbf8385.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:18:27 np0005466012 podman[232664]: 2025-10-02 12:18:27.735289251 +0000 UTC m=+0.052862059 container create 686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:18:27 np0005466012 nova_compute[192063]: 2025-10-02 12:18:27.768 2 INFO nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Creating config drive at /var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk.config#033[00m
Oct  2 08:18:27 np0005466012 nova_compute[192063]: 2025-10-02 12:18:27.775 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplimwv0u3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:27 np0005466012 systemd[1]: Started libpod-conmon-686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba.scope.
Oct  2 08:18:27 np0005466012 podman[232664]: 2025-10-02 12:18:27.710672484 +0000 UTC m=+0.028245322 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:18:27 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:18:27 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5d44a44f0f2df1f0e5ceb9c237f2913bbe1f20a7060057840f0c424f7ad0591/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:18:27 np0005466012 podman[232664]: 2025-10-02 12:18:27.849937584 +0000 UTC m=+0.167510392 container init 686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:18:27 np0005466012 podman[232664]: 2025-10-02 12:18:27.855111922 +0000 UTC m=+0.172684730 container start 686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:18:27 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[232680]: [NOTICE]   (232686) : New worker (232688) forked
Oct  2 08:18:27 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[232680]: [NOTICE]   (232686) : Loading success.
Oct  2 08:18:27 np0005466012 nova_compute[192063]: 2025-10-02 12:18:27.901 2 DEBUG oslo_concurrency.processutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplimwv0u3" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:27 np0005466012 kernel: tape5fcdd20-03: entered promiscuous mode
Oct  2 08:18:27 np0005466012 NetworkManager[51207]: <info>  [1759407507.9640] manager: (tape5fcdd20-03): new Tun device (/org/freedesktop/NetworkManager/Devices/149)
Oct  2 08:18:27 np0005466012 nova_compute[192063]: 2025-10-02 12:18:27.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:27 np0005466012 systemd-udevd[232602]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:18:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:27Z|00327|binding|INFO|Claiming lport e5fcdd20-03b8-47df-8838-af34f97abda2 for this chassis.
Oct  2 08:18:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:27Z|00328|binding|INFO|e5fcdd20-03b8-47df-8838-af34f97abda2: Claiming fa:16:3e:60:da:29 10.100.0.6
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.982 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:da:29 10.100.0.6'], port_security=['fa:16:3e:60:da:29 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'cf2c9cb7-1ce8-47ea-baa3-5aed0229e155', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '49c6a5f4c4c84d7ba686d98befbc981a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '55283a5f-31d5-4a4d-bc9f-4b8e3fc9f6b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2cc3415d-eee4-499b-a06c-93196fe04768, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=e5fcdd20-03b8-47df-8838-af34f97abda2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.983 103246 INFO neutron.agent.ovn.metadata.agent [-] Port e5fcdd20-03b8-47df-8838-af34f97abda2 in datapath 5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 bound to our chassis#033[00m
Oct  2 08:18:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:27.984 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5b886deb-ac8b-4d5e-a6d4-b19699c6ae92#033[00m
Oct  2 08:18:27 np0005466012 NetworkManager[51207]: <info>  [1759407507.9806] device (tape5fcdd20-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:18:27 np0005466012 NetworkManager[51207]: <info>  [1759407507.9876] device (tape5fcdd20-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:18:27 np0005466012 systemd-machined[152114]: New machine qemu-39-instance-0000005c.
Oct  2 08:18:27 np0005466012 nova_compute[192063]: 2025-10-02 12:18:27.997 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407507.9968057, fbfcbe40-57a4-4e81-a4a2-bc9c241749fc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:27 np0005466012 nova_compute[192063]: 2025-10-02 12:18:27.998 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] VM Started (Lifecycle Event)#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.002 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cb367138-9ba3-49c4-8a65-6615978d4e95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.003 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5b886deb-a1 in ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.006 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5b886deb-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.006 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2af8a8d5-18a4-4061-a6eb-fd93a98e1137]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.007 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f7caf7b9-48b7-4499-a087-708208c893be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.018 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a67b18-bae2-4237-b827-8f7fd1c4c967]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:28 np0005466012 systemd[1]: Started Virtual Machine qemu-39-instance-0000005c.
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.027 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:28Z|00329|binding|INFO|Setting lport e5fcdd20-03b8-47df-8838-af34f97abda2 ovn-installed in OVS
Oct  2 08:18:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:28Z|00330|binding|INFO|Setting lport e5fcdd20-03b8-47df-8838-af34f97abda2 up in Southbound
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.031 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407507.9969358, fbfcbe40-57a4-4e81-a4a2-bc9c241749fc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.032 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.032 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b4033613-e36a-4493-b52a-e8925b6b7500]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.058 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.058 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[15ed7ead-0f2a-4518-b9c8-de4aca384906]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.064 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.063 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[94d47e97-8b5c-4cae-8ac8-6e88285f318e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 NetworkManager[51207]: <info>  [1759407508.0685] manager: (tap5b886deb-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/150)
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.086 2 DEBUG nova.compute.manager [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received event network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.087 2 DEBUG oslo_concurrency.lockutils [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.087 2 DEBUG oslo_concurrency.lockutils [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.087 2 DEBUG oslo_concurrency.lockutils [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.087 2 DEBUG nova.compute.manager [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Processing event network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.088 2 DEBUG nova.compute.manager [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received event network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.088 2 DEBUG oslo_concurrency.lockutils [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.088 2 DEBUG oslo_concurrency.lockutils [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.088 2 DEBUG oslo_concurrency.lockutils [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.088 2 DEBUG nova.compute.manager [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] No waiting events found dispatching network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.089 2 WARNING nova.compute.manager [req-2822cb03-7e74-47af-832d-f193cc2ece88 req-343c8372-e3da-48fd-a43a-9e835098b4a0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received unexpected event network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.090 2 DEBUG nova.compute.manager [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.090 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.095 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[93641ab9-68fd-4af4-9de1-d168ea3c67ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.097 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407508.0971382, fbfcbe40-57a4-4e81-a4a2-bc9c241749fc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.097 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.098 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[9a38bc3c-14e0-44d8-8bb4-b456f63b79da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.100 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.103 2 INFO nova.virt.libvirt.driver [-] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Instance spawned successfully.#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.104 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:18:28 np0005466012 NetworkManager[51207]: <info>  [1759407508.1197] device (tap5b886deb-a0): carrier: link connected
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.124 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[f4282d3a-971d-4c35-8ec5-e55ad1de2adc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.128 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.135 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.138 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.139 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.139 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.140 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.140 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.141 2 DEBUG nova.virt.libvirt.driver [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.140 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[25766006-8bec-4e8d-8fc1-902b0e84f067]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5b886deb-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:39:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550173, 'reachable_time': 29133, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232729, 'error': None, 'target': 'ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.155 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ea580bcd-9685-409e-9ce8-6d127a519072]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe77:39f1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550173, 'tstamp': 550173}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232730, 'error': None, 'target': 'ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.171 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[15742574-1f7f-4988-b64d-d6acbd2d2ecf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5b886deb-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:39:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550173, 'reachable_time': 29133, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232731, 'error': None, 'target': 'ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.183 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.201 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a740fb59-df15-472c-a460-90a20edd27db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.256 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d345429e-51ca-498b-829e-59d259724649]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.258 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b886deb-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.258 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.259 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5b886deb-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:28 np0005466012 NetworkManager[51207]: <info>  [1759407508.2645] manager: (tap5b886deb-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Oct  2 08:18:28 np0005466012 kernel: tap5b886deb-a0: entered promiscuous mode
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.266 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5b886deb-a0, col_values=(('external_ids', {'iface-id': '444f6470-b3a4-44de-9f71-88b373acc28c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:28Z|00331|binding|INFO|Releasing lport 444f6470-b3a4-44de-9f71-88b373acc28c from this chassis (sb_readonly=0)
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.269 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5b886deb-ac8b-4d5e-a6d4-b19699c6ae92.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5b886deb-ac8b-4d5e-a6d4-b19699c6ae92.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.270 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d7122de4-ae24-472a-afd9-fc2033caffe9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.271 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/5b886deb-ac8b-4d5e-a6d4-b19699c6ae92.pid.haproxy
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 5b886deb-ac8b-4d5e-a6d4-b19699c6ae92
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:18:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:28.271 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'env', 'PROCESS_TAG=haproxy-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5b886deb-ac8b-4d5e-a6d4-b19699c6ae92.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.318 2 INFO nova.compute.manager [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Took 8.57 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.319 2 DEBUG nova.compute.manager [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.426 2 INFO nova.compute.manager [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Took 9.38 seconds to build instance.#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.459 2 DEBUG oslo_concurrency.lockutils [None req-42f3ebee-d862-4148-b776-775209f60b1c 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.507s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.643 2 DEBUG nova.network.neutron [req-e24f2cee-994b-41ce-b1b2-45098c603b38 req-95f2263c-b0f0-42ac-b5ea-ae2532f10547 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Updated VIF entry in instance network info cache for port de78e7b6-f8b9-40fb-bc85-0a8257f52c55. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.644 2 DEBUG nova.network.neutron [req-e24f2cee-994b-41ce-b1b2-45098c603b38 req-95f2263c-b0f0-42ac-b5ea-ae2532f10547 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Updating instance_info_cache with network_info: [{"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:28 np0005466012 podman[232768]: 2025-10-02 12:18:28.66416235 +0000 UTC m=+0.055942418 container create 68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.684 2 DEBUG oslo_concurrency.lockutils [req-e24f2cee-994b-41ce-b1b2-45098c603b38 req-95f2263c-b0f0-42ac-b5ea-ae2532f10547 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:28 np0005466012 systemd[1]: Started libpod-conmon-68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611.scope.
Oct  2 08:18:28 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:18:28 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/751deef98dbbeeaed27916f22007874b8bee101e95a0f636f2108305fda6a4d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:18:28 np0005466012 podman[232768]: 2025-10-02 12:18:28.632962124 +0000 UTC m=+0.024742222 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:18:28 np0005466012 podman[232768]: 2025-10-02 12:18:28.729574779 +0000 UTC m=+0.121354857 container init 68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:18:28 np0005466012 podman[232768]: 2025-10-02 12:18:28.736153388 +0000 UTC m=+0.127933456 container start 68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:18:28 np0005466012 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[232786]: [NOTICE]   (232815) : New worker (232828) forked
Oct  2 08:18:28 np0005466012 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[232786]: [NOTICE]   (232815) : Loading success.
Oct  2 08:18:28 np0005466012 podman[232781]: 2025-10-02 12:18:28.768295641 +0000 UTC m=+0.069794545 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:18:28 np0005466012 podman[232785]: 2025-10-02 12:18:28.788224574 +0000 UTC m=+0.087781383 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6)
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.826 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407508.8257124, cf2c9cb7-1ce8-47ea-baa3-5aed0229e155 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.826 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] VM Started (Lifecycle Event)#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.854 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.857 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407508.8275287, cf2c9cb7-1ce8-47ea-baa3-5aed0229e155 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.857 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.891 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.893 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:28 np0005466012 nova_compute[192063]: 2025-10-02 12:18:28.911 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:18:29 np0005466012 nova_compute[192063]: 2025-10-02 12:18:29.224 2 DEBUG nova.network.neutron [req-8d31a323-560d-4ef6-a3d9-d98725c7f878 req-5803fab9-0506-4642-8809-fd9036cacc3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Updated VIF entry in instance network info cache for port e5fcdd20-03b8-47df-8838-af34f97abda2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:18:29 np0005466012 nova_compute[192063]: 2025-10-02 12:18:29.225 2 DEBUG nova.network.neutron [req-8d31a323-560d-4ef6-a3d9-d98725c7f878 req-5803fab9-0506-4642-8809-fd9036cacc3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Updating instance_info_cache with network_info: [{"id": "e5fcdd20-03b8-47df-8838-af34f97abda2", "address": "fa:16:3e:60:da:29", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5fcdd20-03", "ovs_interfaceid": "e5fcdd20-03b8-47df-8838-af34f97abda2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:29 np0005466012 nova_compute[192063]: 2025-10-02 12:18:29.279 2 DEBUG oslo_concurrency.lockutils [req-8d31a323-560d-4ef6-a3d9-d98725c7f878 req-5803fab9-0506-4642-8809-fd9036cacc3c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-cf2c9cb7-1ce8-47ea-baa3-5aed0229e155" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.261 2 DEBUG nova.compute.manager [req-106c5a64-3f20-433a-b1ba-079d0661853f req-9f5c2829-5676-4f5f-8d45-63bb2b30650b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Received event network-vif-plugged-e5fcdd20-03b8-47df-8838-af34f97abda2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.262 2 DEBUG oslo_concurrency.lockutils [req-106c5a64-3f20-433a-b1ba-079d0661853f req-9f5c2829-5676-4f5f-8d45-63bb2b30650b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.262 2 DEBUG oslo_concurrency.lockutils [req-106c5a64-3f20-433a-b1ba-079d0661853f req-9f5c2829-5676-4f5f-8d45-63bb2b30650b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.262 2 DEBUG oslo_concurrency.lockutils [req-106c5a64-3f20-433a-b1ba-079d0661853f req-9f5c2829-5676-4f5f-8d45-63bb2b30650b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.262 2 DEBUG nova.compute.manager [req-106c5a64-3f20-433a-b1ba-079d0661853f req-9f5c2829-5676-4f5f-8d45-63bb2b30650b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Processing event network-vif-plugged-e5fcdd20-03b8-47df-8838-af34f97abda2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.263 2 DEBUG nova.compute.manager [req-106c5a64-3f20-433a-b1ba-079d0661853f req-9f5c2829-5676-4f5f-8d45-63bb2b30650b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Received event network-vif-plugged-e5fcdd20-03b8-47df-8838-af34f97abda2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.263 2 DEBUG oslo_concurrency.lockutils [req-106c5a64-3f20-433a-b1ba-079d0661853f req-9f5c2829-5676-4f5f-8d45-63bb2b30650b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.263 2 DEBUG oslo_concurrency.lockutils [req-106c5a64-3f20-433a-b1ba-079d0661853f req-9f5c2829-5676-4f5f-8d45-63bb2b30650b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.263 2 DEBUG oslo_concurrency.lockutils [req-106c5a64-3f20-433a-b1ba-079d0661853f req-9f5c2829-5676-4f5f-8d45-63bb2b30650b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.263 2 DEBUG nova.compute.manager [req-106c5a64-3f20-433a-b1ba-079d0661853f req-9f5c2829-5676-4f5f-8d45-63bb2b30650b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] No waiting events found dispatching network-vif-plugged-e5fcdd20-03b8-47df-8838-af34f97abda2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.263 2 WARNING nova.compute.manager [req-106c5a64-3f20-433a-b1ba-079d0661853f req-9f5c2829-5676-4f5f-8d45-63bb2b30650b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Received unexpected event network-vif-plugged-e5fcdd20-03b8-47df-8838-af34f97abda2 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.264 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.279 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.280 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407510.2790291, cf2c9cb7-1ce8-47ea-baa3-5aed0229e155 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.280 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.286 2 INFO nova.virt.libvirt.driver [-] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Instance spawned successfully.#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.286 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.339 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.346 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.349 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.350 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.350 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.350 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.351 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.351 2 DEBUG nova.virt.libvirt.driver [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.413 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.497 2 INFO nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Took 9.34 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.497 2 DEBUG nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.782 2 INFO nova.compute.manager [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Took 11.35 seconds to build instance.#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.810 2 DEBUG oslo_concurrency.lockutils [None req-8e065fde-4167-4cc4-96d0-33bdccc52d90 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.504s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.812 2 INFO nova.compute.manager [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Rescuing#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.812 2 DEBUG oslo_concurrency.lockutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquiring lock "refresh_cache-fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.813 2 DEBUG oslo_concurrency.lockutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquired lock "refresh_cache-fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:18:30 np0005466012 nova_compute[192063]: 2025-10-02 12:18:30.813 2 DEBUG nova.network.neutron [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:18:31 np0005466012 nova_compute[192063]: 2025-10-02 12:18:31.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:31 np0005466012 nova_compute[192063]: 2025-10-02 12:18:31.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:32 np0005466012 nova_compute[192063]: 2025-10-02 12:18:32.093 2 DEBUG nova.network.neutron [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Updating instance_info_cache with network_info: [{"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:32 np0005466012 nova_compute[192063]: 2025-10-02 12:18:32.120 2 DEBUG oslo_concurrency.lockutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Releasing lock "refresh_cache-fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:18:32 np0005466012 nova_compute[192063]: 2025-10-02 12:18:32.683 2 DEBUG nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:18:33 np0005466012 podman[232837]: 2025-10-02 12:18:33.134420105 +0000 UTC m=+0.051529350 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:18:33 np0005466012 podman[232838]: 2025-10-02 12:18:33.160474493 +0000 UTC m=+0.067194401 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.568 2 DEBUG oslo_concurrency.lockutils [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.568 2 DEBUG oslo_concurrency.lockutils [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.569 2 DEBUG oslo_concurrency.lockutils [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.569 2 DEBUG oslo_concurrency.lockutils [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.569 2 DEBUG oslo_concurrency.lockutils [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.579 2 INFO nova.compute.manager [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Terminating instance#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.590 2 DEBUG nova.compute.manager [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:18:33 np0005466012 kernel: tape5fcdd20-03 (unregistering): left promiscuous mode
Oct  2 08:18:33 np0005466012 NetworkManager[51207]: <info>  [1759407513.6100] device (tape5fcdd20-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:33 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:33Z|00332|binding|INFO|Releasing lport e5fcdd20-03b8-47df-8838-af34f97abda2 from this chassis (sb_readonly=0)
Oct  2 08:18:33 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:33Z|00333|binding|INFO|Setting lport e5fcdd20-03b8-47df-8838-af34f97abda2 down in Southbound
Oct  2 08:18:33 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:33Z|00334|binding|INFO|Removing iface tape5fcdd20-03 ovn-installed in OVS
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.629 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:da:29 10.100.0.6'], port_security=['fa:16:3e:60:da:29 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'cf2c9cb7-1ce8-47ea-baa3-5aed0229e155', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '49c6a5f4c4c84d7ba686d98befbc981a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '55283a5f-31d5-4a4d-bc9f-4b8e3fc9f6b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2cc3415d-eee4-499b-a06c-93196fe04768, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=e5fcdd20-03b8-47df-8838-af34f97abda2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.630 103246 INFO neutron.agent.ovn.metadata.agent [-] Port e5fcdd20-03b8-47df-8838-af34f97abda2 in datapath 5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 unbound from our chassis#033[00m
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.631 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.632 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[97b9cba5-5485-4161-8a72-94d3d97e1fd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.632 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 namespace which is not needed anymore#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:33 np0005466012 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Oct  2 08:18:33 np0005466012 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000005c.scope: Consumed 4.025s CPU time.
Oct  2 08:18:33 np0005466012 systemd-machined[152114]: Machine qemu-39-instance-0000005c terminated.
Oct  2 08:18:33 np0005466012 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[232786]: [NOTICE]   (232815) : haproxy version is 2.8.14-c23fe91
Oct  2 08:18:33 np0005466012 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[232786]: [NOTICE]   (232815) : path to executable is /usr/sbin/haproxy
Oct  2 08:18:33 np0005466012 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[232786]: [WARNING]  (232815) : Exiting Master process...
Oct  2 08:18:33 np0005466012 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[232786]: [ALERT]    (232815) : Current worker (232828) exited with code 143 (Terminated)
Oct  2 08:18:33 np0005466012 neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92[232786]: [WARNING]  (232815) : All workers exited. Exiting... (0)
Oct  2 08:18:33 np0005466012 systemd[1]: libpod-68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611.scope: Deactivated successfully.
Oct  2 08:18:33 np0005466012 podman[232903]: 2025-10-02 12:18:33.756087821 +0000 UTC m=+0.046733284 container died 68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:18:33 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611-userdata-shm.mount: Deactivated successfully.
Oct  2 08:18:33 np0005466012 systemd[1]: var-lib-containers-storage-overlay-751deef98dbbeeaed27916f22007874b8bee101e95a0f636f2108305fda6a4d2-merged.mount: Deactivated successfully.
Oct  2 08:18:33 np0005466012 podman[232903]: 2025-10-02 12:18:33.795348299 +0000 UTC m=+0.085993762 container cleanup 68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:18:33 np0005466012 systemd[1]: libpod-conmon-68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611.scope: Deactivated successfully.
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.860 2 INFO nova.virt.libvirt.driver [-] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Instance destroyed successfully.#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.864 2 DEBUG nova.objects.instance [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lazy-loading 'resources' on Instance uuid cf2c9cb7-1ce8-47ea-baa3-5aed0229e155 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:33 np0005466012 podman[232935]: 2025-10-02 12:18:33.879888457 +0000 UTC m=+0.054835046 container remove 68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.887 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6473062d-6cf8-4d89-8018-551e1e68f523]: (4, ('Thu Oct  2 12:18:33 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 (68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611)\n68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611\nThu Oct  2 12:18:33 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 (68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611)\n68bb5a542efc3eb21a3caabf6e828940f34ae5fa092db88e8ff7efc839288611\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.888 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1636daa8-5c17-4f1b-848f-dad693956801]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.889 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b886deb-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.890 2 DEBUG nova.virt.libvirt.vif [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-636246405',display_name='tempest-ListServersNegativeTestJSON-server-636246405-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-636246405-1',id=92,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:18:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='49c6a5f4c4c84d7ba686d98befbc981a',ramdisk_id='',reservation_id='r-pj0lxy21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1724341867',owner_user_name='tempest-ListServersNegativeTestJSON-1724341867-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:18:30Z,user_data=None,user_id='836c60c20a0f48dd994c9d659781fc06',uuid=cf2c9cb7-1ce8-47ea-baa3-5aed0229e155,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e5fcdd20-03b8-47df-8838-af34f97abda2", "address": "fa:16:3e:60:da:29", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5fcdd20-03", "ovs_interfaceid": "e5fcdd20-03b8-47df-8838-af34f97abda2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:18:33 np0005466012 kernel: tap5b886deb-a0: left promiscuous mode
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.891 2 DEBUG nova.network.os_vif_util [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Converting VIF {"id": "e5fcdd20-03b8-47df-8838-af34f97abda2", "address": "fa:16:3e:60:da:29", "network": {"id": "5b886deb-ac8b-4d5e-a6d4-b19699c6ae92", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1740420896-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49c6a5f4c4c84d7ba686d98befbc981a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5fcdd20-03", "ovs_interfaceid": "e5fcdd20-03b8-47df-8838-af34f97abda2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.894 2 DEBUG nova.network.os_vif_util [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:da:29,bridge_name='br-int',has_traffic_filtering=True,id=e5fcdd20-03b8-47df-8838-af34f97abda2,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5fcdd20-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.895 2 DEBUG os_vif [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:da:29,bridge_name='br-int',has_traffic_filtering=True,id=e5fcdd20-03b8-47df-8838-af34f97abda2,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5fcdd20-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.897 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5fcdd20-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.913 2 INFO os_vif [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:da:29,bridge_name='br-int',has_traffic_filtering=True,id=e5fcdd20-03b8-47df-8838-af34f97abda2,network=Network(5b886deb-ac8b-4d5e-a6d4-b19699c6ae92),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5fcdd20-03')#033[00m
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.912 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[029f2431-4b56-4f6e-847c-928fedea61aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.914 2 INFO nova.virt.libvirt.driver [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Deleting instance files /var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155_del#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.915 2 INFO nova.virt.libvirt.driver [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Deletion of /var/lib/nova/instances/cf2c9cb7-1ce8-47ea-baa3-5aed0229e155_del complete#033[00m
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.942 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[843503e8-53b7-4724-811d-d6fbc594d2b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.943 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[61ba17e2-d647-4026-86d4-37689548abca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.944 2 DEBUG nova.compute.manager [req-f42a8e50-caa9-44aa-bed0-ff11c836fb3b req-160c0a14-7a47-42e9-ad40-97cfa84354ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Received event network-vif-unplugged-e5fcdd20-03b8-47df-8838-af34f97abda2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.944 2 DEBUG oslo_concurrency.lockutils [req-f42a8e50-caa9-44aa-bed0-ff11c836fb3b req-160c0a14-7a47-42e9-ad40-97cfa84354ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.945 2 DEBUG oslo_concurrency.lockutils [req-f42a8e50-caa9-44aa-bed0-ff11c836fb3b req-160c0a14-7a47-42e9-ad40-97cfa84354ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.945 2 DEBUG oslo_concurrency.lockutils [req-f42a8e50-caa9-44aa-bed0-ff11c836fb3b req-160c0a14-7a47-42e9-ad40-97cfa84354ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.945 2 DEBUG nova.compute.manager [req-f42a8e50-caa9-44aa-bed0-ff11c836fb3b req-160c0a14-7a47-42e9-ad40-97cfa84354ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] No waiting events found dispatching network-vif-unplugged-e5fcdd20-03b8-47df-8838-af34f97abda2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.945 2 DEBUG nova.compute.manager [req-f42a8e50-caa9-44aa-bed0-ff11c836fb3b req-160c0a14-7a47-42e9-ad40-97cfa84354ae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Received event network-vif-unplugged-e5fcdd20-03b8-47df-8838-af34f97abda2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.959 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8789f47f-836b-4207-987b-3f1f95cdf000]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550166, 'reachable_time': 43961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232965, 'error': None, 'target': 'ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466012 systemd[1]: run-netns-ovnmeta\x2d5b886deb\x2dac8b\x2d4d5e\x2da6d4\x2db19699c6ae92.mount: Deactivated successfully.
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.964 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5b886deb-ac8b-4d5e-a6d4-b19699c6ae92 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:18:33 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:33.965 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[c2c5f9e9-bd37-4c2f-b1c7-71e7380ef123]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.994 2 INFO nova.compute.manager [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.995 2 DEBUG oslo.service.loopingcall [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.995 2 DEBUG nova.compute.manager [-] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:18:33 np0005466012 nova_compute[192063]: 2025-10-02 12:18:33.995 2 DEBUG nova.network.neutron [-] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:18:35 np0005466012 nova_compute[192063]: 2025-10-02 12:18:35.024 2 DEBUG nova.network.neutron [-] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:35 np0005466012 nova_compute[192063]: 2025-10-02 12:18:35.043 2 INFO nova.compute.manager [-] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Took 1.05 seconds to deallocate network for instance.#033[00m
Oct  2 08:18:35 np0005466012 nova_compute[192063]: 2025-10-02 12:18:35.149 2 DEBUG oslo_concurrency.lockutils [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:35 np0005466012 nova_compute[192063]: 2025-10-02 12:18:35.150 2 DEBUG oslo_concurrency.lockutils [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:35 np0005466012 nova_compute[192063]: 2025-10-02 12:18:35.222 2 DEBUG nova.compute.provider_tree [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:35 np0005466012 nova_compute[192063]: 2025-10-02 12:18:35.239 2 DEBUG nova.scheduler.client.report [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:35 np0005466012 nova_compute[192063]: 2025-10-02 12:18:35.265 2 DEBUG oslo_concurrency.lockutils [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:35 np0005466012 nova_compute[192063]: 2025-10-02 12:18:35.296 2 INFO nova.scheduler.client.report [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Deleted allocations for instance cf2c9cb7-1ce8-47ea-baa3-5aed0229e155#033[00m
Oct  2 08:18:35 np0005466012 nova_compute[192063]: 2025-10-02 12:18:35.392 2 DEBUG oslo_concurrency.lockutils [None req-8162c0bb-b9ac-423b-8bc6-3914cac93f67 836c60c20a0f48dd994c9d659781fc06 49c6a5f4c4c84d7ba686d98befbc981a - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.824s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:36 np0005466012 nova_compute[192063]: 2025-10-02 12:18:36.025 2 DEBUG nova.compute.manager [req-93c65feb-acc0-4da9-9d49-4e1e9afb8c81 req-468d3235-adbb-4c06-8489-09ce2bfd031f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Received event network-vif-plugged-e5fcdd20-03b8-47df-8838-af34f97abda2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:36 np0005466012 nova_compute[192063]: 2025-10-02 12:18:36.026 2 DEBUG oslo_concurrency.lockutils [req-93c65feb-acc0-4da9-9d49-4e1e9afb8c81 req-468d3235-adbb-4c06-8489-09ce2bfd031f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:36 np0005466012 nova_compute[192063]: 2025-10-02 12:18:36.026 2 DEBUG oslo_concurrency.lockutils [req-93c65feb-acc0-4da9-9d49-4e1e9afb8c81 req-468d3235-adbb-4c06-8489-09ce2bfd031f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:36 np0005466012 nova_compute[192063]: 2025-10-02 12:18:36.027 2 DEBUG oslo_concurrency.lockutils [req-93c65feb-acc0-4da9-9d49-4e1e9afb8c81 req-468d3235-adbb-4c06-8489-09ce2bfd031f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "cf2c9cb7-1ce8-47ea-baa3-5aed0229e155-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:36 np0005466012 nova_compute[192063]: 2025-10-02 12:18:36.027 2 DEBUG nova.compute.manager [req-93c65feb-acc0-4da9-9d49-4e1e9afb8c81 req-468d3235-adbb-4c06-8489-09ce2bfd031f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] No waiting events found dispatching network-vif-plugged-e5fcdd20-03b8-47df-8838-af34f97abda2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:36 np0005466012 nova_compute[192063]: 2025-10-02 12:18:36.028 2 WARNING nova.compute.manager [req-93c65feb-acc0-4da9-9d49-4e1e9afb8c81 req-468d3235-adbb-4c06-8489-09ce2bfd031f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Received unexpected event network-vif-plugged-e5fcdd20-03b8-47df-8838-af34f97abda2 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:18:36 np0005466012 nova_compute[192063]: 2025-10-02 12:18:36.028 2 DEBUG nova.compute.manager [req-93c65feb-acc0-4da9-9d49-4e1e9afb8c81 req-468d3235-adbb-4c06-8489-09ce2bfd031f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Received event network-vif-deleted-e5fcdd20-03b8-47df-8838-af34f97abda2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:36 np0005466012 nova_compute[192063]: 2025-10-02 12:18:36.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:36 np0005466012 nova_compute[192063]: 2025-10-02 12:18:36.639 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:18:36 np0005466012 nova_compute[192063]: 2025-10-02 12:18:36.639 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:18:36 np0005466012 nova_compute[192063]: 2025-10-02 12:18:36.670 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:18:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:37.810 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:37.811 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:18:37 np0005466012 nova_compute[192063]: 2025-10-02 12:18:37.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.510 2 DEBUG nova.compute.manager [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.622 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.622 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.692 2 DEBUG nova.objects.instance [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'pci_requests' on Instance uuid 95da5a4a-5301-4a2b-b135-01e08486477d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.704 2 DEBUG nova.virt.hardware [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.705 2 INFO nova.compute.claims [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.705 2 DEBUG nova.objects.instance [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'resources' on Instance uuid 95da5a4a-5301-4a2b-b135-01e08486477d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.720 2 DEBUG nova.objects.instance [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'pci_devices' on Instance uuid 95da5a4a-5301-4a2b-b135-01e08486477d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.770 2 INFO nova.compute.resource_tracker [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Updating resource usage from migration 9ac81be8-ebe7-4fa1-8a2c-22673a5e20bf#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.771 2 DEBUG nova.compute.resource_tracker [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Starting to track incoming migration 9ac81be8-ebe7-4fa1-8a2c-22673a5e20bf with flavor 9949d9da-6314-4ede-8797-6f2f0a6a64fc _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.855 2 DEBUG nova.compute.provider_tree [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.868 2 DEBUG nova.scheduler.client.report [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.885 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.885 2 INFO nova.compute.manager [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Migrating#033[00m
Oct  2 08:18:38 np0005466012 nova_compute[192063]: 2025-10-02 12:18:38.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:41 np0005466012 systemd[1]: Created slice User Slice of UID 42436.
Oct  2 08:18:41 np0005466012 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  2 08:18:41 np0005466012 systemd-logind[827]: New session 33 of user nova.
Oct  2 08:18:41 np0005466012 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  2 08:18:41 np0005466012 systemd[1]: Starting User Manager for UID 42436...
Oct  2 08:18:41 np0005466012 systemd[232993]: Queued start job for default target Main User Target.
Oct  2 08:18:41 np0005466012 systemd[232993]: Created slice User Application Slice.
Oct  2 08:18:41 np0005466012 systemd[232993]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:18:41 np0005466012 systemd[232993]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 08:18:41 np0005466012 systemd[232993]: Reached target Paths.
Oct  2 08:18:41 np0005466012 systemd[232993]: Reached target Timers.
Oct  2 08:18:41 np0005466012 systemd[232993]: Starting D-Bus User Message Bus Socket...
Oct  2 08:18:41 np0005466012 systemd[232993]: Starting Create User's Volatile Files and Directories...
Oct  2 08:18:41 np0005466012 systemd[232993]: Finished Create User's Volatile Files and Directories.
Oct  2 08:18:41 np0005466012 systemd[232993]: Listening on D-Bus User Message Bus Socket.
Oct  2 08:18:41 np0005466012 systemd[232993]: Reached target Sockets.
Oct  2 08:18:41 np0005466012 systemd[232993]: Reached target Basic System.
Oct  2 08:18:41 np0005466012 systemd[232993]: Reached target Main User Target.
Oct  2 08:18:41 np0005466012 systemd[232993]: Startup finished in 143ms.
Oct  2 08:18:41 np0005466012 systemd[1]: Started User Manager for UID 42436.
Oct  2 08:18:41 np0005466012 systemd[1]: Started Session 33 of User nova.
Oct  2 08:18:41 np0005466012 systemd[1]: session-33.scope: Deactivated successfully.
Oct  2 08:18:41 np0005466012 systemd-logind[827]: Session 33 logged out. Waiting for processes to exit.
Oct  2 08:18:41 np0005466012 systemd-logind[827]: Removed session 33.
Oct  2 08:18:41 np0005466012 nova_compute[192063]: 2025-10-02 12:18:41.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:41 np0005466012 systemd-logind[827]: New session 35 of user nova.
Oct  2 08:18:41 np0005466012 systemd[1]: Started Session 35 of User nova.
Oct  2 08:18:41 np0005466012 systemd[1]: session-35.scope: Deactivated successfully.
Oct  2 08:18:41 np0005466012 systemd-logind[827]: Session 35 logged out. Waiting for processes to exit.
Oct  2 08:18:41 np0005466012 systemd-logind[827]: Removed session 35.
Oct  2 08:18:41 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:41Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:49:12:9b 10.100.0.12
Oct  2 08:18:41 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:41Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:49:12:9b 10.100.0.12
Oct  2 08:18:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:41.813 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:42 np0005466012 nova_compute[192063]: 2025-10-02 12:18:42.725 2 DEBUG nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:18:43 np0005466012 nova_compute[192063]: 2025-10-02 12:18:43.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:44 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:44Z|00335|binding|INFO|Releasing lport 893d58a9-c253-4923-8cf4-03927d247550 from this chassis (sb_readonly=0)
Oct  2 08:18:44 np0005466012 nova_compute[192063]: 2025-10-02 12:18:44.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:44 np0005466012 kernel: tapde78e7b6-f8 (unregistering): left promiscuous mode
Oct  2 08:18:44 np0005466012 NetworkManager[51207]: <info>  [1759407524.8783] device (tapde78e7b6-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:18:44 np0005466012 nova_compute[192063]: 2025-10-02 12:18:44.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:44 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:44Z|00336|binding|INFO|Releasing lport de78e7b6-f8b9-40fb-bc85-0a8257f52c55 from this chassis (sb_readonly=0)
Oct  2 08:18:44 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:44Z|00337|binding|INFO|Setting lport de78e7b6-f8b9-40fb-bc85-0a8257f52c55 down in Southbound
Oct  2 08:18:44 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:44Z|00338|binding|INFO|Removing iface tapde78e7b6-f8 ovn-installed in OVS
Oct  2 08:18:44 np0005466012 nova_compute[192063]: 2025-10-02 12:18:44.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:44.903 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:12:9b 10.100.0.12'], port_security=['fa:16:3e:49:12:9b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fbfcbe40-57a4-4e81-a4a2-bc9c241749fc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e895cece-6b67-405e-b05d-5b86ddbf8385', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa03c570c52a4c2a9445090389d03c6d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '86713f8f-e4ad-44d5-8c6e-92e3b3c5f67c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=42f687d5-26a0-4ae5-91cd-f49120fff442, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=de78e7b6-f8b9-40fb-bc85-0a8257f52c55) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:44.904 103246 INFO neutron.agent.ovn.metadata.agent [-] Port de78e7b6-f8b9-40fb-bc85-0a8257f52c55 in datapath e895cece-6b67-405e-b05d-5b86ddbf8385 unbound from our chassis#033[00m
Oct  2 08:18:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:44.905 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e895cece-6b67-405e-b05d-5b86ddbf8385, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:18:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:44.907 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3e368565-30dc-4803-87da-d499602f8279]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:44.908 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385 namespace which is not needed anymore#033[00m
Oct  2 08:18:44 np0005466012 nova_compute[192063]: 2025-10-02 12:18:44.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:44 np0005466012 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000005b.scope: Deactivated successfully.
Oct  2 08:18:44 np0005466012 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000005b.scope: Consumed 13.214s CPU time.
Oct  2 08:18:44 np0005466012 systemd-machined[152114]: Machine qemu-38-instance-0000005b terminated.
Oct  2 08:18:45 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[232680]: [NOTICE]   (232686) : haproxy version is 2.8.14-c23fe91
Oct  2 08:18:45 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[232680]: [NOTICE]   (232686) : path to executable is /usr/sbin/haproxy
Oct  2 08:18:45 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[232680]: [WARNING]  (232686) : Exiting Master process...
Oct  2 08:18:45 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[232680]: [ALERT]    (232686) : Current worker (232688) exited with code 143 (Terminated)
Oct  2 08:18:45 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[232680]: [WARNING]  (232686) : All workers exited. Exiting... (0)
Oct  2 08:18:45 np0005466012 systemd[1]: libpod-686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba.scope: Deactivated successfully.
Oct  2 08:18:45 np0005466012 podman[233041]: 2025-10-02 12:18:45.067267531 +0000 UTC m=+0.049738090 container died 686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.070 2 DEBUG nova.compute.manager [req-ec8bd399-fa52-4930-9008-305a73fc034c req-715310e1-0f6b-4a30-bee9-35643e411672 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received event network-vif-unplugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.070 2 DEBUG oslo_concurrency.lockutils [req-ec8bd399-fa52-4930-9008-305a73fc034c req-715310e1-0f6b-4a30-bee9-35643e411672 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.071 2 DEBUG oslo_concurrency.lockutils [req-ec8bd399-fa52-4930-9008-305a73fc034c req-715310e1-0f6b-4a30-bee9-35643e411672 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.071 2 DEBUG oslo_concurrency.lockutils [req-ec8bd399-fa52-4930-9008-305a73fc034c req-715310e1-0f6b-4a30-bee9-35643e411672 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.071 2 DEBUG nova.compute.manager [req-ec8bd399-fa52-4930-9008-305a73fc034c req-715310e1-0f6b-4a30-bee9-35643e411672 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] No waiting events found dispatching network-vif-unplugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.071 2 WARNING nova.compute.manager [req-ec8bd399-fa52-4930-9008-305a73fc034c req-715310e1-0f6b-4a30-bee9-35643e411672 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received unexpected event network-vif-unplugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:18:45 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba-userdata-shm.mount: Deactivated successfully.
Oct  2 08:18:45 np0005466012 systemd[1]: var-lib-containers-storage-overlay-f5d44a44f0f2df1f0e5ceb9c237f2913bbe1f20a7060057840f0c424f7ad0591-merged.mount: Deactivated successfully.
Oct  2 08:18:45 np0005466012 podman[233041]: 2025-10-02 12:18:45.105530999 +0000 UTC m=+0.088001558 container cleanup 686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:18:45 np0005466012 systemd[1]: libpod-conmon-686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba.scope: Deactivated successfully.
Oct  2 08:18:45 np0005466012 podman[233074]: 2025-10-02 12:18:45.16960128 +0000 UTC m=+0.042103531 container remove 686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:18:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:45.175 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[46463966-4af8-4e25-95e3-cb999f8ed85a]: (4, ('Thu Oct  2 12:18:45 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385 (686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba)\n686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba\nThu Oct  2 12:18:45 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385 (686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba)\n686bb8a8c34ee6e64941da6be25611ca4dd0e49650800c834636c05d408036ba\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:45.177 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[320c53a5-5285-4d42-ae88-85a98cd53138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:45.177 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape895cece-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:45 np0005466012 kernel: tape895cece-60: left promiscuous mode
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:45.201 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0f93f8c9-aa06-4bcb-b24c-3c73190eebad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:45.230 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c60fd210-2e8e-41ab-91b3-25f3c5ecdccf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:45.231 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ed1156c1-5d24-4b58-9771-f06b26f98116]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:45.256 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[da2e7a01-a18d-4eae-81be-1637f02f95f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550054, 'reachable_time': 32632, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233107, 'error': None, 'target': 'ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:45 np0005466012 systemd[1]: run-netns-ovnmeta\x2de895cece\x2d6b67\x2d405e\x2db05d\x2d5b86ddbf8385.mount: Deactivated successfully.
Oct  2 08:18:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:45.259 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:18:45 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:45.260 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[b5b65f6f-2960-4cb9-a295-7003efc3e2d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.741 2 INFO nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.746 2 INFO nova.virt.libvirt.driver [-] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Instance destroyed successfully.#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.747 2 DEBUG nova.objects.instance [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lazy-loading 'numa_topology' on Instance uuid fbfcbe40-57a4-4e81-a4a2-bc9c241749fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.771 2 INFO nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Attempting rescue#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.772 2 DEBUG nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.775 2 DEBUG nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.776 2 INFO nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Creating image(s)#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.776 2 DEBUG oslo_concurrency.lockutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquiring lock "/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.776 2 DEBUG oslo_concurrency.lockutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.777 2 DEBUG oslo_concurrency.lockutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.777 2 DEBUG nova.objects.instance [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lazy-loading 'trusted_certs' on Instance uuid fbfcbe40-57a4-4e81-a4a2-bc9c241749fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.800 2 DEBUG oslo_concurrency.lockutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.800 2 DEBUG oslo_concurrency.lockutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.810 2 DEBUG oslo_concurrency.processutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.868 2 DEBUG oslo_concurrency.processutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.869 2 DEBUG oslo_concurrency.processutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.905 2 DEBUG oslo_concurrency.processutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.rescue" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.906 2 DEBUG oslo_concurrency.lockutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.907 2 DEBUG nova.objects.instance [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lazy-loading 'migration_context' on Instance uuid fbfcbe40-57a4-4e81-a4a2-bc9c241749fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.920 2 DEBUG nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.921 2 DEBUG nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Start _get_guest_xml network_info=[{"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "vif_mac": "fa:16:3e:49:12:9b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.922 2 DEBUG nova.objects.instance [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lazy-loading 'resources' on Instance uuid fbfcbe40-57a4-4e81-a4a2-bc9c241749fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.938 2 WARNING nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.943 2 DEBUG nova.virt.libvirt.host [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.943 2 DEBUG nova.virt.libvirt.host [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.947 2 DEBUG nova.virt.libvirt.host [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.947 2 DEBUG nova.virt.libvirt.host [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.948 2 DEBUG nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.949 2 DEBUG nova.virt.hardware [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.949 2 DEBUG nova.virt.hardware [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.949 2 DEBUG nova.virt.hardware [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.950 2 DEBUG nova.virt.hardware [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.950 2 DEBUG nova.virt.hardware [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.950 2 DEBUG nova.virt.hardware [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.950 2 DEBUG nova.virt.hardware [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.951 2 DEBUG nova.virt.hardware [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.951 2 DEBUG nova.virt.hardware [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.951 2 DEBUG nova.virt.hardware [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.951 2 DEBUG nova.virt.hardware [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.952 2 DEBUG nova.objects.instance [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lazy-loading 'vcpu_model' on Instance uuid fbfcbe40-57a4-4e81-a4a2-bc9c241749fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.969 2 DEBUG nova.virt.libvirt.vif [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1845813479',display_name='tempest-ServerRescueNegativeTestJSON-server-1845813479',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1845813479',id=91,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:18:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fa03c570c52a4c2a9445090389d03c6d',ramdisk_id='',reservation_id='r-n02x5bvb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1968496116',owner_user_name='tempest-ServerRescueNegativeTestJSON-1968496116-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:18:28Z,user_data=None,user_id='8c91fa3e559044609ddabc81368d7546',uuid=fbfcbe40-57a4-4e81-a4a2-bc9c241749fc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "vif_mac": "fa:16:3e:49:12:9b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.970 2 DEBUG nova.network.os_vif_util [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Converting VIF {"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "vif_mac": "fa:16:3e:49:12:9b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.971 2 DEBUG nova.network.os_vif_util [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:49:12:9b,bridge_name='br-int',has_traffic_filtering=True,id=de78e7b6-f8b9-40fb-bc85-0a8257f52c55,network=Network(e895cece-6b67-405e-b05d-5b86ddbf8385),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde78e7b6-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.971 2 DEBUG nova.objects.instance [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lazy-loading 'pci_devices' on Instance uuid fbfcbe40-57a4-4e81-a4a2-bc9c241749fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:18:45 np0005466012 nova_compute[192063]: 2025-10-02 12:18:45.986 2 DEBUG nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  <uuid>fbfcbe40-57a4-4e81-a4a2-bc9c241749fc</uuid>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  <name>instance-0000005b</name>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1845813479</nova:name>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:18:45</nova:creationTime>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:        <nova:user uuid="8c91fa3e559044609ddabc81368d7546">tempest-ServerRescueNegativeTestJSON-1968496116-project-member</nova:user>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:        <nova:project uuid="fa03c570c52a4c2a9445090389d03c6d">tempest-ServerRescueNegativeTestJSON-1968496116</nova:project>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:        <nova:port uuid="de78e7b6-f8b9-40fb-bc85-0a8257f52c55">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <entry name="serial">fbfcbe40-57a4-4e81-a4a2-bc9c241749fc</entry>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <entry name="uuid">fbfcbe40-57a4-4e81-a4a2-bc9c241749fc</entry>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.rescue"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <target dev="vdb" bus="virtio"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.config.rescue"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:49:12:9b"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <target dev="tapde78e7b6-f8"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/console.log" append="off"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:18:45 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:18:45 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:18:45 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:18:45 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  2 08:18:46 np0005466012 nova_compute[192063]: 2025-10-02 12:18:46.002 2 INFO nova.virt.libvirt.driver [-] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Instance destroyed successfully.
Oct  2 08:18:46 np0005466012 nova_compute[192063]: 2025-10-02 12:18:46.074 2 DEBUG nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:18:46 np0005466012 nova_compute[192063]: 2025-10-02 12:18:46.074 2 DEBUG nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:18:46 np0005466012 nova_compute[192063]: 2025-10-02 12:18:46.075 2 DEBUG nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  2 08:18:46 np0005466012 nova_compute[192063]: 2025-10-02 12:18:46.075 2 DEBUG nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] No VIF found with MAC fa:16:3e:49:12:9b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  2 08:18:46 np0005466012 nova_compute[192063]: 2025-10-02 12:18:46.076 2 INFO nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Using config drive
Oct  2 08:18:46 np0005466012 nova_compute[192063]: 2025-10-02 12:18:46.089 2 DEBUG nova.objects.instance [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lazy-loading 'ec2_ids' on Instance uuid fbfcbe40-57a4-4e81-a4a2-bc9c241749fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:18:46 np0005466012 nova_compute[192063]: 2025-10-02 12:18:46.118 2 DEBUG nova.objects.instance [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lazy-loading 'keypairs' on Instance uuid fbfcbe40-57a4-4e81-a4a2-bc9c241749fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:18:46 np0005466012 nova_compute[192063]: 2025-10-02 12:18:46.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.150 2 INFO nova.virt.libvirt.driver [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Creating config drive at /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.config.rescue
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.156 2 DEBUG oslo_concurrency.processutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwefm2xn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.193 2 DEBUG nova.compute.manager [req-6bf6a0af-ac97-49c1-85b6-578bce70686c req-5537dd27-907c-444b-ac20-592e559ead38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received event network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.194 2 DEBUG oslo_concurrency.lockutils [req-6bf6a0af-ac97-49c1-85b6-578bce70686c req-5537dd27-907c-444b-ac20-592e559ead38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.194 2 DEBUG oslo_concurrency.lockutils [req-6bf6a0af-ac97-49c1-85b6-578bce70686c req-5537dd27-907c-444b-ac20-592e559ead38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.195 2 DEBUG oslo_concurrency.lockutils [req-6bf6a0af-ac97-49c1-85b6-578bce70686c req-5537dd27-907c-444b-ac20-592e559ead38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.195 2 DEBUG nova.compute.manager [req-6bf6a0af-ac97-49c1-85b6-578bce70686c req-5537dd27-907c-444b-ac20-592e559ead38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] No waiting events found dispatching network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.195 2 WARNING nova.compute.manager [req-6bf6a0af-ac97-49c1-85b6-578bce70686c req-5537dd27-907c-444b-ac20-592e559ead38 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received unexpected event network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 for instance with vm_state active and task_state rescuing.
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.284 2 DEBUG oslo_concurrency.processutils [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwefm2xn" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:18:47 np0005466012 kernel: tapde78e7b6-f8: entered promiscuous mode
Oct  2 08:18:47 np0005466012 NetworkManager[51207]: <info>  [1759407527.4111] manager: (tapde78e7b6-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/152)
Oct  2 08:18:47 np0005466012 systemd-udevd[233018]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:47Z|00339|binding|INFO|Claiming lport de78e7b6-f8b9-40fb-bc85-0a8257f52c55 for this chassis.
Oct  2 08:18:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:47Z|00340|binding|INFO|de78e7b6-f8b9-40fb-bc85-0a8257f52c55: Claiming fa:16:3e:49:12:9b 10.100.0.12
Oct  2 08:18:47 np0005466012 NetworkManager[51207]: <info>  [1759407527.4273] device (tapde78e7b6-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:18:47 np0005466012 NetworkManager[51207]: <info>  [1759407527.4281] device (tapde78e7b6-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.434 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:12:9b 10.100.0.12'], port_security=['fa:16:3e:49:12:9b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fbfcbe40-57a4-4e81-a4a2-bc9c241749fc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e895cece-6b67-405e-b05d-5b86ddbf8385', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa03c570c52a4c2a9445090389d03c6d', 'neutron:revision_number': '5', 'neutron:security_group_ids': '86713f8f-e4ad-44d5-8c6e-92e3b3c5f67c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=42f687d5-26a0-4ae5-91cd-f49120fff442, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=de78e7b6-f8b9-40fb-bc85-0a8257f52c55) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.436 103246 INFO neutron.agent.ovn.metadata.agent [-] Port de78e7b6-f8b9-40fb-bc85-0a8257f52c55 in datapath e895cece-6b67-405e-b05d-5b86ddbf8385 bound to our chassis
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.437 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e895cece-6b67-405e-b05d-5b86ddbf8385
Oct  2 08:18:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:47Z|00341|binding|INFO|Setting lport de78e7b6-f8b9-40fb-bc85-0a8257f52c55 up in Southbound
Oct  2 08:18:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:47Z|00342|binding|INFO|Setting lport de78e7b6-f8b9-40fb-bc85-0a8257f52c55 ovn-installed in OVS
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.455 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e6c73a80-44f1-454c-bd6f-e2fe3d00d70d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.456 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape895cece-61 in ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.458 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape895cece-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.458 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6f1031d2-9971-4742-8944-c63f0a27a373]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.459 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fe6c4990-ebcb-4d92-a141-b2e67956ed5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 systemd-machined[152114]: New machine qemu-40-instance-0000005b.
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.474 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[b882f2e7-88f0-492d-8187-047e5974cbb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 podman[233125]: 2025-10-02 12:18:47.490721577 +0000 UTC m=+0.108988481 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:18:47 np0005466012 systemd[1]: Started Virtual Machine qemu-40-instance-0000005b.
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.494 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2298eab9-f00b-4b8f-b402-c9c20ec6b163]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.528 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ac88fe29-475e-4781-9ae0-41a252c8a80c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 NetworkManager[51207]: <info>  [1759407527.5392] manager: (tape895cece-60): new Veth device (/org/freedesktop/NetworkManager/Devices/153)
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.540 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[694e24c2-aa41-4727-98bf-c3f78f1b1a48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 podman[233127]: 2025-10-02 12:18:47.560671516 +0000 UTC m=+0.163312642 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.576 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[9e51bda3-7bb1-44d3-881b-cb4f4c5313f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.581 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ce311613-1e7b-466b-b45d-e4c59fcec8d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 NetworkManager[51207]: <info>  [1759407527.6031] device (tape895cece-60): carrier: link connected
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.609 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[886d7f2c-8da4-4d7c-a78f-0776ac048672]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.627 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6e28f0e3-6b38-4ef0-9494-939abc69472d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape895cece-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:96:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552121, 'reachable_time': 25670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233212, 'error': None, 'target': 'ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.645 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aa00d6e4-6356-41b6-9e8e-dcf4d6c7137b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe03:9629'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 552121, 'tstamp': 552121}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233213, 'error': None, 'target': 'ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.664 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1311450d-5a9b-4823-8ed2-dd766462ca1e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape895cece-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:96:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552121, 'reachable_time': 25670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233214, 'error': None, 'target': 'ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.704 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c8fc17a6-e340-4d84-8f95-16ba9d9f39c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.785 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c2492785-e7ac-4e48-86b6-d1ba2f30e268]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.787 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape895cece-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.788 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.789 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape895cece-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:47 np0005466012 NetworkManager[51207]: <info>  [1759407527.7925] manager: (tape895cece-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Oct  2 08:18:47 np0005466012 kernel: tape895cece-60: entered promiscuous mode
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.797 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape895cece-60, col_values=(('external_ids', {'iface-id': '893d58a9-c253-4923-8cf4-03927d247550'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:47Z|00343|binding|INFO|Releasing lport 893d58a9-c253-4923-8cf4-03927d247550 from this chassis (sb_readonly=0)
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:47 np0005466012 nova_compute[192063]: 2025-10-02 12:18:47.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.826 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e895cece-6b67-405e-b05d-5b86ddbf8385.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e895cece-6b67-405e-b05d-5b86ddbf8385.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.827 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5645180b-1ea0-4572-8386-af4f44f34a4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.829 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-e895cece-6b67-405e-b05d-5b86ddbf8385
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/e895cece-6b67-405e-b05d-5b86ddbf8385.pid.haproxy
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID e895cece-6b67-405e-b05d-5b86ddbf8385
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:18:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:47.833 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385', 'env', 'PROCESS_TAG=haproxy-e895cece-6b67-405e-b05d-5b86ddbf8385', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e895cece-6b67-405e-b05d-5b86ddbf8385.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:18:48 np0005466012 podman[233254]: 2025-10-02 12:18:48.273556941 +0000 UTC m=+0.101651570 container create d95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:18:48 np0005466012 podman[233254]: 2025-10-02 12:18:48.199884656 +0000 UTC m=+0.027979305 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:18:48 np0005466012 systemd[1]: Started libpod-conmon-d95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047.scope.
Oct  2 08:18:48 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.357 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for fbfcbe40-57a4-4e81-a4a2-bc9c241749fc due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.357 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407528.3546646, fbfcbe40-57a4-4e81-a4a2-bc9c241749fc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.358 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:18:48 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e8b92c9fb79522c680fe2b47fe3e2f97fb49cd10a3bd75807dc1f6e7f0eb132/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.371 2 DEBUG nova.compute.manager [None req-efb49940-f8eb-49f6-8f2b-87f0f1cdd5a1 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:48 np0005466012 podman[233254]: 2025-10-02 12:18:48.372224165 +0000 UTC m=+0.200318784 container init d95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:18:48 np0005466012 podman[233254]: 2025-10-02 12:18:48.377919389 +0000 UTC m=+0.206014008 container start d95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.384 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.387 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:48 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[233269]: [NOTICE]   (233273) : New worker (233275) forked
Oct  2 08:18:48 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[233269]: [NOTICE]   (233273) : Loading success.
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.412 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.413 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407528.3547902, fbfcbe40-57a4-4e81-a4a2-bc9c241749fc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.413 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] VM Started (Lifecycle Event)#033[00m
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.440 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.443 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.859 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407513.8577664, cf2c9cb7-1ce8-47ea-baa3-5aed0229e155 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.860 2 INFO nova.compute.manager [-] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.883 2 DEBUG nova.compute.manager [None req-d58728d6-ca6d-455f-b3c6-94b94287c5c2 - - - - - -] [instance: cf2c9cb7-1ce8-47ea-baa3-5aed0229e155] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:18:48 np0005466012 nova_compute[192063]: 2025-10-02 12:18:48.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:49 np0005466012 nova_compute[192063]: 2025-10-02 12:18:49.345 2 DEBUG nova.compute.manager [req-9ecbdae6-6eda-4f6f-9498-2887c5b30f3a req-52e306e4-df83-4ba6-932a-c347984ff4ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received event network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:49 np0005466012 nova_compute[192063]: 2025-10-02 12:18:49.347 2 DEBUG oslo_concurrency.lockutils [req-9ecbdae6-6eda-4f6f-9498-2887c5b30f3a req-52e306e4-df83-4ba6-932a-c347984ff4ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:49 np0005466012 nova_compute[192063]: 2025-10-02 12:18:49.347 2 DEBUG oslo_concurrency.lockutils [req-9ecbdae6-6eda-4f6f-9498-2887c5b30f3a req-52e306e4-df83-4ba6-932a-c347984ff4ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:49 np0005466012 nova_compute[192063]: 2025-10-02 12:18:49.348 2 DEBUG oslo_concurrency.lockutils [req-9ecbdae6-6eda-4f6f-9498-2887c5b30f3a req-52e306e4-df83-4ba6-932a-c347984ff4ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:49 np0005466012 nova_compute[192063]: 2025-10-02 12:18:49.348 2 DEBUG nova.compute.manager [req-9ecbdae6-6eda-4f6f-9498-2887c5b30f3a req-52e306e4-df83-4ba6-932a-c347984ff4ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] No waiting events found dispatching network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:49 np0005466012 nova_compute[192063]: 2025-10-02 12:18:49.349 2 WARNING nova.compute.manager [req-9ecbdae6-6eda-4f6f-9498-2887c5b30f3a req-52e306e4-df83-4ba6-932a-c347984ff4ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received unexpected event network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:18:49 np0005466012 nova_compute[192063]: 2025-10-02 12:18:49.350 2 DEBUG nova.compute.manager [req-9ecbdae6-6eda-4f6f-9498-2887c5b30f3a req-52e306e4-df83-4ba6-932a-c347984ff4ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received event network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:49 np0005466012 nova_compute[192063]: 2025-10-02 12:18:49.350 2 DEBUG oslo_concurrency.lockutils [req-9ecbdae6-6eda-4f6f-9498-2887c5b30f3a req-52e306e4-df83-4ba6-932a-c347984ff4ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:49 np0005466012 nova_compute[192063]: 2025-10-02 12:18:49.351 2 DEBUG oslo_concurrency.lockutils [req-9ecbdae6-6eda-4f6f-9498-2887c5b30f3a req-52e306e4-df83-4ba6-932a-c347984ff4ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:49 np0005466012 nova_compute[192063]: 2025-10-02 12:18:49.352 2 DEBUG oslo_concurrency.lockutils [req-9ecbdae6-6eda-4f6f-9498-2887c5b30f3a req-52e306e4-df83-4ba6-932a-c347984ff4ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:49 np0005466012 nova_compute[192063]: 2025-10-02 12:18:49.352 2 DEBUG nova.compute.manager [req-9ecbdae6-6eda-4f6f-9498-2887c5b30f3a req-52e306e4-df83-4ba6-932a-c347984ff4ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] No waiting events found dispatching network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:49 np0005466012 nova_compute[192063]: 2025-10-02 12:18:49.353 2 WARNING nova.compute.manager [req-9ecbdae6-6eda-4f6f-9498-2887c5b30f3a req-52e306e4-df83-4ba6-932a-c347984ff4ef 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received unexpected event network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:18:50 np0005466012 podman[233284]: 2025-10-02 12:18:50.162375822 +0000 UTC m=+0.070700781 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  2 08:18:51 np0005466012 nova_compute[192063]: 2025-10-02 12:18:51.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:51 np0005466012 systemd[1]: Stopping User Manager for UID 42436...
Oct  2 08:18:51 np0005466012 systemd[232993]: Activating special unit Exit the Session...
Oct  2 08:18:51 np0005466012 systemd[232993]: Stopped target Main User Target.
Oct  2 08:18:51 np0005466012 systemd[232993]: Stopped target Basic System.
Oct  2 08:18:51 np0005466012 systemd[232993]: Stopped target Paths.
Oct  2 08:18:51 np0005466012 systemd[232993]: Stopped target Sockets.
Oct  2 08:18:51 np0005466012 systemd[232993]: Stopped target Timers.
Oct  2 08:18:51 np0005466012 systemd[232993]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:18:51 np0005466012 systemd[232993]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 08:18:51 np0005466012 systemd[232993]: Closed D-Bus User Message Bus Socket.
Oct  2 08:18:51 np0005466012 systemd[232993]: Stopped Create User's Volatile Files and Directories.
Oct  2 08:18:51 np0005466012 systemd[232993]: Removed slice User Application Slice.
Oct  2 08:18:51 np0005466012 systemd[232993]: Reached target Shutdown.
Oct  2 08:18:51 np0005466012 systemd[232993]: Finished Exit the Session.
Oct  2 08:18:51 np0005466012 systemd[232993]: Reached target Exit the Session.
Oct  2 08:18:51 np0005466012 systemd[1]: user@42436.service: Deactivated successfully.
Oct  2 08:18:51 np0005466012 systemd[1]: Stopped User Manager for UID 42436.
Oct  2 08:18:51 np0005466012 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  2 08:18:51 np0005466012 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  2 08:18:51 np0005466012 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  2 08:18:51 np0005466012 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  2 08:18:51 np0005466012 systemd[1]: Removed slice User Slice of UID 42436.
Oct  2 08:18:53 np0005466012 nova_compute[192063]: 2025-10-02 12:18:53.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:54 np0005466012 podman[233304]: 2025-10-02 12:18:54.167536319 +0000 UTC m=+0.076147708 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.156 2 DEBUG oslo_concurrency.lockutils [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquiring lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.157 2 DEBUG oslo_concurrency.lockutils [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.157 2 DEBUG oslo_concurrency.lockutils [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquiring lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.158 2 DEBUG oslo_concurrency.lockutils [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.158 2 DEBUG oslo_concurrency.lockutils [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.167 2 INFO nova.compute.manager [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Terminating instance#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.177 2 DEBUG nova.compute.manager [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:18:55 np0005466012 kernel: tapde78e7b6-f8 (unregistering): left promiscuous mode
Oct  2 08:18:55 np0005466012 NetworkManager[51207]: <info>  [1759407535.2026] device (tapde78e7b6-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:55Z|00344|binding|INFO|Releasing lport de78e7b6-f8b9-40fb-bc85-0a8257f52c55 from this chassis (sb_readonly=0)
Oct  2 08:18:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:55Z|00345|binding|INFO|Setting lport de78e7b6-f8b9-40fb-bc85-0a8257f52c55 down in Southbound
Oct  2 08:18:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:18:55Z|00346|binding|INFO|Removing iface tapde78e7b6-f8 ovn-installed in OVS
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.222 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:12:9b 10.100.0.12'], port_security=['fa:16:3e:49:12:9b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fbfcbe40-57a4-4e81-a4a2-bc9c241749fc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e895cece-6b67-405e-b05d-5b86ddbf8385', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa03c570c52a4c2a9445090389d03c6d', 'neutron:revision_number': '6', 'neutron:security_group_ids': '86713f8f-e4ad-44d5-8c6e-92e3b3c5f67c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=42f687d5-26a0-4ae5-91cd-f49120fff442, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=de78e7b6-f8b9-40fb-bc85-0a8257f52c55) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.224 103246 INFO neutron.agent.ovn.metadata.agent [-] Port de78e7b6-f8b9-40fb-bc85-0a8257f52c55 in datapath e895cece-6b67-405e-b05d-5b86ddbf8385 unbound from our chassis#033[00m
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.225 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e895cece-6b67-405e-b05d-5b86ddbf8385, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.227 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[268a7dc2-ad04-4cc7-9a28-7d57914d65e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.228 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385 namespace which is not needed anymore#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:55 np0005466012 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000005b.scope: Deactivated successfully.
Oct  2 08:18:55 np0005466012 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000005b.scope: Consumed 7.577s CPU time.
Oct  2 08:18:55 np0005466012 systemd-machined[152114]: Machine qemu-40-instance-0000005b terminated.
Oct  2 08:18:55 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[233269]: [NOTICE]   (233273) : haproxy version is 2.8.14-c23fe91
Oct  2 08:18:55 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[233269]: [NOTICE]   (233273) : path to executable is /usr/sbin/haproxy
Oct  2 08:18:55 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[233269]: [WARNING]  (233273) : Exiting Master process...
Oct  2 08:18:55 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[233269]: [ALERT]    (233273) : Current worker (233275) exited with code 143 (Terminated)
Oct  2 08:18:55 np0005466012 neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385[233269]: [WARNING]  (233273) : All workers exited. Exiting... (0)
Oct  2 08:18:55 np0005466012 systemd[1]: libpod-d95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047.scope: Deactivated successfully.
Oct  2 08:18:55 np0005466012 conmon[233269]: conmon d95a6c5e979559aed763 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047.scope/container/memory.events
Oct  2 08:18:55 np0005466012 podman[233347]: 2025-10-02 12:18:55.373652821 +0000 UTC m=+0.050879783 container died d95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:18:55 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047-userdata-shm.mount: Deactivated successfully.
Oct  2 08:18:55 np0005466012 systemd[1]: var-lib-containers-storage-overlay-5e8b92c9fb79522c680fe2b47fe3e2f97fb49cd10a3bd75807dc1f6e7f0eb132-merged.mount: Deactivated successfully.
Oct  2 08:18:55 np0005466012 podman[233347]: 2025-10-02 12:18:55.423364188 +0000 UTC m=+0.100591150 container cleanup d95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:18:55 np0005466012 systemd[1]: libpod-conmon-d95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047.scope: Deactivated successfully.
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.473 2 INFO nova.virt.libvirt.driver [-] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Instance destroyed successfully.#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.474 2 DEBUG nova.objects.instance [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lazy-loading 'resources' on Instance uuid fbfcbe40-57a4-4e81-a4a2-bc9c241749fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.485 2 DEBUG nova.virt.libvirt.vif [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1845813479',display_name='tempest-ServerRescueNegativeTestJSON-server-1845813479',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1845813479',id=91,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:18:48Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fa03c570c52a4c2a9445090389d03c6d',ramdisk_id='',reservation_id='r-n02x5bvb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1968496116',owner_user_name='tempest-ServerRescueNegativeTestJSON-1968496116-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:18:48Z,user_data=None,user_id='8c91fa3e559044609ddabc81368d7546',uuid=fbfcbe40-57a4-4e81-a4a2-bc9c241749fc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.486 2 DEBUG nova.network.os_vif_util [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Converting VIF {"id": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "address": "fa:16:3e:49:12:9b", "network": {"id": "e895cece-6b67-405e-b05d-5b86ddbf8385", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-117197461-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa03c570c52a4c2a9445090389d03c6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde78e7b6-f8", "ovs_interfaceid": "de78e7b6-f8b9-40fb-bc85-0a8257f52c55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.486 2 DEBUG nova.network.os_vif_util [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:49:12:9b,bridge_name='br-int',has_traffic_filtering=True,id=de78e7b6-f8b9-40fb-bc85-0a8257f52c55,network=Network(e895cece-6b67-405e-b05d-5b86ddbf8385),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde78e7b6-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.487 2 DEBUG os_vif [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:49:12:9b,bridge_name='br-int',has_traffic_filtering=True,id=de78e7b6-f8b9-40fb-bc85-0a8257f52c55,network=Network(e895cece-6b67-405e-b05d-5b86ddbf8385),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde78e7b6-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde78e7b6-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.494 2 INFO os_vif [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:49:12:9b,bridge_name='br-int',has_traffic_filtering=True,id=de78e7b6-f8b9-40fb-bc85-0a8257f52c55,network=Network(e895cece-6b67-405e-b05d-5b86ddbf8385),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde78e7b6-f8')#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.495 2 INFO nova.virt.libvirt.driver [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Deleting instance files /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc_del#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.495 2 INFO nova.virt.libvirt.driver [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Deletion of /var/lib/nova/instances/fbfcbe40-57a4-4e81-a4a2-bc9c241749fc_del complete#033[00m
Oct  2 08:18:55 np0005466012 podman[233393]: 2025-10-02 12:18:55.507345831 +0000 UTC m=+0.052003975 container remove d95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.513 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[14999ef3-89c7-4a9d-a91b-29e1faac32d4]: (4, ('Thu Oct  2 12:18:55 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385 (d95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047)\nd95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047\nThu Oct  2 12:18:55 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385 (d95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047)\nd95a6c5e979559aed76307b54fd62d7160d1622d83e83958e163002c3d7d9047\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.515 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[85f5fa59-a85f-4931-9612-7b7149c2c69d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.516 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape895cece-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:55 np0005466012 kernel: tape895cece-60: left promiscuous mode
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.542 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[636a572b-4631-4e01-864a-6b3f55375512]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.555 2 INFO nova.compute.manager [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.556 2 DEBUG oslo.service.loopingcall [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.556 2 DEBUG nova.compute.manager [-] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:18:55 np0005466012 nova_compute[192063]: 2025-10-02 12:18:55.557 2 DEBUG nova.network.neutron [-] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.572 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b569511a-e444-4d7e-8172-d4c08e31e082]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.573 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[36f39603-c99c-41d6-8274-c8c3d68929d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.587 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ec5f9b45-ddc3-4962-9162-f786481bad52]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552113, 'reachable_time': 39861, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233412, 'error': None, 'target': 'ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.590 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e895cece-6b67-405e-b05d-5b86ddbf8385 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:18:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:18:55.590 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[5bf400c3-8862-42c2-848a-bf68a318f100]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:18:55 np0005466012 systemd[1]: run-netns-ovnmeta\x2de895cece\x2d6b67\x2d405e\x2db05d\x2d5b86ddbf8385.mount: Deactivated successfully.
Oct  2 08:18:56 np0005466012 nova_compute[192063]: 2025-10-02 12:18:56.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.391 2 DEBUG nova.compute.manager [req-440d6ac3-4011-4ba5-841b-492631f753b4 req-e7b24355-6151-4628-9016-efad36f55c8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received event network-vif-unplugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.392 2 DEBUG oslo_concurrency.lockutils [req-440d6ac3-4011-4ba5-841b-492631f753b4 req-e7b24355-6151-4628-9016-efad36f55c8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.392 2 DEBUG oslo_concurrency.lockutils [req-440d6ac3-4011-4ba5-841b-492631f753b4 req-e7b24355-6151-4628-9016-efad36f55c8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.392 2 DEBUG oslo_concurrency.lockutils [req-440d6ac3-4011-4ba5-841b-492631f753b4 req-e7b24355-6151-4628-9016-efad36f55c8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.393 2 DEBUG nova.compute.manager [req-440d6ac3-4011-4ba5-841b-492631f753b4 req-e7b24355-6151-4628-9016-efad36f55c8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] No waiting events found dispatching network-vif-unplugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.394 2 DEBUG nova.compute.manager [req-440d6ac3-4011-4ba5-841b-492631f753b4 req-e7b24355-6151-4628-9016-efad36f55c8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received event network-vif-unplugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.563 2 DEBUG nova.network.neutron [-] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.582 2 INFO nova.compute.manager [-] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Took 2.02 seconds to deallocate network for instance.#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.674 2 DEBUG oslo_concurrency.lockutils [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.674 2 DEBUG oslo_concurrency.lockutils [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.699 2 DEBUG nova.compute.manager [req-080ac507-2897-4250-bdc6-b7479d4fc114 req-2afd6cb1-2ce1-49b1-92fe-5eec3407085b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received event network-vif-deleted-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.769 2 DEBUG nova.compute.provider_tree [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.785 2 DEBUG nova.scheduler.client.report [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.804 2 DEBUG oslo_concurrency.lockutils [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.823 2 INFO nova.scheduler.client.report [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Deleted allocations for instance fbfcbe40-57a4-4e81-a4a2-bc9c241749fc#033[00m
Oct  2 08:18:57 np0005466012 nova_compute[192063]: 2025-10-02 12:18:57.902 2 DEBUG oslo_concurrency.lockutils [None req-d6f30059-d1eb-46ce-b3c2-dbf4d39b1172 8c91fa3e559044609ddabc81368d7546 fa03c570c52a4c2a9445090389d03c6d - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:59 np0005466012 podman[233413]: 2025-10-02 12:18:59.150178873 +0000 UTC m=+0.057996678 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3)
Oct  2 08:18:59 np0005466012 podman[233414]: 2025-10-02 12:18:59.154555358 +0000 UTC m=+0.059753887 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_id=edpm, version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter)
Oct  2 08:18:59 np0005466012 nova_compute[192063]: 2025-10-02 12:18:59.498 2 DEBUG nova.compute.manager [req-ba0bf2dc-a8e0-4c78-985d-99e13c6b0a73 req-61973143-d9ae-4406-9d15-c2b54a6f6fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received event network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:18:59 np0005466012 nova_compute[192063]: 2025-10-02 12:18:59.499 2 DEBUG oslo_concurrency.lockutils [req-ba0bf2dc-a8e0-4c78-985d-99e13c6b0a73 req-61973143-d9ae-4406-9d15-c2b54a6f6fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:18:59 np0005466012 nova_compute[192063]: 2025-10-02 12:18:59.499 2 DEBUG oslo_concurrency.lockutils [req-ba0bf2dc-a8e0-4c78-985d-99e13c6b0a73 req-61973143-d9ae-4406-9d15-c2b54a6f6fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:18:59 np0005466012 nova_compute[192063]: 2025-10-02 12:18:59.499 2 DEBUG oslo_concurrency.lockutils [req-ba0bf2dc-a8e0-4c78-985d-99e13c6b0a73 req-61973143-d9ae-4406-9d15-c2b54a6f6fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "fbfcbe40-57a4-4e81-a4a2-bc9c241749fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:18:59 np0005466012 nova_compute[192063]: 2025-10-02 12:18:59.499 2 DEBUG nova.compute.manager [req-ba0bf2dc-a8e0-4c78-985d-99e13c6b0a73 req-61973143-d9ae-4406-9d15-c2b54a6f6fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] No waiting events found dispatching network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:18:59 np0005466012 nova_compute[192063]: 2025-10-02 12:18:59.499 2 WARNING nova.compute.manager [req-ba0bf2dc-a8e0-4c78-985d-99e13c6b0a73 req-61973143-d9ae-4406-9d15-c2b54a6f6fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Received unexpected event network-vif-plugged-de78e7b6-f8b9-40fb-bc85-0a8257f52c55 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:19:00 np0005466012 nova_compute[192063]: 2025-10-02 12:19:00.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:01 np0005466012 nova_compute[192063]: 2025-10-02 12:19:01.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:02.129 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:02.131 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:02.131 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:03 np0005466012 nova_compute[192063]: 2025-10-02 12:19:03.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:04 np0005466012 podman[233456]: 2025-10-02 12:19:04.145652172 +0000 UTC m=+0.051777449 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:19:04 np0005466012 podman[233455]: 2025-10-02 12:19:04.170442384 +0000 UTC m=+0.086019142 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:19:05 np0005466012 nova_compute[192063]: 2025-10-02 12:19:05.428 2 DEBUG nova.compute.manager [req-5c30db85-534a-418d-a913-34fe3c6f862f req-e0d2af38-9baa-40b6-be33-6a0feaa9dc2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-vif-unplugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:05 np0005466012 nova_compute[192063]: 2025-10-02 12:19:05.428 2 DEBUG oslo_concurrency.lockutils [req-5c30db85-534a-418d-a913-34fe3c6f862f req-e0d2af38-9baa-40b6-be33-6a0feaa9dc2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:05 np0005466012 nova_compute[192063]: 2025-10-02 12:19:05.429 2 DEBUG oslo_concurrency.lockutils [req-5c30db85-534a-418d-a913-34fe3c6f862f req-e0d2af38-9baa-40b6-be33-6a0feaa9dc2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:05 np0005466012 nova_compute[192063]: 2025-10-02 12:19:05.429 2 DEBUG oslo_concurrency.lockutils [req-5c30db85-534a-418d-a913-34fe3c6f862f req-e0d2af38-9baa-40b6-be33-6a0feaa9dc2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:05 np0005466012 nova_compute[192063]: 2025-10-02 12:19:05.429 2 DEBUG nova.compute.manager [req-5c30db85-534a-418d-a913-34fe3c6f862f req-e0d2af38-9baa-40b6-be33-6a0feaa9dc2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] No waiting events found dispatching network-vif-unplugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:05 np0005466012 nova_compute[192063]: 2025-10-02 12:19:05.429 2 WARNING nova.compute.manager [req-5c30db85-534a-418d-a913-34fe3c6f862f req-e0d2af38-9baa-40b6-be33-6a0feaa9dc2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received unexpected event network-vif-unplugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  2 08:19:05 np0005466012 nova_compute[192063]: 2025-10-02 12:19:05.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:06 np0005466012 systemd[1]: Created slice User Slice of UID 42436.
Oct  2 08:19:06 np0005466012 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  2 08:19:06 np0005466012 systemd-logind[827]: New session 36 of user nova.
Oct  2 08:19:06 np0005466012 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  2 08:19:06 np0005466012 systemd[1]: Starting User Manager for UID 42436...
Oct  2 08:19:06 np0005466012 systemd[233502]: Queued start job for default target Main User Target.
Oct  2 08:19:06 np0005466012 systemd[233502]: Created slice User Application Slice.
Oct  2 08:19:06 np0005466012 systemd[233502]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:19:06 np0005466012 systemd[233502]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 08:19:06 np0005466012 systemd[233502]: Reached target Paths.
Oct  2 08:19:06 np0005466012 systemd[233502]: Reached target Timers.
Oct  2 08:19:06 np0005466012 systemd[233502]: Starting D-Bus User Message Bus Socket...
Oct  2 08:19:06 np0005466012 systemd[233502]: Starting Create User's Volatile Files and Directories...
Oct  2 08:19:06 np0005466012 systemd[233502]: Finished Create User's Volatile Files and Directories.
Oct  2 08:19:06 np0005466012 systemd[233502]: Listening on D-Bus User Message Bus Socket.
Oct  2 08:19:06 np0005466012 systemd[233502]: Reached target Sockets.
Oct  2 08:19:06 np0005466012 systemd[233502]: Reached target Basic System.
Oct  2 08:19:06 np0005466012 systemd[233502]: Reached target Main User Target.
Oct  2 08:19:06 np0005466012 systemd[233502]: Startup finished in 127ms.
Oct  2 08:19:06 np0005466012 systemd[1]: Started User Manager for UID 42436.
Oct  2 08:19:06 np0005466012 systemd[1]: Started Session 36 of User nova.
Oct  2 08:19:06 np0005466012 nova_compute[192063]: 2025-10-02 12:19:06.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:06 np0005466012 systemd[1]: session-36.scope: Deactivated successfully.
Oct  2 08:19:06 np0005466012 systemd-logind[827]: Session 36 logged out. Waiting for processes to exit.
Oct  2 08:19:06 np0005466012 systemd-logind[827]: Removed session 36.
Oct  2 08:19:06 np0005466012 systemd-logind[827]: New session 38 of user nova.
Oct  2 08:19:06 np0005466012 systemd[1]: Started Session 38 of User nova.
Oct  2 08:19:06 np0005466012 systemd[1]: session-38.scope: Deactivated successfully.
Oct  2 08:19:06 np0005466012 systemd-logind[827]: Session 38 logged out. Waiting for processes to exit.
Oct  2 08:19:06 np0005466012 systemd-logind[827]: Removed session 38.
Oct  2 08:19:07 np0005466012 systemd-logind[827]: New session 39 of user nova.
Oct  2 08:19:07 np0005466012 systemd[1]: Started Session 39 of User nova.
Oct  2 08:19:07 np0005466012 systemd[1]: session-39.scope: Deactivated successfully.
Oct  2 08:19:07 np0005466012 systemd-logind[827]: Session 39 logged out. Waiting for processes to exit.
Oct  2 08:19:07 np0005466012 systemd-logind[827]: Removed session 39.
Oct  2 08:19:07 np0005466012 nova_compute[192063]: 2025-10-02 12:19:07.677 2 DEBUG nova.compute.manager [req-4f479535-df6d-43b4-afec-ebf87c4b6a5d req-d09f92af-1cd6-4b0b-95d9-bd59fa9137bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:07 np0005466012 nova_compute[192063]: 2025-10-02 12:19:07.679 2 DEBUG oslo_concurrency.lockutils [req-4f479535-df6d-43b4-afec-ebf87c4b6a5d req-d09f92af-1cd6-4b0b-95d9-bd59fa9137bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:07 np0005466012 nova_compute[192063]: 2025-10-02 12:19:07.679 2 DEBUG oslo_concurrency.lockutils [req-4f479535-df6d-43b4-afec-ebf87c4b6a5d req-d09f92af-1cd6-4b0b-95d9-bd59fa9137bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:07 np0005466012 nova_compute[192063]: 2025-10-02 12:19:07.679 2 DEBUG oslo_concurrency.lockutils [req-4f479535-df6d-43b4-afec-ebf87c4b6a5d req-d09f92af-1cd6-4b0b-95d9-bd59fa9137bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:07 np0005466012 nova_compute[192063]: 2025-10-02 12:19:07.679 2 DEBUG nova.compute.manager [req-4f479535-df6d-43b4-afec-ebf87c4b6a5d req-d09f92af-1cd6-4b0b-95d9-bd59fa9137bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] No waiting events found dispatching network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:07 np0005466012 nova_compute[192063]: 2025-10-02 12:19:07.680 2 WARNING nova.compute.manager [req-4f479535-df6d-43b4-afec-ebf87c4b6a5d req-d09f92af-1cd6-4b0b-95d9-bd59fa9137bc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received unexpected event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  2 08:19:07 np0005466012 nova_compute[192063]: 2025-10-02 12:19:07.805 2 INFO nova.network.neutron [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Updating port 4b6da309-2e2d-465d-91bd-9e0bae3250eb with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Oct  2 08:19:08 np0005466012 nova_compute[192063]: 2025-10-02 12:19:08.841 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:08 np0005466012 nova_compute[192063]: 2025-10-02 12:19:08.842 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquired lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:19:08 np0005466012 nova_compute[192063]: 2025-10-02 12:19:08.842 2 DEBUG nova.network.neutron [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:19:09 np0005466012 nova_compute[192063]: 2025-10-02 12:19:09.091 2 DEBUG nova.compute.manager [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-changed-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:09 np0005466012 nova_compute[192063]: 2025-10-02 12:19:09.092 2 DEBUG nova.compute.manager [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Refreshing instance network info cache due to event network-changed-4b6da309-2e2d-465d-91bd-9e0bae3250eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:19:09 np0005466012 nova_compute[192063]: 2025-10-02 12:19:09.092 2 DEBUG oslo_concurrency.lockutils [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.332 2 DEBUG nova.network.neutron [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Updating instance_info_cache with network_info: [{"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.365 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Releasing lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.370 2 DEBUG oslo_concurrency.lockutils [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.370 2 DEBUG nova.network.neutron [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Refreshing network info cache for port 4b6da309-2e2d-465d-91bd-9e0bae3250eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.473 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407535.47201, fbfcbe40-57a4-4e81-a4a2-bc9c241749fc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.473 2 INFO nova.compute.manager [-] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.487 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.489 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.490 2 INFO nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Creating image(s)#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.491 2 DEBUG nova.objects.instance [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 95da5a4a-5301-4a2b-b135-01e08486477d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.494 2 DEBUG nova.compute.manager [None req-34e4cf9f-4aa5-40a5-b8ae-f416d8a93474 - - - - - -] [instance: fbfcbe40-57a4-4e81-a4a2-bc9c241749fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.502 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.593 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.595 2 DEBUG nova.virt.disk.api [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Checking if we can resize image /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.595 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.655 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.656 2 DEBUG nova.virt.disk.api [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Cannot resize image /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.673 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.673 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Ensure instance console log exists: /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.674 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.674 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.674 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.677 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Start _get_guest_xml network_info=[{"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1299594383-network", "vif_mac": "fa:16:3e:5d:9c:dc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.680 2 WARNING nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.687 2 DEBUG nova.virt.libvirt.host [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.687 2 DEBUG nova.virt.libvirt.host [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.690 2 DEBUG nova.virt.libvirt.host [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.690 2 DEBUG nova.virt.libvirt.host [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.691 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.691 2 DEBUG nova.virt.hardware [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:25Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9949d9da-6314-4ede-8797-6f2f0a6a64fc',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.692 2 DEBUG nova.virt.hardware [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.692 2 DEBUG nova.virt.hardware [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.692 2 DEBUG nova.virt.hardware [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.693 2 DEBUG nova.virt.hardware [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.693 2 DEBUG nova.virt.hardware [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.693 2 DEBUG nova.virt.hardware [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.693 2 DEBUG nova.virt.hardware [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.694 2 DEBUG nova.virt.hardware [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.694 2 DEBUG nova.virt.hardware [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.694 2 DEBUG nova.virt.hardware [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.695 2 DEBUG nova.objects.instance [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 95da5a4a-5301-4a2b-b135-01e08486477d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.713 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.768 2 DEBUG oslo_concurrency.processutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.config --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.769 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.770 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.770 2 DEBUG oslo_concurrency.lockutils [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.771 2 DEBUG nova.virt.libvirt.vif [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:18:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-2096752872',display_name='tempest-DeleteServersTestJSON-server-2096752872',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-2096752872',id=95,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:18:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-frz7g55l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982240-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:19:07Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=95da5a4a-5301-4a2b-b135-01e08486477d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1299594383-network", "vif_mac": "fa:16:3e:5d:9c:dc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.772 2 DEBUG nova.network.os_vif_util [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1299594383-network", "vif_mac": "fa:16:3e:5d:9c:dc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.772 2 DEBUG nova.network.os_vif_util [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.774 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  <uuid>95da5a4a-5301-4a2b-b135-01e08486477d</uuid>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  <name>instance-0000005f</name>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  <memory>196608</memory>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <nova:name>tempest-DeleteServersTestJSON-server-2096752872</nova:name>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:19:10</nova:creationTime>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.micro">
Oct  2 08:19:10 np0005466012 nova_compute[192063]:        <nova:memory>192</nova:memory>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:        <nova:user uuid="0c0ba8ddde504431b51e593c63f40361">tempest-DeleteServersTestJSON-548982240-project-member</nova:user>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:        <nova:project uuid="d5db64e6714348c1a7f57bb53de80915">tempest-DeleteServersTestJSON-548982240</nova:project>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:        <nova:port uuid="4b6da309-2e2d-465d-91bd-9e0bae3250eb">
Oct  2 08:19:10 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <entry name="serial">95da5a4a-5301-4a2b-b135-01e08486477d</entry>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <entry name="uuid">95da5a4a-5301-4a2b-b135-01e08486477d</entry>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/disk.config"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:5d:9c:dc"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <target dev="tap4b6da309-2e"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d/console.log" append="off"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:19:10 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:19:10 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:19:10 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:19:10 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.775 2 DEBUG nova.virt.libvirt.vif [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:18:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-2096752872',display_name='tempest-DeleteServersTestJSON-server-2096752872',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-2096752872',id=95,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:18:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-frz7g55l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982240-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:19:07Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=95da5a4a-5301-4a2b-b135-01e08486477d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1299594383-network", "vif_mac": "fa:16:3e:5d:9c:dc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.776 2 DEBUG nova.network.os_vif_util [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1299594383-network", "vif_mac": "fa:16:3e:5d:9c:dc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.776 2 DEBUG nova.network.os_vif_util [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.776 2 DEBUG os_vif [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.777 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.778 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.780 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b6da309-2e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.780 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4b6da309-2e, col_values=(('external_ids', {'iface-id': '4b6da309-2e2d-465d-91bd-9e0bae3250eb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5d:9c:dc', 'vm-uuid': '95da5a4a-5301-4a2b-b135-01e08486477d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:10 np0005466012 NetworkManager[51207]: <info>  [1759407550.7830] manager: (tap4b6da309-2e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/155)
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.791 2 INFO os_vif [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e')#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.998 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.998 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.999 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] No VIF found with MAC fa:16:3e:5d:9c:dc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:19:10 np0005466012 nova_compute[192063]: 2025-10-02 12:19:10.999 2 INFO nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Using config drive#033[00m
Oct  2 08:19:11 np0005466012 kernel: tap4b6da309-2e: entered promiscuous mode
Oct  2 08:19:11 np0005466012 NetworkManager[51207]: <info>  [1759407551.0598] manager: (tap4b6da309-2e): new Tun device (/org/freedesktop/NetworkManager/Devices/156)
Oct  2 08:19:11 np0005466012 systemd-udevd[233553]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:11Z|00347|binding|INFO|Claiming lport 4b6da309-2e2d-465d-91bd-9e0bae3250eb for this chassis.
Oct  2 08:19:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:11Z|00348|binding|INFO|4b6da309-2e2d-465d-91bd-9e0bae3250eb: Claiming fa:16:3e:5d:9c:dc 10.100.0.8
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:11 np0005466012 NetworkManager[51207]: <info>  [1759407551.1158] device (tap4b6da309-2e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:19:11 np0005466012 NetworkManager[51207]: <info>  [1759407551.1165] device (tap4b6da309-2e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.124 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:9c:dc 10.100.0.8'], port_security=['fa:16:3e:5d:9c:dc 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '95da5a4a-5301-4a2b-b135-01e08486477d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '6', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4b6da309-2e2d-465d-91bd-9e0bae3250eb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.126 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4b6da309-2e2d-465d-91bd-9e0bae3250eb in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 bound to our chassis#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.128 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97b8849-844c-4190-8b13-fd7a2d073ce8#033[00m
Oct  2 08:19:11 np0005466012 systemd-machined[152114]: New machine qemu-41-instance-0000005f.
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.144 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5225e675-d309-4c21-ab64-7ad9fe7b6356]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.145 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb97b8849-81 in ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.147 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb97b8849-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.147 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8764716c-8141-4793-a6e8-1e8df0dc6175]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.148 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e0864007-c7d8-41db-b4a2-bc9c81c10f5f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.162 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[6b81d426-0d63-41d2-9124-fd69d0b82732]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 systemd[1]: Started Virtual Machine qemu-41-instance-0000005f.
Oct  2 08:19:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:11Z|00349|binding|INFO|Setting lport 4b6da309-2e2d-465d-91bd-9e0bae3250eb ovn-installed in OVS
Oct  2 08:19:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:11Z|00350|binding|INFO|Setting lport 4b6da309-2e2d-465d-91bd-9e0bae3250eb up in Southbound
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.189 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b8f198fe-5a6e-490d-b7df-644f5a613c2a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.216 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[e37f30c6-73f3-47a5-bcec-68d6c833bedc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 NetworkManager[51207]: <info>  [1759407551.2230] manager: (tapb97b8849-80): new Veth device (/org/freedesktop/NetworkManager/Devices/157)
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.221 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e6f3c065-d86d-45b0-9e4f-940a792e9e8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.250 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ad885be0-6d47-4003-b163-aaf07ac2ff1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.253 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[bad5003a-a2e1-4a10-945f-2cfd79a42b1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 NetworkManager[51207]: <info>  [1759407551.2774] device (tapb97b8849-80): carrier: link connected
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.286 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b6441b-acde-4a28-bcbe-ea27a7e97878]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.306 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a54c846f-a812-4105-a1d6-1d037239e279]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97b8849-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:e0:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554488, 'reachable_time': 33252, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233589, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.325 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a2cbe516-7a91-4d10-aa66-5e7e24a05c67]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:e0b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 554488, 'tstamp': 554488}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233590, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.351 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[94c7e0d6-4c9f-4581-b8ec-0c0c8dbff310]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97b8849-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:e0:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554488, 'reachable_time': 33252, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233591, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.382 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e12114bd-d912-4191-baa8-3f354a91c006]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.448 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[11b40dd6-6201-4048-b54b-6a58187aff19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.449 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97b8849-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.450 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.450 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97b8849-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:11 np0005466012 kernel: tapb97b8849-80: entered promiscuous mode
Oct  2 08:19:11 np0005466012 NetworkManager[51207]: <info>  [1759407551.4527] manager: (tapb97b8849-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/158)
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.458 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97b8849-80, col_values=(('external_ids', {'iface-id': '055cf080-4472-4807-a697-69de84e96953'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:11Z|00351|binding|INFO|Releasing lport 055cf080-4472-4807-a697-69de84e96953 from this chassis (sb_readonly=0)
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.472 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.473 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[60f02c43-a99f-4838-8649-ab2f3d95263b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.474 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-b97b8849-844c-4190-8b13-fd7a2d073ce8
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/b97b8849-844c-4190-8b13-fd7a2d073ce8.pid.haproxy
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID b97b8849-844c-4190-8b13-fd7a2d073ce8
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:19:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:11.474 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'env', 'PROCESS_TAG=haproxy-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b97b8849-844c-4190-8b13-fd7a2d073ce8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.682 2 DEBUG nova.compute.manager [req-1e1bff55-b51c-4844-a175-7f87913ce7c9 req-e294ac95-31d1-4732-9283-564f24af231e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.682 2 DEBUG oslo_concurrency.lockutils [req-1e1bff55-b51c-4844-a175-7f87913ce7c9 req-e294ac95-31d1-4732-9283-564f24af231e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.683 2 DEBUG oslo_concurrency.lockutils [req-1e1bff55-b51c-4844-a175-7f87913ce7c9 req-e294ac95-31d1-4732-9283-564f24af231e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.683 2 DEBUG oslo_concurrency.lockutils [req-1e1bff55-b51c-4844-a175-7f87913ce7c9 req-e294ac95-31d1-4732-9283-564f24af231e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.683 2 DEBUG nova.compute.manager [req-1e1bff55-b51c-4844-a175-7f87913ce7c9 req-e294ac95-31d1-4732-9283-564f24af231e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] No waiting events found dispatching network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:11 np0005466012 nova_compute[192063]: 2025-10-02 12:19:11.683 2 WARNING nova.compute.manager [req-1e1bff55-b51c-4844-a175-7f87913ce7c9 req-e294ac95-31d1-4732-9283-564f24af231e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received unexpected event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb for instance with vm_state active and task_state resize_finish.#033[00m
Oct  2 08:19:11 np0005466012 podman[233629]: 2025-10-02 12:19:11.803825582 +0000 UTC m=+0.020717917 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:19:12 np0005466012 podman[233629]: 2025-10-02 12:19:12.197370295 +0000 UTC m=+0.414262610 container create e9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:19:12 np0005466012 systemd[1]: Started libpod-conmon-e9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3.scope.
Oct  2 08:19:12 np0005466012 nova_compute[192063]: 2025-10-02 12:19:12.240 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407552.239592, 95da5a4a-5301-4a2b-b135-01e08486477d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:12 np0005466012 nova_compute[192063]: 2025-10-02 12:19:12.240 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:19:12 np0005466012 nova_compute[192063]: 2025-10-02 12:19:12.245 2 DEBUG nova.compute.manager [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:19:12 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:19:12 np0005466012 nova_compute[192063]: 2025-10-02 12:19:12.248 2 INFO nova.virt.libvirt.driver [-] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Instance running successfully.#033[00m
Oct  2 08:19:12 np0005466012 virtqemud[191783]: argument unsupported: QEMU guest agent is not configured
Oct  2 08:19:12 np0005466012 nova_compute[192063]: 2025-10-02 12:19:12.250 2 DEBUG nova.virt.libvirt.guest [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 08:19:12 np0005466012 nova_compute[192063]: 2025-10-02 12:19:12.250 2 DEBUG nova.virt.libvirt.driver [None req-0a5dfca4-1797-4ea1-900e-feac822ee6fc 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Oct  2 08:19:12 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89e88695587a0827eeecd3fbc77c29f973a4a650b5887f56349c47769c84486e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:19:12 np0005466012 podman[233629]: 2025-10-02 12:19:12.28807597 +0000 UTC m=+0.504968315 container init e9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:19:12 np0005466012 nova_compute[192063]: 2025-10-02 12:19:12.292 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:12 np0005466012 podman[233629]: 2025-10-02 12:19:12.293762934 +0000 UTC m=+0.510655249 container start e9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:19:12 np0005466012 nova_compute[192063]: 2025-10-02 12:19:12.298 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:19:12 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233645]: [NOTICE]   (233649) : New worker (233651) forked
Oct  2 08:19:12 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233645]: [NOTICE]   (233649) : Loading success.
Oct  2 08:19:12 np0005466012 nova_compute[192063]: 2025-10-02 12:19:12.373 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Oct  2 08:19:12 np0005466012 nova_compute[192063]: 2025-10-02 12:19:12.373 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407552.2396698, 95da5a4a-5301-4a2b-b135-01e08486477d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:12 np0005466012 nova_compute[192063]: 2025-10-02 12:19:12.374 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] VM Started (Lifecycle Event)#033[00m
Oct  2 08:19:12 np0005466012 nova_compute[192063]: 2025-10-02 12:19:12.396 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:12 np0005466012 nova_compute[192063]: 2025-10-02 12:19:12.400 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:19:13 np0005466012 nova_compute[192063]: 2025-10-02 12:19:13.313 2 DEBUG nova.network.neutron [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Updated VIF entry in instance network info cache for port 4b6da309-2e2d-465d-91bd-9e0bae3250eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:19:13 np0005466012 nova_compute[192063]: 2025-10-02 12:19:13.314 2 DEBUG nova.network.neutron [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Updating instance_info_cache with network_info: [{"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:19:13 np0005466012 nova_compute[192063]: 2025-10-02 12:19:13.336 2 DEBUG oslo_concurrency.lockutils [req-0e09b708-c3a2-45b6-bee2-bd951a276f70 req-aa8977db-db4f-4646-9b43-a7721d56a961 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-95da5a4a-5301-4a2b-b135-01e08486477d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:19:13 np0005466012 nova_compute[192063]: 2025-10-02 12:19:13.826 2 DEBUG nova.compute.manager [req-38deb2bd-dee3-4ff5-87a9-bb228ddf77fb req-9d89871b-8b42-46e8-bf41-90bf3878dbbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:13 np0005466012 nova_compute[192063]: 2025-10-02 12:19:13.826 2 DEBUG oslo_concurrency.lockutils [req-38deb2bd-dee3-4ff5-87a9-bb228ddf77fb req-9d89871b-8b42-46e8-bf41-90bf3878dbbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:13 np0005466012 nova_compute[192063]: 2025-10-02 12:19:13.826 2 DEBUG oslo_concurrency.lockutils [req-38deb2bd-dee3-4ff5-87a9-bb228ddf77fb req-9d89871b-8b42-46e8-bf41-90bf3878dbbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:13 np0005466012 nova_compute[192063]: 2025-10-02 12:19:13.827 2 DEBUG oslo_concurrency.lockutils [req-38deb2bd-dee3-4ff5-87a9-bb228ddf77fb req-9d89871b-8b42-46e8-bf41-90bf3878dbbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:13 np0005466012 nova_compute[192063]: 2025-10-02 12:19:13.827 2 DEBUG nova.compute.manager [req-38deb2bd-dee3-4ff5-87a9-bb228ddf77fb req-9d89871b-8b42-46e8-bf41-90bf3878dbbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] No waiting events found dispatching network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:13 np0005466012 nova_compute[192063]: 2025-10-02 12:19:13.827 2 WARNING nova.compute.manager [req-38deb2bd-dee3-4ff5-87a9-bb228ddf77fb req-9d89871b-8b42-46e8-bf41-90bf3878dbbc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received unexpected event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb for instance with vm_state resized and task_state deleting.#033[00m
Oct  2 08:19:15 np0005466012 nova_compute[192063]: 2025-10-02 12:19:15.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.303 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.304 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.304 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.304 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.305 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.319 2 INFO nova.compute.manager [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Terminating instance#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.330 2 DEBUG nova.compute.manager [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:19:16 np0005466012 kernel: tap4b6da309-2e (unregistering): left promiscuous mode
Oct  2 08:19:16 np0005466012 NetworkManager[51207]: <info>  [1759407556.3611] device (tap4b6da309-2e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:16 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:16Z|00352|binding|INFO|Releasing lport 4b6da309-2e2d-465d-91bd-9e0bae3250eb from this chassis (sb_readonly=0)
Oct  2 08:19:16 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:16Z|00353|binding|INFO|Setting lport 4b6da309-2e2d-465d-91bd-9e0bae3250eb down in Southbound
Oct  2 08:19:16 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:16Z|00354|binding|INFO|Removing iface tap4b6da309-2e ovn-installed in OVS
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.386 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:9c:dc 10.100.0.8'], port_security=['fa:16:3e:5d:9c:dc 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '95da5a4a-5301-4a2b-b135-01e08486477d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5db64e6714348c1a7f57bb53de80915', 'neutron:revision_number': '8', 'neutron:security_group_ids': '063f732a-6071-414f-814d-a5d6c4e9e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2011b0da-7062-465f-963e-59e92e88a653, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4b6da309-2e2d-465d-91bd-9e0bae3250eb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.388 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4b6da309-2e2d-465d-91bd-9e0bae3250eb in datapath b97b8849-844c-4190-8b13-fd7a2d073ce8 unbound from our chassis#033[00m
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.391 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b97b8849-844c-4190-8b13-fd7a2d073ce8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.392 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[50a4e11b-3ce9-4e67-99cc-e02101f5b87e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.394 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 namespace which is not needed anymore#033[00m
Oct  2 08:19:16 np0005466012 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Oct  2 08:19:16 np0005466012 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000005f.scope: Consumed 5.079s CPU time.
Oct  2 08:19:16 np0005466012 systemd-machined[152114]: Machine qemu-41-instance-0000005f terminated.
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:16 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233645]: [NOTICE]   (233649) : haproxy version is 2.8.14-c23fe91
Oct  2 08:19:16 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233645]: [NOTICE]   (233649) : path to executable is /usr/sbin/haproxy
Oct  2 08:19:16 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233645]: [WARNING]  (233649) : Exiting Master process...
Oct  2 08:19:16 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233645]: [ALERT]    (233649) : Current worker (233651) exited with code 143 (Terminated)
Oct  2 08:19:16 np0005466012 neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8[233645]: [WARNING]  (233649) : All workers exited. Exiting... (0)
Oct  2 08:19:16 np0005466012 systemd[1]: libpod-e9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3.scope: Deactivated successfully.
Oct  2 08:19:16 np0005466012 podman[233690]: 2025-10-02 12:19:16.547482969 +0000 UTC m=+0.052885570 container died e9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:19:16 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3-userdata-shm.mount: Deactivated successfully.
Oct  2 08:19:16 np0005466012 systemd[1]: var-lib-containers-storage-overlay-89e88695587a0827eeecd3fbc77c29f973a4a650b5887f56349c47769c84486e-merged.mount: Deactivated successfully.
Oct  2 08:19:16 np0005466012 podman[233690]: 2025-10-02 12:19:16.593312225 +0000 UTC m=+0.098714806 container cleanup e9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 08:19:16 np0005466012 systemd[1]: libpod-conmon-e9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3.scope: Deactivated successfully.
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.604 2 INFO nova.virt.libvirt.driver [-] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Instance destroyed successfully.#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.606 2 DEBUG nova.objects.instance [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lazy-loading 'resources' on Instance uuid 95da5a4a-5301-4a2b-b135-01e08486477d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.635 2 DEBUG nova.virt.libvirt.vif [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:18:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-2096752872',display_name='tempest-DeleteServersTestJSON-server-2096752872',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-2096752872',id=95,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:19:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d5db64e6714348c1a7f57bb53de80915',ramdisk_id='',reservation_id='r-frz7g55l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-548982240',owner_user_name='tempest-DeleteServersTestJSON-548982240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:19:12Z,user_data=None,user_id='0c0ba8ddde504431b51e593c63f40361',uuid=95da5a4a-5301-4a2b-b135-01e08486477d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.636 2 DEBUG nova.network.os_vif_util [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converting VIF {"id": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "address": "fa:16:3e:5d:9c:dc", "network": {"id": "b97b8849-844c-4190-8b13-fd7a2d073ce8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1299594383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5db64e6714348c1a7f57bb53de80915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b6da309-2e", "ovs_interfaceid": "4b6da309-2e2d-465d-91bd-9e0bae3250eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.636 2 DEBUG nova.network.os_vif_util [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.637 2 DEBUG os_vif [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.639 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b6da309-2e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.644 2 INFO os_vif [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9c:dc,bridge_name='br-int',has_traffic_filtering=True,id=4b6da309-2e2d-465d-91bd-9e0bae3250eb,network=Network(b97b8849-844c-4190-8b13-fd7a2d073ce8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b6da309-2e')#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.645 2 INFO nova.virt.libvirt.driver [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Deleting instance files /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d_del#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.652 2 INFO nova.virt.libvirt.driver [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Deletion of /var/lib/nova/instances/95da5a4a-5301-4a2b-b135-01e08486477d_del complete#033[00m
Oct  2 08:19:16 np0005466012 podman[233735]: 2025-10-02 12:19:16.657928431 +0000 UTC m=+0.038171377 container remove e9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.662 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cd3707cf-97e9-40ed-9522-2d0efe4f88e7]: (4, ('Thu Oct  2 12:19:16 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 (e9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3)\ne9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3\nThu Oct  2 12:19:16 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 (e9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3)\ne9673800ee8f4e2b54292377729aa3cae6732b4619789846f317d9b452b4caa3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.664 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8856b9e4-97dd-4458-b67c-8153d606a28f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.665 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97b8849-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:16 np0005466012 kernel: tapb97b8849-80: left promiscuous mode
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.681 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[22fd3137-34db-4193-89ce-c3bee6923a09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.686 2 DEBUG nova.compute.manager [req-6fb1d023-56a7-413f-97a8-be6cd7fef99c req-6f5aca65-ed43-412c-8730-e33e45a6f21a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-vif-unplugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.686 2 DEBUG oslo_concurrency.lockutils [req-6fb1d023-56a7-413f-97a8-be6cd7fef99c req-6f5aca65-ed43-412c-8730-e33e45a6f21a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.687 2 DEBUG oslo_concurrency.lockutils [req-6fb1d023-56a7-413f-97a8-be6cd7fef99c req-6f5aca65-ed43-412c-8730-e33e45a6f21a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.687 2 DEBUG oslo_concurrency.lockutils [req-6fb1d023-56a7-413f-97a8-be6cd7fef99c req-6f5aca65-ed43-412c-8730-e33e45a6f21a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.687 2 DEBUG nova.compute.manager [req-6fb1d023-56a7-413f-97a8-be6cd7fef99c req-6f5aca65-ed43-412c-8730-e33e45a6f21a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] No waiting events found dispatching network-vif-unplugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.688 2 WARNING nova.compute.manager [req-6fb1d023-56a7-413f-97a8-be6cd7fef99c req-6f5aca65-ed43-412c-8730-e33e45a6f21a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received unexpected event network-vif-unplugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb for instance with vm_state active and task_state None.#033[00m
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.709 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a3508cc0-e368-419c-8bfe-6b184f52b77b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.710 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[04dfc19f-0658-4dd1-9f7b-459a6f0057f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.727 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2120980a-c079-490e-a6f2-3fa55964c175]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554482, 'reachable_time': 17154, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233748, 'error': None, 'target': 'ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.729 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b97b8849-844c-4190-8b13-fd7a2d073ce8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:19:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:16.730 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[cf64a1ec-a516-45b7-8795-b8789381701b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:16 np0005466012 systemd[1]: run-netns-ovnmeta\x2db97b8849\x2d844c\x2d4190\x2d8b13\x2dfd7a2d073ce8.mount: Deactivated successfully.
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.777 2 INFO nova.compute.manager [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.779 2 DEBUG oslo.service.loopingcall [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.779 2 DEBUG nova.compute.manager [-] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:19:16 np0005466012 nova_compute[192063]: 2025-10-02 12:19:16.780 2 DEBUG nova.network.neutron [-] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:19:16.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:19:17 np0005466012 systemd[1]: Stopping User Manager for UID 42436...
Oct  2 08:19:17 np0005466012 systemd[233502]: Activating special unit Exit the Session...
Oct  2 08:19:17 np0005466012 systemd[233502]: Stopped target Main User Target.
Oct  2 08:19:17 np0005466012 systemd[233502]: Stopped target Basic System.
Oct  2 08:19:17 np0005466012 systemd[233502]: Stopped target Paths.
Oct  2 08:19:17 np0005466012 systemd[233502]: Stopped target Sockets.
Oct  2 08:19:17 np0005466012 systemd[233502]: Stopped target Timers.
Oct  2 08:19:17 np0005466012 systemd[233502]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:19:17 np0005466012 systemd[233502]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 08:19:17 np0005466012 systemd[233502]: Closed D-Bus User Message Bus Socket.
Oct  2 08:19:17 np0005466012 systemd[233502]: Stopped Create User's Volatile Files and Directories.
Oct  2 08:19:17 np0005466012 systemd[233502]: Removed slice User Application Slice.
Oct  2 08:19:17 np0005466012 systemd[233502]: Reached target Shutdown.
Oct  2 08:19:17 np0005466012 systemd[233502]: Finished Exit the Session.
Oct  2 08:19:17 np0005466012 systemd[233502]: Reached target Exit the Session.
Oct  2 08:19:17 np0005466012 systemd[1]: user@42436.service: Deactivated successfully.
Oct  2 08:19:17 np0005466012 systemd[1]: Stopped User Manager for UID 42436.
Oct  2 08:19:17 np0005466012 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  2 08:19:17 np0005466012 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  2 08:19:17 np0005466012 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  2 08:19:17 np0005466012 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  2 08:19:17 np0005466012 systemd[1]: Removed slice User Slice of UID 42436.
Oct  2 08:19:17 np0005466012 nova_compute[192063]: 2025-10-02 12:19:17.756 2 DEBUG nova.network.neutron [-] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:19:17 np0005466012 nova_compute[192063]: 2025-10-02 12:19:17.774 2 INFO nova.compute.manager [-] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Took 0.99 seconds to deallocate network for instance.#033[00m
Oct  2 08:19:17 np0005466012 nova_compute[192063]: 2025-10-02 12:19:17.841 2 DEBUG nova.compute.manager [req-4638d414-0fa4-4e06-ae0e-8562c5da94f4 req-b3e8fbe3-d2e8-46d2-b601-221f10d93684 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-vif-deleted-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:17 np0005466012 nova_compute[192063]: 2025-10-02 12:19:17.880 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:17 np0005466012 nova_compute[192063]: 2025-10-02 12:19:17.881 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:17 np0005466012 nova_compute[192063]: 2025-10-02 12:19:17.887 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:17 np0005466012 nova_compute[192063]: 2025-10-02 12:19:17.924 2 INFO nova.scheduler.client.report [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Deleted allocations for instance 95da5a4a-5301-4a2b-b135-01e08486477d#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.031 2 DEBUG oslo_concurrency.lockutils [None req-8df60e1e-56c2-4e33-8938-21344a5278c8 0c0ba8ddde504431b51e593c63f40361 d5db64e6714348c1a7f57bb53de80915 - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.727s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:18 np0005466012 podman[233752]: 2025-10-02 12:19:18.143346336 +0000 UTC m=+0.055817985 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:19:18 np0005466012 podman[233753]: 2025-10-02 12:19:18.192492417 +0000 UTC m=+0.099472268 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller)
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.490 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquiring lock "8e445940-a288-443c-868f-ae4f71577933" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.491 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.509 2 DEBUG nova.compute.manager [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.591 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.592 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.601 2 DEBUG nova.virt.hardware [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.601 2 INFO nova.compute.claims [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.727 2 DEBUG nova.compute.provider_tree [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.740 2 DEBUG nova.scheduler.client.report [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.758 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.759 2 DEBUG nova.compute.manager [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.763 2 DEBUG nova.compute.manager [req-8fdfc4c6-52ea-4146-a6a5-56a8d41e8c3c req-8b7ab352-2b9d-4b7e-9209-b0707d316a13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.763 2 DEBUG oslo_concurrency.lockutils [req-8fdfc4c6-52ea-4146-a6a5-56a8d41e8c3c req-8b7ab352-2b9d-4b7e-9209-b0707d316a13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.764 2 DEBUG oslo_concurrency.lockutils [req-8fdfc4c6-52ea-4146-a6a5-56a8d41e8c3c req-8b7ab352-2b9d-4b7e-9209-b0707d316a13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.764 2 DEBUG oslo_concurrency.lockutils [req-8fdfc4c6-52ea-4146-a6a5-56a8d41e8c3c req-8b7ab352-2b9d-4b7e-9209-b0707d316a13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "95da5a4a-5301-4a2b-b135-01e08486477d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.764 2 DEBUG nova.compute.manager [req-8fdfc4c6-52ea-4146-a6a5-56a8d41e8c3c req-8b7ab352-2b9d-4b7e-9209-b0707d316a13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] No waiting events found dispatching network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.764 2 WARNING nova.compute.manager [req-8fdfc4c6-52ea-4146-a6a5-56a8d41e8c3c req-8b7ab352-2b9d-4b7e-9209-b0707d316a13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Received unexpected event network-vif-plugged-4b6da309-2e2d-465d-91bd-9e0bae3250eb for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.804 2 DEBUG nova.compute.manager [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.805 2 DEBUG nova.network.neutron [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.831 2 INFO nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.850 2 DEBUG nova.compute.manager [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.972 2 DEBUG nova.compute.manager [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.973 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.973 2 INFO nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Creating image(s)#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.974 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquiring lock "/var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.974 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "/var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.974 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "/var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:18 np0005466012 nova_compute[192063]: 2025-10-02 12:19:18.987 2 DEBUG oslo_concurrency.processutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.044 2 DEBUG oslo_concurrency.processutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.045 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.045 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.059 2 DEBUG oslo_concurrency.processutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.149 2 DEBUG oslo_concurrency.processutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.150 2 DEBUG oslo_concurrency.processutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.191 2 DEBUG oslo_concurrency.processutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.192 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.193 2 DEBUG oslo_concurrency.processutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.265 2 DEBUG oslo_concurrency.processutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.266 2 DEBUG nova.virt.disk.api [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Checking if we can resize image /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.266 2 DEBUG oslo_concurrency.processutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.289 2 DEBUG nova.policy [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6d5b74212e6414eaaf46792bfc0310b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '11891f3ac4634e07b72041e075ad5323', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.332 2 DEBUG oslo_concurrency.processutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.333 2 DEBUG nova.virt.disk.api [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Cannot resize image /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.334 2 DEBUG nova.objects.instance [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lazy-loading 'migration_context' on Instance uuid 8e445940-a288-443c-868f-ae4f71577933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.353 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.353 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Ensure instance console log exists: /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.354 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.354 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:19 np0005466012 nova_compute[192063]: 2025-10-02 12:19:19.355 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:20 np0005466012 nova_compute[192063]: 2025-10-02 12:19:20.588 2 DEBUG nova.network.neutron [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Successfully created port: 28b49f60-75ed-4b04-86cf-6b06f398c145 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:19:20 np0005466012 nova_compute[192063]: 2025-10-02 12:19:20.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:21 np0005466012 podman[233818]: 2025-10-02 12:19:21.134548778 +0000 UTC m=+0.049163913 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:19:21 np0005466012 nova_compute[192063]: 2025-10-02 12:19:21.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:21 np0005466012 nova_compute[192063]: 2025-10-02 12:19:21.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:22 np0005466012 nova_compute[192063]: 2025-10-02 12:19:22.349 2 DEBUG nova.network.neutron [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Successfully updated port: 28b49f60-75ed-4b04-86cf-6b06f398c145 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:19:22 np0005466012 nova_compute[192063]: 2025-10-02 12:19:22.375 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquiring lock "refresh_cache-8e445940-a288-443c-868f-ae4f71577933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:22 np0005466012 nova_compute[192063]: 2025-10-02 12:19:22.375 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquired lock "refresh_cache-8e445940-a288-443c-868f-ae4f71577933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:19:22 np0005466012 nova_compute[192063]: 2025-10-02 12:19:22.375 2 DEBUG nova.network.neutron [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:19:22 np0005466012 nova_compute[192063]: 2025-10-02 12:19:22.444 2 DEBUG nova.compute.manager [req-4a8b8e2a-d33d-439f-9c0c-0ca600fe9bb7 req-06ab1dda-ece3-4024-ba1c-7657abbcedbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received event network-changed-28b49f60-75ed-4b04-86cf-6b06f398c145 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:22 np0005466012 nova_compute[192063]: 2025-10-02 12:19:22.445 2 DEBUG nova.compute.manager [req-4a8b8e2a-d33d-439f-9c0c-0ca600fe9bb7 req-06ab1dda-ece3-4024-ba1c-7657abbcedbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Refreshing instance network info cache due to event network-changed-28b49f60-75ed-4b04-86cf-6b06f398c145. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:19:22 np0005466012 nova_compute[192063]: 2025-10-02 12:19:22.445 2 DEBUG oslo_concurrency.lockutils [req-4a8b8e2a-d33d-439f-9c0c-0ca600fe9bb7 req-06ab1dda-ece3-4024-ba1c-7657abbcedbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-8e445940-a288-443c-868f-ae4f71577933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:22 np0005466012 nova_compute[192063]: 2025-10-02 12:19:22.561 2 DEBUG nova.network.neutron [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:19:23 np0005466012 nova_compute[192063]: 2025-10-02 12:19:23.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:23 np0005466012 nova_compute[192063]: 2025-10-02 12:19:23.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.068 2 DEBUG nova.network.neutron [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Updating instance_info_cache with network_info: [{"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.103 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Releasing lock "refresh_cache-8e445940-a288-443c-868f-ae4f71577933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.104 2 DEBUG nova.compute.manager [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Instance network_info: |[{"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.104 2 DEBUG oslo_concurrency.lockutils [req-4a8b8e2a-d33d-439f-9c0c-0ca600fe9bb7 req-06ab1dda-ece3-4024-ba1c-7657abbcedbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-8e445940-a288-443c-868f-ae4f71577933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.105 2 DEBUG nova.network.neutron [req-4a8b8e2a-d33d-439f-9c0c-0ca600fe9bb7 req-06ab1dda-ece3-4024-ba1c-7657abbcedbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Refreshing network info cache for port 28b49f60-75ed-4b04-86cf-6b06f398c145 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.108 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Start _get_guest_xml network_info=[{"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.112 2 WARNING nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.117 2 DEBUG nova.virt.libvirt.host [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.118 2 DEBUG nova.virt.libvirt.host [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.130 2 DEBUG nova.virt.libvirt.host [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.131 2 DEBUG nova.virt.libvirt.host [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.132 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.133 2 DEBUG nova.virt.hardware [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.133 2 DEBUG nova.virt.hardware [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.134 2 DEBUG nova.virt.hardware [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.134 2 DEBUG nova.virt.hardware [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.134 2 DEBUG nova.virt.hardware [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.134 2 DEBUG nova.virt.hardware [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.135 2 DEBUG nova.virt.hardware [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.135 2 DEBUG nova.virt.hardware [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.135 2 DEBUG nova.virt.hardware [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.136 2 DEBUG nova.virt.hardware [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.136 2 DEBUG nova.virt.hardware [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.140 2 DEBUG nova.virt.libvirt.vif [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:19:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1806936658',display_name='tempest-InstanceActionsTestJSON-server-1806936658',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1806936658',id=97,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='11891f3ac4634e07b72041e075ad5323',ramdisk_id='',reservation_id='r-4l83lw6w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsTestJSON-816248192',owner_user_name='tempest-InstanceActionsTe
stJSON-816248192-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:19:18Z,user_data=None,user_id='b6d5b74212e6414eaaf46792bfc0310b',uuid=8e445940-a288-443c-868f-ae4f71577933,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.140 2 DEBUG nova.network.os_vif_util [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Converting VIF {"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.141 2 DEBUG nova.network.os_vif_util [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.142 2 DEBUG nova.objects.instance [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8e445940-a288-443c-868f-ae4f71577933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.157 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  <uuid>8e445940-a288-443c-868f-ae4f71577933</uuid>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  <name>instance-00000061</name>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <nova:name>tempest-InstanceActionsTestJSON-server-1806936658</nova:name>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:19:24</nova:creationTime>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:19:24 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:        <nova:user uuid="b6d5b74212e6414eaaf46792bfc0310b">tempest-InstanceActionsTestJSON-816248192-project-member</nova:user>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:        <nova:project uuid="11891f3ac4634e07b72041e075ad5323">tempest-InstanceActionsTestJSON-816248192</nova:project>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:        <nova:port uuid="28b49f60-75ed-4b04-86cf-6b06f398c145">
Oct  2 08:19:24 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <entry name="serial">8e445940-a288-443c-868f-ae4f71577933</entry>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <entry name="uuid">8e445940-a288-443c-868f-ae4f71577933</entry>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk.config"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:65:ca:27"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <target dev="tap28b49f60-75"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/console.log" append="off"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:19:24 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:19:24 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:19:24 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:19:24 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.158 2 DEBUG nova.compute.manager [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Preparing to wait for external event network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.159 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquiring lock "8e445940-a288-443c-868f-ae4f71577933-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.159 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.159 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.160 2 DEBUG nova.virt.libvirt.vif [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:19:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1806936658',display_name='tempest-InstanceActionsTestJSON-server-1806936658',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1806936658',id=97,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='11891f3ac4634e07b72041e075ad5323',ramdisk_id='',reservation_id='r-4l83lw6w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsTestJSON-816248192',owner_user_name='tempest-Instanc
eActionsTestJSON-816248192-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:19:18Z,user_data=None,user_id='b6d5b74212e6414eaaf46792bfc0310b',uuid=8e445940-a288-443c-868f-ae4f71577933,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.160 2 DEBUG nova.network.os_vif_util [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Converting VIF {"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.160 2 DEBUG nova.network.os_vif_util [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.160 2 DEBUG os_vif [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.161 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.161 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.165 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28b49f60-75, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.165 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap28b49f60-75, col_values=(('external_ids', {'iface-id': '28b49f60-75ed-4b04-86cf-6b06f398c145', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:ca:27', 'vm-uuid': '8e445940-a288-443c-868f-ae4f71577933'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:24 np0005466012 NetworkManager[51207]: <info>  [1759407564.1682] manager: (tap28b49f60-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/159)
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.173 2 INFO os_vif [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75')#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.241 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.242 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.242 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] No VIF found with MAC fa:16:3e:65:ca:27, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.243 2 INFO nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Using config drive#033[00m
Oct  2 08:19:24 np0005466012 podman[233844]: 2025-10-02 12:19:24.257453495 +0000 UTC m=+0.057025058 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:24 np0005466012 nova_compute[192063]: 2025-10-02 12:19:24.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.299 2 INFO nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Creating config drive at /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk.config#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.304 2 DEBUG oslo_concurrency.processutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuuxo1af9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.431 2 DEBUG oslo_concurrency.processutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuuxo1af9" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:25 np0005466012 NetworkManager[51207]: <info>  [1759407565.4870] manager: (tap28b49f60-75): new Tun device (/org/freedesktop/NetworkManager/Devices/160)
Oct  2 08:19:25 np0005466012 kernel: tap28b49f60-75: entered promiscuous mode
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:25Z|00355|binding|INFO|Claiming lport 28b49f60-75ed-4b04-86cf-6b06f398c145 for this chassis.
Oct  2 08:19:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:25Z|00356|binding|INFO|28b49f60-75ed-4b04-86cf-6b06f398c145: Claiming fa:16:3e:65:ca:27 10.100.0.3
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.502 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:ca:27 10.100.0.3'], port_security=['fa:16:3e:65:ca:27 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8e445940-a288-443c-868f-ae4f71577933', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f19f283e-171b-4ff9-a708-d3cae938c528', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '11891f3ac4634e07b72041e075ad5323', 'neutron:revision_number': '2', 'neutron:security_group_ids': '79e15997-ca89-439a-aee9-254a6fea676b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c00db673-c1c0-41ea-b853-0f498885fd63, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=28b49f60-75ed-4b04-86cf-6b06f398c145) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.503 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 28b49f60-75ed-4b04-86cf-6b06f398c145 in datapath f19f283e-171b-4ff9-a708-d3cae938c528 bound to our chassis#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.505 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f19f283e-171b-4ff9-a708-d3cae938c528#033[00m
Oct  2 08:19:25 np0005466012 systemd-udevd[233883]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.521 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[053452bf-f698-4bbc-a92c-665887eaaad6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.524 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf19f283e-11 in ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.526 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf19f283e-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.526 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4889617f-2085-404c-b36b-4ce9f72f1757]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.527 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[899c311e-0657-45b1-aaaa-490a9356fc8f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 systemd-machined[152114]: New machine qemu-42-instance-00000061.
Oct  2 08:19:25 np0005466012 NetworkManager[51207]: <info>  [1759407565.5315] device (tap28b49f60-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:19:25 np0005466012 NetworkManager[51207]: <info>  [1759407565.5329] device (tap28b49f60-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:19:25 np0005466012 systemd[1]: Started Virtual Machine qemu-42-instance-00000061.
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.543 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[394a1970-72c4-4dff-8cba-ba21fcdf0d44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:25Z|00357|binding|INFO|Setting lport 28b49f60-75ed-4b04-86cf-6b06f398c145 ovn-installed in OVS
Oct  2 08:19:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:25Z|00358|binding|INFO|Setting lport 28b49f60-75ed-4b04-86cf-6b06f398c145 up in Southbound
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.570 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a496b2e2-acd1-4c86-a868-b9e02022b552]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.599 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[8a16f487-7f2d-4a9d-abc0-1418e01038be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.605 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4f6430bc-f554-4542-a51a-8e8ad79979ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 NetworkManager[51207]: <info>  [1759407565.6071] manager: (tapf19f283e-10): new Veth device (/org/freedesktop/NetworkManager/Devices/161)
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.643 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[7bc42051-88fb-4dae-bb31-5f7a8d4bdcc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.648 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[2c3f5aef-1da5-4608-8d76-b0791c59150a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 NetworkManager[51207]: <info>  [1759407565.6745] device (tapf19f283e-10): carrier: link connected
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.680 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[d156eb9f-bc1e-4f38-9c36-50b6a9d0c26b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.696 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4d41deb5-ef22-4474-85aa-3ec774c2dc83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf19f283e-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:f0:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 555928, 'reachable_time': 21824, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233916, 'error': None, 'target': 'ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.713 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[394c7649-5b1d-43cc-8b8a-fd5b150152b3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec0:f006'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 555928, 'tstamp': 555928}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233917, 'error': None, 'target': 'ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.731 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[21bd8914-2630-4bd4-aaef-1d1a4b3c891b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf19f283e-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:f0:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 555928, 'reachable_time': 21824, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233919, 'error': None, 'target': 'ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.762 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e25822ab-88ff-4366-8e6d-a6e81cb88ce3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.822 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5f40456d-32ad-40b2-93bf-ec5cfa6c2b9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.824 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf19f283e-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.824 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.825 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf19f283e-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:25 np0005466012 kernel: tapf19f283e-10: entered promiscuous mode
Oct  2 08:19:25 np0005466012 NetworkManager[51207]: <info>  [1759407565.8276] manager: (tapf19f283e-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.831 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf19f283e-10, col_values=(('external_ids', {'iface-id': '37cef8ba-697a-47b6-88bc-229da39d06cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:25Z|00359|binding|INFO|Releasing lport 37cef8ba-697a-47b6-88bc-229da39d06cc from this chassis (sb_readonly=0)
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.834 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f19f283e-171b-4ff9-a708-d3cae938c528.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f19f283e-171b-4ff9-a708-d3cae938c528.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.834 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fbcc2379-a545-4e05-9b32-38ad6f1eb0fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.837 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-f19f283e-171b-4ff9-a708-d3cae938c528
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/f19f283e-171b-4ff9-a708-d3cae938c528.pid.haproxy
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID f19f283e-171b-4ff9-a708-d3cae938c528
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:19:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:25.838 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528', 'env', 'PROCESS_TAG=haproxy-f19f283e-171b-4ff9-a708-d3cae938c528', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f19f283e-171b-4ff9-a708-d3cae938c528.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.840 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.841 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.841 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.841 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.926 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.927 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.927 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:25 np0005466012 nova_compute[192063]: 2025-10-02 12:19:25.927 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.008 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.037 2 DEBUG nova.network.neutron [req-4a8b8e2a-d33d-439f-9c0c-0ca600fe9bb7 req-06ab1dda-ece3-4024-ba1c-7657abbcedbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Updated VIF entry in instance network info cache for port 28b49f60-75ed-4b04-86cf-6b06f398c145. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.038 2 DEBUG nova.network.neutron [req-4a8b8e2a-d33d-439f-9c0c-0ca600fe9bb7 req-06ab1dda-ece3-4024-ba1c-7657abbcedbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Updating instance_info_cache with network_info: [{"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.056 2 DEBUG oslo_concurrency.lockutils [req-4a8b8e2a-d33d-439f-9c0c-0ca600fe9bb7 req-06ab1dda-ece3-4024-ba1c-7657abbcedbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-8e445940-a288-443c-868f-ae4f71577933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.112 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.114 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.181 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:26 np0005466012 podman[233962]: 2025-10-02 12:19:26.189241831 +0000 UTC m=+0.022058525 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:19:26 np0005466012 podman[233962]: 2025-10-02 12:19:26.292794995 +0000 UTC m=+0.125611669 container create 40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.302 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407566.3016515, 8e445940-a288-443c-868f-ae4f71577933 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.303 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] VM Started (Lifecycle Event)#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.330 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.332 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407566.304938, 8e445940-a288-443c-868f-ae4f71577933 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.332 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:19:26 np0005466012 systemd[1]: Started libpod-conmon-40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2.scope.
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.349 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.351 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.354 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.355 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5655MB free_disk=73.38704299926758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.355 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.355 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:26 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:19:26 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bab4e90b6da7af8edafc25cd0f5bda5c0acdad08a8f2025df2aea924a3d727aa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.389 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:19:26 np0005466012 podman[233962]: 2025-10-02 12:19:26.400061306 +0000 UTC m=+0.232878000 container init 40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:19:26 np0005466012 podman[233962]: 2025-10-02 12:19:26.40580406 +0000 UTC m=+0.238620734 container start 40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:19:26 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[233980]: [NOTICE]   (233984) : New worker (233986) forked
Oct  2 08:19:26 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[233980]: [NOTICE]   (233984) : Loading success.
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.449 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 8e445940-a288-443c-868f-ae4f71577933 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.449 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.449 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.506 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.522 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.544 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:19:26 np0005466012 nova_compute[192063]: 2025-10-02 12:19:26.544 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:27 np0005466012 nova_compute[192063]: 2025-10-02 12:19:27.527 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.140 2 DEBUG nova.compute.manager [req-ed225559-e8b2-4070-ab2a-9b49a099666f req-be75f431-720c-48f2-8773-190e17b3a8f8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received event network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.140 2 DEBUG oslo_concurrency.lockutils [req-ed225559-e8b2-4070-ab2a-9b49a099666f req-be75f431-720c-48f2-8773-190e17b3a8f8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "8e445940-a288-443c-868f-ae4f71577933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.141 2 DEBUG oslo_concurrency.lockutils [req-ed225559-e8b2-4070-ab2a-9b49a099666f req-be75f431-720c-48f2-8773-190e17b3a8f8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.141 2 DEBUG oslo_concurrency.lockutils [req-ed225559-e8b2-4070-ab2a-9b49a099666f req-be75f431-720c-48f2-8773-190e17b3a8f8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.142 2 DEBUG nova.compute.manager [req-ed225559-e8b2-4070-ab2a-9b49a099666f req-be75f431-720c-48f2-8773-190e17b3a8f8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Processing event network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.142 2 DEBUG nova.compute.manager [req-ed225559-e8b2-4070-ab2a-9b49a099666f req-be75f431-720c-48f2-8773-190e17b3a8f8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received event network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.142 2 DEBUG oslo_concurrency.lockutils [req-ed225559-e8b2-4070-ab2a-9b49a099666f req-be75f431-720c-48f2-8773-190e17b3a8f8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "8e445940-a288-443c-868f-ae4f71577933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.143 2 DEBUG oslo_concurrency.lockutils [req-ed225559-e8b2-4070-ab2a-9b49a099666f req-be75f431-720c-48f2-8773-190e17b3a8f8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.143 2 DEBUG oslo_concurrency.lockutils [req-ed225559-e8b2-4070-ab2a-9b49a099666f req-be75f431-720c-48f2-8773-190e17b3a8f8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.143 2 DEBUG nova.compute.manager [req-ed225559-e8b2-4070-ab2a-9b49a099666f req-be75f431-720c-48f2-8773-190e17b3a8f8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] No waiting events found dispatching network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.144 2 WARNING nova.compute.manager [req-ed225559-e8b2-4070-ab2a-9b49a099666f req-be75f431-720c-48f2-8773-190e17b3a8f8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received unexpected event network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.145 2 DEBUG nova.compute.manager [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.149 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407568.1496572, 8e445940-a288-443c-868f-ae4f71577933 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.150 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.152 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.156 2 INFO nova.virt.libvirt.driver [-] [instance: 8e445940-a288-443c-868f-ae4f71577933] Instance spawned successfully.#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.156 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.170 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.176 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.181 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.181 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.182 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.183 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.184 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.184 2 DEBUG nova.virt.libvirt.driver [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.215 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.255 2 INFO nova.compute.manager [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Took 9.28 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.256 2 DEBUG nova.compute.manager [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.351 2 INFO nova.compute.manager [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Took 9.78 seconds to build instance.#033[00m
Oct  2 08:19:28 np0005466012 nova_compute[192063]: 2025-10-02 12:19:28.376 2 DEBUG oslo_concurrency.lockutils [None req-177c13f4-e7e2-4add-84bb-96595b5fccf1 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.886s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:29 np0005466012 nova_compute[192063]: 2025-10-02 12:19:29.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:29 np0005466012 nova_compute[192063]: 2025-10-02 12:19:29.433 2 DEBUG oslo_concurrency.lockutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquiring lock "8e445940-a288-443c-868f-ae4f71577933" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:29 np0005466012 nova_compute[192063]: 2025-10-02 12:19:29.434 2 DEBUG oslo_concurrency.lockutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:29 np0005466012 nova_compute[192063]: 2025-10-02 12:19:29.434 2 INFO nova.compute.manager [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Rebooting instance#033[00m
Oct  2 08:19:29 np0005466012 nova_compute[192063]: 2025-10-02 12:19:29.450 2 DEBUG oslo_concurrency.lockutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquiring lock "refresh_cache-8e445940-a288-443c-868f-ae4f71577933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:19:29 np0005466012 nova_compute[192063]: 2025-10-02 12:19:29.450 2 DEBUG oslo_concurrency.lockutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquired lock "refresh_cache-8e445940-a288-443c-868f-ae4f71577933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:19:29 np0005466012 nova_compute[192063]: 2025-10-02 12:19:29.450 2 DEBUG nova.network.neutron [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:19:30 np0005466012 podman[233996]: 2025-10-02 12:19:30.151503035 +0000 UTC m=+0.059065288 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd)
Oct  2 08:19:30 np0005466012 podman[233997]: 2025-10-02 12:19:30.164394625 +0000 UTC m=+0.066751488 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, version=9.6, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.397 2 DEBUG nova.network.neutron [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Updating instance_info_cache with network_info: [{"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.422 2 DEBUG oslo_concurrency.lockutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Releasing lock "refresh_cache-8e445940-a288-443c-868f-ae4f71577933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.435 2 DEBUG nova.compute.manager [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:31 np0005466012 kernel: tap28b49f60-75 (unregistering): left promiscuous mode
Oct  2 08:19:31 np0005466012 NetworkManager[51207]: <info>  [1759407571.6053] device (tap28b49f60-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.606 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407556.602408, 95da5a4a-5301-4a2b-b135-01e08486477d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.606 2 INFO nova.compute.manager [-] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:31Z|00360|binding|INFO|Releasing lport 28b49f60-75ed-4b04-86cf-6b06f398c145 from this chassis (sb_readonly=0)
Oct  2 08:19:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:31Z|00361|binding|INFO|Setting lport 28b49f60-75ed-4b04-86cf-6b06f398c145 down in Southbound
Oct  2 08:19:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:31Z|00362|binding|INFO|Removing iface tap28b49f60-75 ovn-installed in OVS
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.630 2 DEBUG nova.compute.manager [None req-bf585f12-798b-4b2d-aca9-525dc73a9c6f - - - - - -] [instance: 95da5a4a-5301-4a2b-b135-01e08486477d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.630 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:ca:27 10.100.0.3'], port_security=['fa:16:3e:65:ca:27 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8e445940-a288-443c-868f-ae4f71577933', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f19f283e-171b-4ff9-a708-d3cae938c528', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '11891f3ac4634e07b72041e075ad5323', 'neutron:revision_number': '4', 'neutron:security_group_ids': '79e15997-ca89-439a-aee9-254a6fea676b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c00db673-c1c0-41ea-b853-0f498885fd63, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=28b49f60-75ed-4b04-86cf-6b06f398c145) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.632 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 28b49f60-75ed-4b04-86cf-6b06f398c145 in datapath f19f283e-171b-4ff9-a708-d3cae938c528 unbound from our chassis#033[00m
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.633 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f19f283e-171b-4ff9-a708-d3cae938c528, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.634 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1fbaab8d-f14f-4a2d-beea-4d7365f71f1c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.635 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528 namespace which is not needed anymore#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:31 np0005466012 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000061.scope: Deactivated successfully.
Oct  2 08:19:31 np0005466012 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000061.scope: Consumed 4.131s CPU time.
Oct  2 08:19:31 np0005466012 systemd-machined[152114]: Machine qemu-42-instance-00000061 terminated.
Oct  2 08:19:31 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[233980]: [NOTICE]   (233984) : haproxy version is 2.8.14-c23fe91
Oct  2 08:19:31 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[233980]: [NOTICE]   (233984) : path to executable is /usr/sbin/haproxy
Oct  2 08:19:31 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[233980]: [WARNING]  (233984) : Exiting Master process...
Oct  2 08:19:31 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[233980]: [ALERT]    (233984) : Current worker (233986) exited with code 143 (Terminated)
Oct  2 08:19:31 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[233980]: [WARNING]  (233984) : All workers exited. Exiting... (0)
Oct  2 08:19:31 np0005466012 systemd[1]: libpod-40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2.scope: Deactivated successfully.
Oct  2 08:19:31 np0005466012 conmon[233980]: conmon 40d460b6ac95c346cb91 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2.scope/container/memory.events
Oct  2 08:19:31 np0005466012 podman[234058]: 2025-10-02 12:19:31.75254309 +0000 UTC m=+0.039800545 container died 40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:19:31 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2-userdata-shm.mount: Deactivated successfully.
Oct  2 08:19:31 np0005466012 systemd[1]: var-lib-containers-storage-overlay-bab4e90b6da7af8edafc25cd0f5bda5c0acdad08a8f2025df2aea924a3d727aa-merged.mount: Deactivated successfully.
Oct  2 08:19:31 np0005466012 podman[234058]: 2025-10-02 12:19:31.785421184 +0000 UTC m=+0.072678639 container cleanup 40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:19:31 np0005466012 systemd[1]: libpod-conmon-40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2.scope: Deactivated successfully.
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.851 2 INFO nova.virt.libvirt.driver [-] [instance: 8e445940-a288-443c-868f-ae4f71577933] Instance destroyed successfully.#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.852 2 DEBUG nova.objects.instance [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lazy-loading 'resources' on Instance uuid 8e445940-a288-443c-868f-ae4f71577933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:31 np0005466012 podman[234091]: 2025-10-02 12:19:31.866339138 +0000 UTC m=+0.046415503 container remove 40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.870 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7cf43d43-b41c-4b39-8deb-c05977ba9bbf]: (4, ('Thu Oct  2 12:19:31 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528 (40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2)\n40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2\nThu Oct  2 12:19:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528 (40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2)\n40d460b6ac95c346cb911abcff86b4eb97be79e8c85837fafa46c45707df40b2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.872 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[069f18f3-032d-470a-b12e-7c001c7a7853]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.873 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf19f283e-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:31 np0005466012 kernel: tapf19f283e-10: left promiscuous mode
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.895 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca77cbf-86c4-45c9-89b6-3f0e9b62c747]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.919 2 DEBUG nova.virt.libvirt.vif [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:19:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1806936658',display_name='tempest-InstanceActionsTestJSON-server-1806936658',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1806936658',id=97,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:19:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='11891f3ac4634e07b72041e075ad5323',ramdisk_id='',reservation_id='r-4l83lw6w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-816248192',owner_user_name='tempest-InstanceActionsTestJSON-816248192-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:19:31Z,user_data=None,user_id='b6d5b74212e6414eaaf46792bfc0310b',uuid=8e445940-a288-443c-868f-ae4f71577933,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.919 2 DEBUG nova.network.os_vif_util [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Converting VIF {"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.920 2 DEBUG nova.network.os_vif_util [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.920 2 DEBUG os_vif [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.922 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28b49f60-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.927 2 INFO os_vif [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75')#033[00m
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.930 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d818e4fa-fb0b-4f42-831c-033f70d97391]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.931 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6283ee8f-410b-4139-b64b-b2fc54b7c091]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.934 2 DEBUG nova.virt.libvirt.driver [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Start _get_guest_xml network_info=[{"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.937 2 DEBUG nova.compute.manager [req-63c8c553-dda1-4176-afde-b8af03706fb4 req-7718cbef-36a1-4daa-b1d4-09a699f2cf95 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received event network-vif-unplugged-28b49f60-75ed-4b04-86cf-6b06f398c145 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.937 2 DEBUG oslo_concurrency.lockutils [req-63c8c553-dda1-4176-afde-b8af03706fb4 req-7718cbef-36a1-4daa-b1d4-09a699f2cf95 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "8e445940-a288-443c-868f-ae4f71577933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.937 2 DEBUG oslo_concurrency.lockutils [req-63c8c553-dda1-4176-afde-b8af03706fb4 req-7718cbef-36a1-4daa-b1d4-09a699f2cf95 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.937 2 DEBUG oslo_concurrency.lockutils [req-63c8c553-dda1-4176-afde-b8af03706fb4 req-7718cbef-36a1-4daa-b1d4-09a699f2cf95 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.938 2 DEBUG nova.compute.manager [req-63c8c553-dda1-4176-afde-b8af03706fb4 req-7718cbef-36a1-4daa-b1d4-09a699f2cf95 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] No waiting events found dispatching network-vif-unplugged-28b49f60-75ed-4b04-86cf-6b06f398c145 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.938 2 WARNING nova.compute.manager [req-63c8c553-dda1-4176-afde-b8af03706fb4 req-7718cbef-36a1-4daa-b1d4-09a699f2cf95 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received unexpected event network-vif-unplugged-28b49f60-75ed-4b04-86cf-6b06f398c145 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.941 2 WARNING nova.virt.libvirt.driver [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.945 2 DEBUG nova.virt.libvirt.host [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.945 2 DEBUG nova.virt.libvirt.host [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.946 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a688eba2-cf56-49ec-82d8-a54d5b7c06de]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 555920, 'reachable_time': 24407, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234116, 'error': None, 'target': 'ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.948 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:19:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:31.949 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[715607e9-7f4a-414e-a3ce-8104692d1b9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:31 np0005466012 systemd[1]: run-netns-ovnmeta\x2df19f283e\x2d171b\x2d4ff9\x2da708\x2dd3cae938c528.mount: Deactivated successfully.
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.950 2 DEBUG nova.virt.libvirt.host [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.951 2 DEBUG nova.virt.libvirt.host [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.952 2 DEBUG nova.virt.libvirt.driver [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.952 2 DEBUG nova.virt.hardware [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.953 2 DEBUG nova.virt.hardware [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.953 2 DEBUG nova.virt.hardware [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.953 2 DEBUG nova.virt.hardware [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.953 2 DEBUG nova.virt.hardware [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.953 2 DEBUG nova.virt.hardware [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.954 2 DEBUG nova.virt.hardware [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.954 2 DEBUG nova.virt.hardware [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.954 2 DEBUG nova.virt.hardware [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.955 2 DEBUG nova.virt.hardware [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.955 2 DEBUG nova.virt.hardware [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.955 2 DEBUG nova.objects.instance [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8e445940-a288-443c-868f-ae4f71577933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:31 np0005466012 nova_compute[192063]: 2025-10-02 12:19:31.970 2 DEBUG oslo_concurrency.processutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.032 2 DEBUG oslo_concurrency.processutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk.config --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.034 2 DEBUG oslo_concurrency.lockutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquiring lock "/var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.034 2 DEBUG oslo_concurrency.lockutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "/var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.035 2 DEBUG oslo_concurrency.lockutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "/var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.036 2 DEBUG nova.virt.libvirt.vif [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:19:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1806936658',display_name='tempest-InstanceActionsTestJSON-server-1806936658',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1806936658',id=97,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:19:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='11891f3ac4634e07b72041e075ad5323',ramdisk_id='',reservation_id='r-4l83lw6w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-816248192',owner_user_name='tempest-InstanceActionsTestJSON-816248192-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:19:31Z,user_data=None,user_id='b6d5b74212e6414eaaf46792bfc0310b',uuid=8e445940-a288-443c-868f-ae4f71577933,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.037 2 DEBUG nova.network.os_vif_util [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Converting VIF {"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.038 2 DEBUG nova.network.os_vif_util [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.039 2 DEBUG nova.objects.instance [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8e445940-a288-443c-868f-ae4f71577933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.061 2 DEBUG nova.virt.libvirt.driver [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  <uuid>8e445940-a288-443c-868f-ae4f71577933</uuid>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  <name>instance-00000061</name>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <nova:name>tempest-InstanceActionsTestJSON-server-1806936658</nova:name>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:19:31</nova:creationTime>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:19:32 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:        <nova:user uuid="b6d5b74212e6414eaaf46792bfc0310b">tempest-InstanceActionsTestJSON-816248192-project-member</nova:user>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:        <nova:project uuid="11891f3ac4634e07b72041e075ad5323">tempest-InstanceActionsTestJSON-816248192</nova:project>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:        <nova:port uuid="28b49f60-75ed-4b04-86cf-6b06f398c145">
Oct  2 08:19:32 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <entry name="serial">8e445940-a288-443c-868f-ae4f71577933</entry>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <entry name="uuid">8e445940-a288-443c-868f-ae4f71577933</entry>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk.config"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:65:ca:27"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <target dev="tap28b49f60-75"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/console.log" append="off"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <input type="keyboard" bus="usb"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:19:32 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:19:32 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:19:32 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:19:32 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.062 2 DEBUG oslo_concurrency.processutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.121 2 DEBUG oslo_concurrency.processutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.123 2 DEBUG oslo_concurrency.processutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.177 2 DEBUG oslo_concurrency.processutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.180 2 DEBUG nova.objects.instance [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8e445940-a288-443c-868f-ae4f71577933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.197 2 DEBUG oslo_concurrency.processutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.251 2 DEBUG oslo_concurrency.processutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.252 2 DEBUG nova.virt.disk.api [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Checking if we can resize image /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.253 2 DEBUG oslo_concurrency.processutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.308 2 DEBUG oslo_concurrency.processutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.309 2 DEBUG nova.virt.disk.api [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Cannot resize image /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.309 2 DEBUG nova.objects.instance [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lazy-loading 'migration_context' on Instance uuid 8e445940-a288-443c-868f-ae4f71577933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.339 2 DEBUG nova.virt.libvirt.vif [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:19:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1806936658',display_name='tempest-InstanceActionsTestJSON-server-1806936658',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1806936658',id=97,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:19:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='11891f3ac4634e07b72041e075ad5323',ramdisk_id='',reservation_id='r-4l83lw6w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vi
rtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-816248192',owner_user_name='tempest-InstanceActionsTestJSON-816248192-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:19:31Z,user_data=None,user_id='b6d5b74212e6414eaaf46792bfc0310b',uuid=8e445940-a288-443c-868f-ae4f71577933,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.340 2 DEBUG nova.network.os_vif_util [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Converting VIF {"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.341 2 DEBUG nova.network.os_vif_util [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.342 2 DEBUG os_vif [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.343 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.343 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.346 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28b49f60-75, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.347 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap28b49f60-75, col_values=(('external_ids', {'iface-id': '28b49f60-75ed-4b04-86cf-6b06f398c145', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:ca:27', 'vm-uuid': '8e445940-a288-443c-868f-ae4f71577933'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005466012 NetworkManager[51207]: <info>  [1759407572.3500] manager: (tap28b49f60-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.354 2 INFO os_vif [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75')#033[00m
Oct  2 08:19:32 np0005466012 kernel: tap28b49f60-75: entered promiscuous mode
Oct  2 08:19:32 np0005466012 NetworkManager[51207]: <info>  [1759407572.4302] manager: (tap28b49f60-75): new Tun device (/org/freedesktop/NetworkManager/Devices/164)
Oct  2 08:19:32 np0005466012 systemd-udevd[234038]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:32Z|00363|binding|INFO|Claiming lport 28b49f60-75ed-4b04-86cf-6b06f398c145 for this chassis.
Oct  2 08:19:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:32Z|00364|binding|INFO|28b49f60-75ed-4b04-86cf-6b06f398c145: Claiming fa:16:3e:65:ca:27 10.100.0.3
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005466012 NetworkManager[51207]: <info>  [1759407572.4425] device (tap28b49f60-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:19:32 np0005466012 NetworkManager[51207]: <info>  [1759407572.4436] device (tap28b49f60-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:19:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:32Z|00365|binding|INFO|Setting lport 28b49f60-75ed-4b04-86cf-6b06f398c145 ovn-installed in OVS
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:32Z|00366|binding|INFO|Setting lport 28b49f60-75ed-4b04-86cf-6b06f398c145 up in Southbound
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.448 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:ca:27 10.100.0.3'], port_security=['fa:16:3e:65:ca:27 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8e445940-a288-443c-868f-ae4f71577933', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f19f283e-171b-4ff9-a708-d3cae938c528', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '11891f3ac4634e07b72041e075ad5323', 'neutron:revision_number': '5', 'neutron:security_group_ids': '79e15997-ca89-439a-aee9-254a6fea676b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c00db673-c1c0-41ea-b853-0f498885fd63, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=28b49f60-75ed-4b04-86cf-6b06f398c145) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.449 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 28b49f60-75ed-4b04-86cf-6b06f398c145 in datapath f19f283e-171b-4ff9-a708-d3cae938c528 bound to our chassis#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.450 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f19f283e-171b-4ff9-a708-d3cae938c528#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.461 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b45252a0-330a-4d61-9b27-86f6fbe262b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.462 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf19f283e-11 in ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.463 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf19f283e-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.463 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fb61077b-dcb4-4f88-b5b8-84dea5e40edb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 systemd-machined[152114]: New machine qemu-43-instance-00000061.
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.464 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c226fa5f-ee60-4d06-9814-f5f1cc5ebdbd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.479 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[ccc32bca-4d82-4fd6-b5a5-82dd647dadfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 systemd[1]: Started Virtual Machine qemu-43-instance-00000061.
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.502 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aa62ca11-ac12-41b1-aed1-4d327ef607cd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.529 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[562f1f22-4dfe-4d95-819f-8bef8981e192]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.533 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e3bfab2f-8b83-4d34-80d8-039b347e0789]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 NetworkManager[51207]: <info>  [1759407572.5347] manager: (tapf19f283e-10): new Veth device (/org/freedesktop/NetworkManager/Devices/165)
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.570 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[145086da-d86b-48f2-9dd2-f7e6d4ba0799]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.573 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[7e09e119-4e15-4e37-a07b-ca0d71dbe494]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 NetworkManager[51207]: <info>  [1759407572.5921] device (tapf19f283e-10): carrier: link connected
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.596 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[056ce412-8420-4aad-8161-050d983485b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.613 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0959d2ba-73f7-475e-af6f-c9408ab3f8fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf19f283e-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:f0:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 111], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556620, 'reachable_time': 20913, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234179, 'error': None, 'target': 'ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.628 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9eeadec4-6567-4f16-ac5b-e141889f0f1a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec0:f006'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556620, 'tstamp': 556620}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234180, 'error': None, 'target': 'ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.644 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[19dddc89-3abb-4762-abfa-2ec89926e44f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf19f283e-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:f0:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 111], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556620, 'reachable_time': 20913, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234181, 'error': None, 'target': 'ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.675 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[50f013d3-4ddd-4feb-8317-11f76bf66684]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.733 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ea6cd5-010e-4d26-bfd7-9040f89499fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.735 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf19f283e-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.735 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.735 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf19f283e-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005466012 NetworkManager[51207]: <info>  [1759407572.7396] manager: (tapf19f283e-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/166)
Oct  2 08:19:32 np0005466012 kernel: tapf19f283e-10: entered promiscuous mode
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.741 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf19f283e-10, col_values=(('external_ids', {'iface-id': '37cef8ba-697a-47b6-88bc-229da39d06cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:32Z|00367|binding|INFO|Releasing lport 37cef8ba-697a-47b6-88bc-229da39d06cc from this chassis (sb_readonly=0)
Oct  2 08:19:32 np0005466012 nova_compute[192063]: 2025-10-02 12:19:32.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.754 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f19f283e-171b-4ff9-a708-d3cae938c528.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f19f283e-171b-4ff9-a708-d3cae938c528.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.755 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[51b689ab-4896-4cea-b6d1-b2898fcb77bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.755 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-f19f283e-171b-4ff9-a708-d3cae938c528
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/f19f283e-171b-4ff9-a708-d3cae938c528.pid.haproxy
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID f19f283e-171b-4ff9-a708-d3cae938c528
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:19:32 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:32.756 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528', 'env', 'PROCESS_TAG=haproxy-f19f283e-171b-4ff9-a708-d3cae938c528', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f19f283e-171b-4ff9-a708-d3cae938c528.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:19:33 np0005466012 podman[234213]: 2025-10-02 12:19:33.135791169 +0000 UTC m=+0.053246340 container create 27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:19:33 np0005466012 systemd[1]: Started libpod-conmon-27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec.scope.
Oct  2 08:19:33 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:19:33 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e54ca900a6c47dd98419f540a0616aeaf13de62ce522335fd93671f3fc5e8d3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:19:33 np0005466012 podman[234213]: 2025-10-02 12:19:33.110319788 +0000 UTC m=+0.027774969 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:19:33 np0005466012 podman[234213]: 2025-10-02 12:19:33.214380346 +0000 UTC m=+0.131835527 container init 27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:19:33 np0005466012 podman[234213]: 2025-10-02 12:19:33.22074146 +0000 UTC m=+0.138196621 container start 27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:19:33 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[234228]: [NOTICE]   (234232) : New worker (234234) forked
Oct  2 08:19:33 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[234228]: [NOTICE]   (234232) : Loading success.
Oct  2 08:19:33 np0005466012 nova_compute[192063]: 2025-10-02 12:19:33.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:33 np0005466012 nova_compute[192063]: 2025-10-02 12:19:33.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:19:33 np0005466012 nova_compute[192063]: 2025-10-02 12:19:33.843 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.112 2 DEBUG nova.compute.manager [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received event network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.112 2 DEBUG oslo_concurrency.lockutils [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "8e445940-a288-443c-868f-ae4f71577933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.112 2 DEBUG oslo_concurrency.lockutils [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.113 2 DEBUG oslo_concurrency.lockutils [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.113 2 DEBUG nova.compute.manager [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] No waiting events found dispatching network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.113 2 WARNING nova.compute.manager [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received unexpected event network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.113 2 DEBUG nova.compute.manager [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received event network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.113 2 DEBUG oslo_concurrency.lockutils [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "8e445940-a288-443c-868f-ae4f71577933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.113 2 DEBUG oslo_concurrency.lockutils [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.114 2 DEBUG oslo_concurrency.lockutils [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.114 2 DEBUG nova.compute.manager [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] No waiting events found dispatching network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.114 2 WARNING nova.compute.manager [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received unexpected event network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.114 2 DEBUG nova.compute.manager [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received event network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.114 2 DEBUG oslo_concurrency.lockutils [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "8e445940-a288-443c-868f-ae4f71577933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.114 2 DEBUG oslo_concurrency.lockutils [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.115 2 DEBUG oslo_concurrency.lockutils [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.115 2 DEBUG nova.compute.manager [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] No waiting events found dispatching network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.115 2 WARNING nova.compute.manager [req-b75e9698-ad9f-4e29-8e0f-aa06e1e0cdfa req-dcf9dbc5-4fab-4f5e-9fae-cc8c55a8ac10 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received unexpected event network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.527 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for 8e445940-a288-443c-868f-ae4f71577933 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.528 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407574.5268974, 8e445940-a288-443c-868f-ae4f71577933 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.528 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.532 2 DEBUG nova.compute.manager [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.537 2 INFO nova.virt.libvirt.driver [-] [instance: 8e445940-a288-443c-868f-ae4f71577933] Instance rebooted successfully.#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.538 2 DEBUG nova.compute.manager [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.548 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.552 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.587 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.588 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407574.5285957, 8e445940-a288-443c-868f-ae4f71577933 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.588 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] VM Started (Lifecycle Event)#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.613 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.615 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:19:34 np0005466012 nova_compute[192063]: 2025-10-02 12:19:34.639 2 DEBUG oslo_concurrency.lockutils [None req-7f57ea43-99be-42e6-9f0f-f8aa1e99d174 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 5.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:35 np0005466012 podman[234251]: 2025-10-02 12:19:35.144435632 +0000 UTC m=+0.052460327 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:19:35 np0005466012 podman[234250]: 2025-10-02 12:19:35.157406315 +0000 UTC m=+0.065954225 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, container_name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.805 2 DEBUG oslo_concurrency.lockutils [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquiring lock "8e445940-a288-443c-868f-ae4f71577933" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.805 2 DEBUG oslo_concurrency.lockutils [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.806 2 DEBUG oslo_concurrency.lockutils [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquiring lock "8e445940-a288-443c-868f-ae4f71577933-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.806 2 DEBUG oslo_concurrency.lockutils [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.807 2 DEBUG oslo_concurrency.lockutils [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.818 2 INFO nova.compute.manager [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Terminating instance#033[00m
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.831 2 DEBUG nova.compute.manager [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.843 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.843 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.843 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:19:35 np0005466012 kernel: tap28b49f60-75 (unregistering): left promiscuous mode
Oct  2 08:19:35 np0005466012 NetworkManager[51207]: <info>  [1759407575.8521] device (tap28b49f60-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.904 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.904 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:19:35 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:35Z|00368|binding|INFO|Releasing lport 28b49f60-75ed-4b04-86cf-6b06f398c145 from this chassis (sb_readonly=0)
Oct  2 08:19:35 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:35Z|00369|binding|INFO|Setting lport 28b49f60-75ed-4b04-86cf-6b06f398c145 down in Southbound
Oct  2 08:19:35 np0005466012 ovn_controller[94284]: 2025-10-02T12:19:35Z|00370|binding|INFO|Removing iface tap28b49f60-75 ovn-installed in OVS
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:35.914 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:ca:27 10.100.0.3'], port_security=['fa:16:3e:65:ca:27 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8e445940-a288-443c-868f-ae4f71577933', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f19f283e-171b-4ff9-a708-d3cae938c528', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '11891f3ac4634e07b72041e075ad5323', 'neutron:revision_number': '6', 'neutron:security_group_ids': '79e15997-ca89-439a-aee9-254a6fea676b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c00db673-c1c0-41ea-b853-0f498885fd63, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=28b49f60-75ed-4b04-86cf-6b06f398c145) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:19:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:35.917 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 28b49f60-75ed-4b04-86cf-6b06f398c145 in datapath f19f283e-171b-4ff9-a708-d3cae938c528 unbound from our chassis#033[00m
Oct  2 08:19:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:35.919 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f19f283e-171b-4ff9-a708-d3cae938c528, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:19:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:35.921 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aedf74f9-bc01-4db1-a5f0-96d9d4ebdeb8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:35.921 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528 namespace which is not needed anymore#033[00m
Oct  2 08:19:35 np0005466012 nova_compute[192063]: 2025-10-02 12:19:35.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:35 np0005466012 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000061.scope: Deactivated successfully.
Oct  2 08:19:35 np0005466012 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000061.scope: Consumed 3.354s CPU time.
Oct  2 08:19:35 np0005466012 systemd-machined[152114]: Machine qemu-43-instance-00000061 terminated.
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.106 2 INFO nova.virt.libvirt.driver [-] [instance: 8e445940-a288-443c-868f-ae4f71577933] Instance destroyed successfully.#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.107 2 DEBUG nova.objects.instance [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lazy-loading 'resources' on Instance uuid 8e445940-a288-443c-868f-ae4f71577933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:36 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[234228]: [NOTICE]   (234232) : haproxy version is 2.8.14-c23fe91
Oct  2 08:19:36 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[234228]: [NOTICE]   (234232) : path to executable is /usr/sbin/haproxy
Oct  2 08:19:36 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[234228]: [WARNING]  (234232) : Exiting Master process...
Oct  2 08:19:36 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[234228]: [WARNING]  (234232) : Exiting Master process...
Oct  2 08:19:36 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[234228]: [ALERT]    (234232) : Current worker (234234) exited with code 143 (Terminated)
Oct  2 08:19:36 np0005466012 neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528[234228]: [WARNING]  (234232) : All workers exited. Exiting... (0)
Oct  2 08:19:36 np0005466012 systemd[1]: libpod-27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec.scope: Deactivated successfully.
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.138 2 DEBUG nova.virt.libvirt.vif [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:19:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1806936658',display_name='tempest-InstanceActionsTestJSON-server-1806936658',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1806936658',id=97,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:19:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='11891f3ac4634e07b72041e075ad5323',ramdisk_id='',reservation_id='r-4l83lw6w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-816248192',owner_user_name='tempest-InstanceActionsTestJSON-816248192-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:19:34Z,user_data=None,user_id='b6d5b74212e6414eaaf46792bfc0310b',uuid=8e445940-a288-443c-868f-ae4f71577933,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.139 2 DEBUG nova.network.os_vif_util [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Converting VIF {"id": "28b49f60-75ed-4b04-86cf-6b06f398c145", "address": "fa:16:3e:65:ca:27", "network": {"id": "f19f283e-171b-4ff9-a708-d3cae938c528", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1912510715-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11891f3ac4634e07b72041e075ad5323", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b49f60-75", "ovs_interfaceid": "28b49f60-75ed-4b04-86cf-6b06f398c145", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.139 2 DEBUG nova.network.os_vif_util [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.140 2 DEBUG os_vif [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.142 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28b49f60-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:36 np0005466012 podman[234314]: 2025-10-02 12:19:36.143034534 +0000 UTC m=+0.113976465 container died 27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.148 2 INFO os_vif [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:ca:27,bridge_name='br-int',has_traffic_filtering=True,id=28b49f60-75ed-4b04-86cf-6b06f398c145,network=Network(f19f283e-171b-4ff9-a708-d3cae938c528),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b49f60-75')#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.148 2 INFO nova.virt.libvirt.driver [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Deleting instance files /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933_del#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.149 2 INFO nova.virt.libvirt.driver [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Deletion of /var/lib/nova/instances/8e445940-a288-443c-868f-ae4f71577933_del complete#033[00m
Oct  2 08:19:36 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec-userdata-shm.mount: Deactivated successfully.
Oct  2 08:19:36 np0005466012 systemd[1]: var-lib-containers-storage-overlay-9e54ca900a6c47dd98419f540a0616aeaf13de62ce522335fd93671f3fc5e8d3-merged.mount: Deactivated successfully.
Oct  2 08:19:36 np0005466012 podman[234314]: 2025-10-02 12:19:36.189651133 +0000 UTC m=+0.160593064 container cleanup 27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:19:36 np0005466012 systemd[1]: libpod-conmon-27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec.scope: Deactivated successfully.
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.252 2 INFO nova.compute.manager [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.253 2 DEBUG oslo.service.loopingcall [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.253 2 DEBUG nova.compute.manager [-] [instance: 8e445940-a288-443c-868f-ae4f71577933] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.253 2 DEBUG nova.network.neutron [-] [instance: 8e445940-a288-443c-868f-ae4f71577933] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.257 2 DEBUG nova.compute.manager [req-5d935c8e-eaaf-4b61-8b2a-6cd085c7f016 req-07146b22-683e-4629-bb07-c2de325e47d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received event network-vif-unplugged-28b49f60-75ed-4b04-86cf-6b06f398c145 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.257 2 DEBUG oslo_concurrency.lockutils [req-5d935c8e-eaaf-4b61-8b2a-6cd085c7f016 req-07146b22-683e-4629-bb07-c2de325e47d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "8e445940-a288-443c-868f-ae4f71577933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.258 2 DEBUG oslo_concurrency.lockutils [req-5d935c8e-eaaf-4b61-8b2a-6cd085c7f016 req-07146b22-683e-4629-bb07-c2de325e47d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.258 2 DEBUG oslo_concurrency.lockutils [req-5d935c8e-eaaf-4b61-8b2a-6cd085c7f016 req-07146b22-683e-4629-bb07-c2de325e47d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.258 2 DEBUG nova.compute.manager [req-5d935c8e-eaaf-4b61-8b2a-6cd085c7f016 req-07146b22-683e-4629-bb07-c2de325e47d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] No waiting events found dispatching network-vif-unplugged-28b49f60-75ed-4b04-86cf-6b06f398c145 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.258 2 DEBUG nova.compute.manager [req-5d935c8e-eaaf-4b61-8b2a-6cd085c7f016 req-07146b22-683e-4629-bb07-c2de325e47d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received event network-vif-unplugged-28b49f60-75ed-4b04-86cf-6b06f398c145 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:19:36 np0005466012 podman[234357]: 2025-10-02 12:19:36.404429161 +0000 UTC m=+0.194258160 container remove 27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 08:19:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:36.412 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[602a8462-7be8-4ac6-a512-d0d73a0b5530]: (4, ('Thu Oct  2 12:19:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528 (27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec)\n27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec\nThu Oct  2 12:19:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528 (27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec)\n27cd2991d025e76cd1f36ddd756d622b6c4ec7a4b838404b9393f4a3287cb1ec\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:36.414 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1adb7238-92f8-4f9e-8664-958d8846ed50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:36.416 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf19f283e-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:36 np0005466012 kernel: tapf19f283e-10: left promiscuous mode
Oct  2 08:19:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:36.423 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a1ffdd31-97e8-4b31-a005-acbdcdc57478]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:36.450 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2edc8247-a2fd-4b4a-9578-973bb10b5486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:36.451 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e4e7f884-1637-4bed-b0b1-e4d0ee197653]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:36.470 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[48932506-f50a-4142-90be-38cf1bb618d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556613, 'reachable_time': 40703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234372, 'error': None, 'target': 'ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:36.472 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f19f283e-171b-4ff9-a708-d3cae938c528 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:19:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:36.473 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[e4e74a67-9af0-4fab-8574-6a8abd95155f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:19:36 np0005466012 systemd[1]: run-netns-ovnmeta\x2df19f283e\x2d171b\x2d4ff9\x2da708\x2dd3cae938c528.mount: Deactivated successfully.
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.898 2 DEBUG nova.network.neutron [-] [instance: 8e445940-a288-443c-868f-ae4f71577933] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:19:36 np0005466012 nova_compute[192063]: 2025-10-02 12:19:36.928 2 INFO nova.compute.manager [-] [instance: 8e445940-a288-443c-868f-ae4f71577933] Took 0.67 seconds to deallocate network for instance.#033[00m
Oct  2 08:19:37 np0005466012 nova_compute[192063]: 2025-10-02 12:19:37.042 2 DEBUG oslo_concurrency.lockutils [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:37 np0005466012 nova_compute[192063]: 2025-10-02 12:19:37.043 2 DEBUG oslo_concurrency.lockutils [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:37 np0005466012 nova_compute[192063]: 2025-10-02 12:19:37.137 2 DEBUG nova.compute.provider_tree [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:19:37 np0005466012 nova_compute[192063]: 2025-10-02 12:19:37.150 2 DEBUG nova.compute.manager [req-af2e4c4d-c4e3-4e69-9782-52706dd7f6ec req-e3d932b3-b89f-4bcb-a5b9-5c3822550d9f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received event network-vif-deleted-28b49f60-75ed-4b04-86cf-6b06f398c145 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:37 np0005466012 nova_compute[192063]: 2025-10-02 12:19:37.173 2 DEBUG nova.scheduler.client.report [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:19:37 np0005466012 nova_compute[192063]: 2025-10-02 12:19:37.208 2 DEBUG oslo_concurrency.lockutils [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:37 np0005466012 nova_compute[192063]: 2025-10-02 12:19:37.262 2 INFO nova.scheduler.client.report [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Deleted allocations for instance 8e445940-a288-443c-868f-ae4f71577933#033[00m
Oct  2 08:19:37 np0005466012 nova_compute[192063]: 2025-10-02 12:19:37.363 2 DEBUG oslo_concurrency.lockutils [None req-9c190973-043c-454d-a613-6139f93d3533 b6d5b74212e6414eaaf46792bfc0310b 11891f3ac4634e07b72041e075ad5323 - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:38 np0005466012 nova_compute[192063]: 2025-10-02 12:19:38.502 2 DEBUG nova.compute.manager [req-4a20bf8c-6628-4054-855e-b85fe7691d3c req-42352168-6558-4410-a475-c7abe223f1fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received event network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:19:38 np0005466012 nova_compute[192063]: 2025-10-02 12:19:38.503 2 DEBUG oslo_concurrency.lockutils [req-4a20bf8c-6628-4054-855e-b85fe7691d3c req-42352168-6558-4410-a475-c7abe223f1fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "8e445940-a288-443c-868f-ae4f71577933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:38 np0005466012 nova_compute[192063]: 2025-10-02 12:19:38.503 2 DEBUG oslo_concurrency.lockutils [req-4a20bf8c-6628-4054-855e-b85fe7691d3c req-42352168-6558-4410-a475-c7abe223f1fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:38 np0005466012 nova_compute[192063]: 2025-10-02 12:19:38.504 2 DEBUG oslo_concurrency.lockutils [req-4a20bf8c-6628-4054-855e-b85fe7691d3c req-42352168-6558-4410-a475-c7abe223f1fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "8e445940-a288-443c-868f-ae4f71577933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:38 np0005466012 nova_compute[192063]: 2025-10-02 12:19:38.504 2 DEBUG nova.compute.manager [req-4a20bf8c-6628-4054-855e-b85fe7691d3c req-42352168-6558-4410-a475-c7abe223f1fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] No waiting events found dispatching network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:19:38 np0005466012 nova_compute[192063]: 2025-10-02 12:19:38.505 2 WARNING nova.compute.manager [req-4a20bf8c-6628-4054-855e-b85fe7691d3c req-42352168-6558-4410-a475-c7abe223f1fc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 8e445940-a288-443c-868f-ae4f71577933] Received unexpected event network-vif-plugged-28b49f60-75ed-4b04-86cf-6b06f398c145 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:19:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:39.155 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:19:39 np0005466012 nova_compute[192063]: 2025-10-02 12:19:39.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:39.157 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:19:40 np0005466012 nova_compute[192063]: 2025-10-02 12:19:40.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:19:40.159 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:19:41 np0005466012 nova_compute[192063]: 2025-10-02 12:19:41.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:41 np0005466012 nova_compute[192063]: 2025-10-02 12:19:41.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:42 np0005466012 nova_compute[192063]: 2025-10-02 12:19:42.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:19:46 np0005466012 nova_compute[192063]: 2025-10-02 12:19:46.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:46 np0005466012 nova_compute[192063]: 2025-10-02 12:19:46.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:49 np0005466012 podman[234374]: 2025-10-02 12:19:49.140033924 +0000 UTC m=+0.050931993 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:19:49 np0005466012 podman[234375]: 2025-10-02 12:19:49.222549565 +0000 UTC m=+0.128822181 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:19:51 np0005466012 nova_compute[192063]: 2025-10-02 12:19:51.103 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407576.101432, 8e445940-a288-443c-868f-ae4f71577933 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:51 np0005466012 nova_compute[192063]: 2025-10-02 12:19:51.103 2 INFO nova.compute.manager [-] [instance: 8e445940-a288-443c-868f-ae4f71577933] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:19:51 np0005466012 nova_compute[192063]: 2025-10-02 12:19:51.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:51 np0005466012 nova_compute[192063]: 2025-10-02 12:19:51.219 2 DEBUG nova.compute.manager [None req-e9b53b43-e24b-405b-a446-5f760685b08c - - - - - -] [instance: 8e445940-a288-443c-868f-ae4f71577933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:51 np0005466012 nova_compute[192063]: 2025-10-02 12:19:51.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:52 np0005466012 podman[234424]: 2025-10-02 12:19:52.159412017 +0000 UTC m=+0.067388237 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:19:52 np0005466012 nova_compute[192063]: 2025-10-02 12:19:52.811 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquiring lock "a782659d-957b-4570-b4ff-74461541a3ff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:52 np0005466012 nova_compute[192063]: 2025-10-02 12:19:52.811 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "a782659d-957b-4570-b4ff-74461541a3ff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:52 np0005466012 nova_compute[192063]: 2025-10-02 12:19:52.845 2 DEBUG nova.compute.manager [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.006 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.007 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.013 2 DEBUG nova.virt.hardware [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.013 2 INFO nova.compute.claims [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.152 2 DEBUG nova.compute.provider_tree [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.167 2 DEBUG nova.scheduler.client.report [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.203 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.204 2 DEBUG nova.compute.manager [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.283 2 DEBUG nova.compute.manager [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.319 2 INFO nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.377 2 DEBUG nova.compute.manager [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.620 2 DEBUG nova.compute.manager [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.621 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.622 2 INFO nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Creating image(s)#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.622 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquiring lock "/var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.622 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "/var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.623 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "/var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.634 2 DEBUG oslo_concurrency.processutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.724 2 DEBUG oslo_concurrency.processutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.726 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.727 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.751 2 DEBUG oslo_concurrency.processutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.827 2 DEBUG oslo_concurrency.processutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.828 2 DEBUG oslo_concurrency.processutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.866 2 DEBUG oslo_concurrency.processutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.867 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.867 2 DEBUG oslo_concurrency.processutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.956 2 DEBUG oslo_concurrency.processutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.957 2 DEBUG nova.virt.disk.api [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Checking if we can resize image /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:19:53 np0005466012 nova_compute[192063]: 2025-10-02 12:19:53.957 2 DEBUG oslo_concurrency.processutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.024 2 DEBUG oslo_concurrency.processutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.026 2 DEBUG nova.virt.disk.api [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Cannot resize image /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.027 2 DEBUG nova.objects.instance [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lazy-loading 'migration_context' on Instance uuid a782659d-957b-4570-b4ff-74461541a3ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.042 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.042 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Ensure instance console log exists: /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.043 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.044 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.044 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.047 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.053 2 WARNING nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.061 2 DEBUG nova.virt.libvirt.host [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.062 2 DEBUG nova.virt.libvirt.host [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.066 2 DEBUG nova.virt.libvirt.host [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.067 2 DEBUG nova.virt.libvirt.host [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.069 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.069 2 DEBUG nova.virt.hardware [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.070 2 DEBUG nova.virt.hardware [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.071 2 DEBUG nova.virt.hardware [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.071 2 DEBUG nova.virt.hardware [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.072 2 DEBUG nova.virt.hardware [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.072 2 DEBUG nova.virt.hardware [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.073 2 DEBUG nova.virt.hardware [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.073 2 DEBUG nova.virt.hardware [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.074 2 DEBUG nova.virt.hardware [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.074 2 DEBUG nova.virt.hardware [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.074 2 DEBUG nova.virt.hardware [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.081 2 DEBUG nova.objects.instance [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lazy-loading 'pci_devices' on Instance uuid a782659d-957b-4570-b4ff-74461541a3ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.095 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  <uuid>a782659d-957b-4570-b4ff-74461541a3ff</uuid>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  <name>instance-00000066</name>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerShowV247Test-server-94123705</nova:name>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:19:54</nova:creationTime>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:19:54 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:        <nova:user uuid="e7efd391ff484c8bb99570302eacb8f4">tempest-ServerShowV247Test-1215164495-project-member</nova:user>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:        <nova:project uuid="12599487474040b285ccdd017a8c01b5">tempest-ServerShowV247Test-1215164495</nova:project>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <nova:ports/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <entry name="serial">a782659d-957b-4570-b4ff-74461541a3ff</entry>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <entry name="uuid">a782659d-957b-4570-b4ff-74461541a3ff</entry>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.config"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/console.log" append="off"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:19:54 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:19:54 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:19:54 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:19:54 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.156 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.156 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.156 2 INFO nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Using config drive#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.337 2 INFO nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Creating config drive at /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.config#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.342 2 DEBUG oslo_concurrency.processutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0utb1nt9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:19:54 np0005466012 nova_compute[192063]: 2025-10-02 12:19:54.480 2 DEBUG oslo_concurrency.processutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0utb1nt9" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:19:54 np0005466012 systemd-machined[152114]: New machine qemu-44-instance-00000066.
Oct  2 08:19:54 np0005466012 systemd[1]: Started Virtual Machine qemu-44-instance-00000066.
Oct  2 08:19:54 np0005466012 podman[234469]: 2025-10-02 12:19:54.644130623 +0000 UTC m=+0.082086938 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3)
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.280 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407595.2799575, a782659d-957b-4570-b4ff-74461541a3ff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.281 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.284 2 DEBUG nova.compute.manager [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.284 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.287 2 INFO nova.virt.libvirt.driver [-] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Instance spawned successfully.#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.287 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.310 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.313 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.314 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.314 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.314 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.315 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.315 2 DEBUG nova.virt.libvirt.driver [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.319 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.352 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.352 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407595.2831876, a782659d-957b-4570-b4ff-74461541a3ff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.352 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] VM Started (Lifecycle Event)#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.377 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.381 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.423 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.452 2 INFO nova.compute.manager [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Took 1.83 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.452 2 DEBUG nova.compute.manager [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.559 2 INFO nova.compute.manager [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Took 2.62 seconds to build instance.#033[00m
Oct  2 08:19:55 np0005466012 nova_compute[192063]: 2025-10-02 12:19:55.604 2 DEBUG oslo_concurrency.lockutils [None req-dd28b336-c354-4aa4-866b-d4102873c005 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "a782659d-957b-4570-b4ff-74461541a3ff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.792s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:19:56 np0005466012 nova_compute[192063]: 2025-10-02 12:19:56.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:56 np0005466012 nova_compute[192063]: 2025-10-02 12:19:56.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:19:57 np0005466012 nova_compute[192063]: 2025-10-02 12:19:57.131 2 INFO nova.compute.manager [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Rebuilding instance#033[00m
Oct  2 08:19:57 np0005466012 nova_compute[192063]: 2025-10-02 12:19:57.493 2 DEBUG nova.compute.manager [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:19:57 np0005466012 nova_compute[192063]: 2025-10-02 12:19:57.579 2 DEBUG nova.objects.instance [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lazy-loading 'pci_requests' on Instance uuid a782659d-957b-4570-b4ff-74461541a3ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:57 np0005466012 nova_compute[192063]: 2025-10-02 12:19:57.596 2 DEBUG nova.objects.instance [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lazy-loading 'pci_devices' on Instance uuid a782659d-957b-4570-b4ff-74461541a3ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:57 np0005466012 nova_compute[192063]: 2025-10-02 12:19:57.611 2 DEBUG nova.objects.instance [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lazy-loading 'resources' on Instance uuid a782659d-957b-4570-b4ff-74461541a3ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:57 np0005466012 nova_compute[192063]: 2025-10-02 12:19:57.630 2 DEBUG nova.objects.instance [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lazy-loading 'migration_context' on Instance uuid a782659d-957b-4570-b4ff-74461541a3ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:19:57 np0005466012 nova_compute[192063]: 2025-10-02 12:19:57.644 2 DEBUG nova.objects.instance [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:19:57 np0005466012 nova_compute[192063]: 2025-10-02 12:19:57.647 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:20:01 np0005466012 podman[234506]: 2025-10-02 12:20:01.147128274 +0000 UTC m=+0.061702995 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct  2 08:20:01 np0005466012 podman[234507]: 2025-10-02 12:20:01.147834934 +0000 UTC m=+0.059125600 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7)
Oct  2 08:20:01 np0005466012 nova_compute[192063]: 2025-10-02 12:20:01.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:01 np0005466012 nova_compute[192063]: 2025-10-02 12:20:01.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:02.130 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:02.130 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:02.131 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:02 np0005466012 nova_compute[192063]: 2025-10-02 12:20:02.691 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:02 np0005466012 nova_compute[192063]: 2025-10-02 12:20:02.691 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:02 np0005466012 nova_compute[192063]: 2025-10-02 12:20:02.718 2 DEBUG nova.compute.manager [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:20:02 np0005466012 nova_compute[192063]: 2025-10-02 12:20:02.825 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:02 np0005466012 nova_compute[192063]: 2025-10-02 12:20:02.826 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:02 np0005466012 nova_compute[192063]: 2025-10-02 12:20:02.833 2 DEBUG nova.virt.hardware [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:20:02 np0005466012 nova_compute[192063]: 2025-10-02 12:20:02.833 2 INFO nova.compute.claims [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:20:02 np0005466012 nova_compute[192063]: 2025-10-02 12:20:02.953 2 DEBUG nova.compute.provider_tree [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:02 np0005466012 nova_compute[192063]: 2025-10-02 12:20:02.973 2 DEBUG nova.scheduler.client.report [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.000 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.001 2 DEBUG nova.compute.manager [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.054 2 DEBUG nova.compute.manager [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.055 2 DEBUG nova.network.neutron [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.081 2 INFO nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.101 2 DEBUG nova.compute.manager [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.196 2 DEBUG nova.compute.manager [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.197 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.198 2 INFO nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Creating image(s)#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.199 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "/var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.199 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "/var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.200 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "/var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.217 2 DEBUG oslo_concurrency.processutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.281 2 DEBUG oslo_concurrency.processutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.283 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.284 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.301 2 DEBUG oslo_concurrency.processutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.324 2 DEBUG nova.policy [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a803afe9939346088252c3b944f124f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0c8c8a8631b4721beed577a99f8bdb7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.360 2 DEBUG oslo_concurrency.processutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.361 2 DEBUG oslo_concurrency.processutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.399 2 DEBUG oslo_concurrency.processutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.401 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.401 2 DEBUG oslo_concurrency.processutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.459 2 DEBUG oslo_concurrency.processutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.461 2 DEBUG nova.virt.disk.api [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Checking if we can resize image /var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.461 2 DEBUG oslo_concurrency.processutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.522 2 DEBUG oslo_concurrency.processutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.524 2 DEBUG nova.virt.disk.api [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Cannot resize image /var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.525 2 DEBUG nova.objects.instance [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'migration_context' on Instance uuid 84dd40d6-c9ef-4126-9f71-24a269d9f5f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.538 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.539 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Ensure instance console log exists: /var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.539 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.540 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:03 np0005466012 nova_compute[192063]: 2025-10-02 12:20:03.540 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:04 np0005466012 nova_compute[192063]: 2025-10-02 12:20:04.512 2 DEBUG nova.network.neutron [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Successfully created port: dbc66205-3b88-45f4-93d3-e55042c4e27a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:20:06 np0005466012 nova_compute[192063]: 2025-10-02 12:20:06.035 2 DEBUG nova.network.neutron [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Successfully updated port: dbc66205-3b88-45f4-93d3-e55042c4e27a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:20:06 np0005466012 nova_compute[192063]: 2025-10-02 12:20:06.054 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "refresh_cache-84dd40d6-c9ef-4126-9f71-24a269d9f5f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:06 np0005466012 nova_compute[192063]: 2025-10-02 12:20:06.054 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquired lock "refresh_cache-84dd40d6-c9ef-4126-9f71-24a269d9f5f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:06 np0005466012 nova_compute[192063]: 2025-10-02 12:20:06.055 2 DEBUG nova.network.neutron [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:20:06 np0005466012 podman[234566]: 2025-10-02 12:20:06.145388094 +0000 UTC m=+0.050147612 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:20:06 np0005466012 podman[234565]: 2025-10-02 12:20:06.146849455 +0000 UTC m=+0.054743022 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:20:06 np0005466012 nova_compute[192063]: 2025-10-02 12:20:06.151 2 DEBUG nova.compute.manager [req-90865bc6-e192-4d06-a7be-b127fe4ff536 req-0d919abd-7e65-4447-9637-575d7da34499 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Received event network-changed-dbc66205-3b88-45f4-93d3-e55042c4e27a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:06 np0005466012 nova_compute[192063]: 2025-10-02 12:20:06.152 2 DEBUG nova.compute.manager [req-90865bc6-e192-4d06-a7be-b127fe4ff536 req-0d919abd-7e65-4447-9637-575d7da34499 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Refreshing instance network info cache due to event network-changed-dbc66205-3b88-45f4-93d3-e55042c4e27a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:20:06 np0005466012 nova_compute[192063]: 2025-10-02 12:20:06.152 2 DEBUG oslo_concurrency.lockutils [req-90865bc6-e192-4d06-a7be-b127fe4ff536 req-0d919abd-7e65-4447-9637-575d7da34499 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-84dd40d6-c9ef-4126-9f71-24a269d9f5f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:06 np0005466012 nova_compute[192063]: 2025-10-02 12:20:06.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:06 np0005466012 nova_compute[192063]: 2025-10-02 12:20:06.222 2 DEBUG nova.network.neutron [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:20:06 np0005466012 nova_compute[192063]: 2025-10-02 12:20:06.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.198 2 DEBUG nova.network.neutron [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Updating instance_info_cache with network_info: [{"id": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "address": "fa:16:3e:b7:2b:8c", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbc66205-3b", "ovs_interfaceid": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.222 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Releasing lock "refresh_cache-84dd40d6-c9ef-4126-9f71-24a269d9f5f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.222 2 DEBUG nova.compute.manager [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Instance network_info: |[{"id": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "address": "fa:16:3e:b7:2b:8c", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbc66205-3b", "ovs_interfaceid": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.222 2 DEBUG oslo_concurrency.lockutils [req-90865bc6-e192-4d06-a7be-b127fe4ff536 req-0d919abd-7e65-4447-9637-575d7da34499 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-84dd40d6-c9ef-4126-9f71-24a269d9f5f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.223 2 DEBUG nova.network.neutron [req-90865bc6-e192-4d06-a7be-b127fe4ff536 req-0d919abd-7e65-4447-9637-575d7da34499 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Refreshing network info cache for port dbc66205-3b88-45f4-93d3-e55042c4e27a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.226 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Start _get_guest_xml network_info=[{"id": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "address": "fa:16:3e:b7:2b:8c", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbc66205-3b", "ovs_interfaceid": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.233 2 WARNING nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.237 2 DEBUG nova.virt.libvirt.host [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.237 2 DEBUG nova.virt.libvirt.host [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.243 2 DEBUG nova.virt.libvirt.host [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.243 2 DEBUG nova.virt.libvirt.host [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.244 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.244 2 DEBUG nova.virt.hardware [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.245 2 DEBUG nova.virt.hardware [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.245 2 DEBUG nova.virt.hardware [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.245 2 DEBUG nova.virt.hardware [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.245 2 DEBUG nova.virt.hardware [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.246 2 DEBUG nova.virt.hardware [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.246 2 DEBUG nova.virt.hardware [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.246 2 DEBUG nova.virt.hardware [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.246 2 DEBUG nova.virt.hardware [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.247 2 DEBUG nova.virt.hardware [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.247 2 DEBUG nova.virt.hardware [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.250 2 DEBUG nova.virt.libvirt.vif [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:20:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-738753080',display_name='tempest-ServersNegativeTestJSON-server-738753080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-738753080',id=105,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0c8c8a8631b4721beed577a99f8bdb7',ramdisk_id='',reservation_id='r-5s1z3elh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-114354241',owner_user_name='tempest-ServersNegativeTestJSON-114354241-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:03Z,user_data=None,user_id='a803afe9939346088252c3b944f124f2',uuid=84dd40d6-c9ef-4126-9f71-24a269d9f5f3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "address": "fa:16:3e:b7:2b:8c", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbc66205-3b", "ovs_interfaceid": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.250 2 DEBUG nova.network.os_vif_util [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converting VIF {"id": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "address": "fa:16:3e:b7:2b:8c", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbc66205-3b", "ovs_interfaceid": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.251 2 DEBUG nova.network.os_vif_util [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:2b:8c,bridge_name='br-int',has_traffic_filtering=True,id=dbc66205-3b88-45f4-93d3-e55042c4e27a,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbc66205-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.251 2 DEBUG nova.objects.instance [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 84dd40d6-c9ef-4126-9f71-24a269d9f5f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.267 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  <uuid>84dd40d6-c9ef-4126-9f71-24a269d9f5f3</uuid>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  <name>instance-00000069</name>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServersNegativeTestJSON-server-738753080</nova:name>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:20:07</nova:creationTime>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:20:07 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:        <nova:user uuid="a803afe9939346088252c3b944f124f2">tempest-ServersNegativeTestJSON-114354241-project-member</nova:user>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:        <nova:project uuid="f0c8c8a8631b4721beed577a99f8bdb7">tempest-ServersNegativeTestJSON-114354241</nova:project>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:        <nova:port uuid="dbc66205-3b88-45f4-93d3-e55042c4e27a">
Oct  2 08:20:07 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <entry name="serial">84dd40d6-c9ef-4126-9f71-24a269d9f5f3</entry>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <entry name="uuid">84dd40d6-c9ef-4126-9f71-24a269d9f5f3</entry>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk.config"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:b7:2b:8c"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <target dev="tapdbc66205-3b"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/console.log" append="off"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:20:07 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:20:07 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:20:07 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:20:07 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.268 2 DEBUG nova.compute.manager [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Preparing to wait for external event network-vif-plugged-dbc66205-3b88-45f4-93d3-e55042c4e27a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.269 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.269 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.269 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.270 2 DEBUG nova.virt.libvirt.vif [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:20:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-738753080',display_name='tempest-ServersNegativeTestJSON-server-738753080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-738753080',id=105,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0c8c8a8631b4721beed577a99f8bdb7',ramdisk_id='',reservation_id='r-5s1z3elh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-114354241',owner_user_name='tempest-ServersNegativeTestJSON-114354241-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:20:03Z,user_data=None,user_id='a803afe9939346088252c3b944f124f2',uuid=84dd40d6-c9ef-4126-9f71-24a269d9f5f3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "address": "fa:16:3e:b7:2b:8c", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbc66205-3b", "ovs_interfaceid": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.270 2 DEBUG nova.network.os_vif_util [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converting VIF {"id": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "address": "fa:16:3e:b7:2b:8c", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbc66205-3b", "ovs_interfaceid": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.270 2 DEBUG nova.network.os_vif_util [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:2b:8c,bridge_name='br-int',has_traffic_filtering=True,id=dbc66205-3b88-45f4-93d3-e55042c4e27a,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbc66205-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.271 2 DEBUG os_vif [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:2b:8c,bridge_name='br-int',has_traffic_filtering=True,id=dbc66205-3b88-45f4-93d3-e55042c4e27a,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbc66205-3b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.272 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.272 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.275 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdbc66205-3b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.275 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdbc66205-3b, col_values=(('external_ids', {'iface-id': 'dbc66205-3b88-45f4-93d3-e55042c4e27a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:2b:8c', 'vm-uuid': '84dd40d6-c9ef-4126-9f71-24a269d9f5f3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:07 np0005466012 NetworkManager[51207]: <info>  [1759407607.2777] manager: (tapdbc66205-3b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/167)
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.284 2 INFO os_vif [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:2b:8c,bridge_name='br-int',has_traffic_filtering=True,id=dbc66205-3b88-45f4-93d3-e55042c4e27a,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbc66205-3b')#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.334 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.335 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.335 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] No VIF found with MAC fa:16:3e:b7:2b:8c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.336 2 INFO nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Using config drive#033[00m
Oct  2 08:20:07 np0005466012 nova_compute[192063]: 2025-10-02 12:20:07.689 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:20:08 np0005466012 nova_compute[192063]: 2025-10-02 12:20:08.542 2 INFO nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Creating config drive at /var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk.config#033[00m
Oct  2 08:20:08 np0005466012 nova_compute[192063]: 2025-10-02 12:20:08.548 2 DEBUG oslo_concurrency.processutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr1b6f3jn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:08 np0005466012 nova_compute[192063]: 2025-10-02 12:20:08.675 2 DEBUG oslo_concurrency.processutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr1b6f3jn" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:08 np0005466012 kernel: tapdbc66205-3b: entered promiscuous mode
Oct  2 08:20:08 np0005466012 NetworkManager[51207]: <info>  [1759407608.7421] manager: (tapdbc66205-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/168)
Oct  2 08:20:08 np0005466012 nova_compute[192063]: 2025-10-02 12:20:08.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:20:08Z|00371|binding|INFO|Claiming lport dbc66205-3b88-45f4-93d3-e55042c4e27a for this chassis.
Oct  2 08:20:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:20:08Z|00372|binding|INFO|dbc66205-3b88-45f4-93d3-e55042c4e27a: Claiming fa:16:3e:b7:2b:8c 10.100.0.11
Oct  2 08:20:08 np0005466012 nova_compute[192063]: 2025-10-02 12:20:08.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.759 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:2b:8c 10.100.0.11'], port_security=['fa:16:3e:b7:2b:8c 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '84dd40d6-c9ef-4126-9f71-24a269d9f5f3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f494075-66bf-4ce0-a765-98fd91c31199', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0c8c8a8631b4721beed577a99f8bdb7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'eb030dcc-72ea-4850-916a-e1df7c4d9a87', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e43b5827-85bf-4b83-b921-ec45e12f1f2e, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=dbc66205-3b88-45f4-93d3-e55042c4e27a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.760 103246 INFO neutron.agent.ovn.metadata.agent [-] Port dbc66205-3b88-45f4-93d3-e55042c4e27a in datapath 8f494075-66bf-4ce0-a765-98fd91c31199 bound to our chassis#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.761 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8f494075-66bf-4ce0-a765-98fd91c31199#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.774 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e7423b48-baca-4830-ac13-d2a52d4f3e6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.775 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8f494075-61 in ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.777 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8f494075-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.777 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1b6061b2-3d94-4bdd-a4f1-a3eb2df3ae0a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.778 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a09e3032-3ddf-4bec-85ca-f2a89f07c15d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:08 np0005466012 systemd-udevd[234649]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:20:08 np0005466012 NetworkManager[51207]: <info>  [1759407608.7918] device (tapdbc66205-3b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:20:08 np0005466012 systemd-machined[152114]: New machine qemu-45-instance-00000069.
Oct  2 08:20:08 np0005466012 NetworkManager[51207]: <info>  [1759407608.7930] device (tapdbc66205-3b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.792 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[7e1fb392-ad17-4064-b5ad-a4b46739d07a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:08 np0005466012 nova_compute[192063]: 2025-10-02 12:20:08.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:08 np0005466012 nova_compute[192063]: 2025-10-02 12:20:08.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.807 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0a8bf537-5c18-4b68-b8eb-4df1dda99e2d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:20:08Z|00373|binding|INFO|Setting lport dbc66205-3b88-45f4-93d3-e55042c4e27a ovn-installed in OVS
Oct  2 08:20:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:20:08Z|00374|binding|INFO|Setting lport dbc66205-3b88-45f4-93d3-e55042c4e27a up in Southbound
Oct  2 08:20:08 np0005466012 nova_compute[192063]: 2025-10-02 12:20:08.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:08 np0005466012 systemd[1]: Started Virtual Machine qemu-45-instance-00000069.
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.835 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b5cd6d-4c63-4a7a-bfdb-2ef792c0a9c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:08 np0005466012 NetworkManager[51207]: <info>  [1759407608.8415] manager: (tap8f494075-60): new Veth device (/org/freedesktop/NetworkManager/Devices/169)
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.842 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[30639ef5-9102-4a20-8582-f6892b38dd1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.871 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[fef9254a-9ae5-4d18-86d9-6f9ce65af49a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.874 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[8752a921-4549-4719-8d9f-c94725777034]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:08 np0005466012 NetworkManager[51207]: <info>  [1759407608.8930] device (tap8f494075-60): carrier: link connected
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.898 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[2a5657e8-4d5b-4af0-8249-2dc2d2b9c2e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.913 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5c141322-0d45-4e86-89fc-4d7a264f51af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f494075-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:9a:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560250, 'reachable_time': 34382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234682, 'error': None, 'target': 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.930 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2882df78-9e98-4dbe-abaf-e2187f59bbaf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9b:9a65'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560250, 'tstamp': 560250}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234683, 'error': None, 'target': 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.948 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cb88fe43-103c-46f3-b03a-4560ec8219eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f494075-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:9a:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560250, 'reachable_time': 34382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234684, 'error': None, 'target': 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:08 np0005466012 nova_compute[192063]: 2025-10-02 12:20:08.966 2 DEBUG nova.compute.manager [req-d911a5e6-dc54-4109-8222-b2e298dc4b23 req-f444f955-203e-4b3d-81b7-fd4da0f2bd92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Received event network-vif-plugged-dbc66205-3b88-45f4-93d3-e55042c4e27a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:08 np0005466012 nova_compute[192063]: 2025-10-02 12:20:08.967 2 DEBUG oslo_concurrency.lockutils [req-d911a5e6-dc54-4109-8222-b2e298dc4b23 req-f444f955-203e-4b3d-81b7-fd4da0f2bd92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:08 np0005466012 nova_compute[192063]: 2025-10-02 12:20:08.967 2 DEBUG oslo_concurrency.lockutils [req-d911a5e6-dc54-4109-8222-b2e298dc4b23 req-f444f955-203e-4b3d-81b7-fd4da0f2bd92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:08 np0005466012 nova_compute[192063]: 2025-10-02 12:20:08.967 2 DEBUG oslo_concurrency.lockutils [req-d911a5e6-dc54-4109-8222-b2e298dc4b23 req-f444f955-203e-4b3d-81b7-fd4da0f2bd92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:08 np0005466012 nova_compute[192063]: 2025-10-02 12:20:08.967 2 DEBUG nova.compute.manager [req-d911a5e6-dc54-4109-8222-b2e298dc4b23 req-f444f955-203e-4b3d-81b7-fd4da0f2bd92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Processing event network-vif-plugged-dbc66205-3b88-45f4-93d3-e55042c4e27a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:20:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:08.976 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[961ac1e5-7f31-4af1-9d81-caab70b0749a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:09.029 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4b1060bb-4cee-434c-9cae-3702d1a0dffe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:09.030 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f494075-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:09.031 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:09.031 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f494075-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:09 np0005466012 NetworkManager[51207]: <info>  [1759407609.0335] manager: (tap8f494075-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Oct  2 08:20:09 np0005466012 kernel: tap8f494075-60: entered promiscuous mode
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:09.036 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8f494075-60, col_values=(('external_ids', {'iface-id': 'a5eb523a-b004-42b7-a3f6-24b2514f40bf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:20:09Z|00375|binding|INFO|Releasing lport a5eb523a-b004-42b7-a3f6-24b2514f40bf from this chassis (sb_readonly=0)
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:09.049 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8f494075-66bf-4ce0-a765-98fd91c31199.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8f494075-66bf-4ce0-a765-98fd91c31199.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:09.050 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[948d6b09-accb-4460-a380-526f6e58d454]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:09.051 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-8f494075-66bf-4ce0-a765-98fd91c31199
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/8f494075-66bf-4ce0-a765-98fd91c31199.pid.haproxy
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 8f494075-66bf-4ce0-a765-98fd91c31199
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:20:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:09.053 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'env', 'PROCESS_TAG=haproxy-8f494075-66bf-4ce0-a765-98fd91c31199', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8f494075-66bf-4ce0-a765-98fd91c31199.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:20:09 np0005466012 podman[234723]: 2025-10-02 12:20:09.392972811 +0000 UTC m=+0.027686646 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.501 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407609.5010452, 84dd40d6-c9ef-4126-9f71-24a269d9f5f3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.501 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] VM Started (Lifecycle Event)#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.503 2 DEBUG nova.compute.manager [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.506 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.508 2 INFO nova.virt.libvirt.driver [-] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Instance spawned successfully.#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.508 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.524 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.528 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.532 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.532 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.533 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.533 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.533 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.534 2 DEBUG nova.virt.libvirt.driver [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.560 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.560 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407609.5032532, 84dd40d6-c9ef-4126-9f71-24a269d9f5f3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.560 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.586 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.588 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407609.5059857, 84dd40d6-c9ef-4126-9f71-24a269d9f5f3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.588 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.618 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.620 2 INFO nova.compute.manager [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Took 6.42 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.621 2 DEBUG nova.compute.manager [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.622 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.676 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.717 2 INFO nova.compute.manager [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Took 6.93 seconds to build instance.#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.739 2 DEBUG oslo_concurrency.lockutils [None req-71be97c0-04d3-4983-bfab-80f735eaae11 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.047s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.827 2 DEBUG nova.network.neutron [req-90865bc6-e192-4d06-a7be-b127fe4ff536 req-0d919abd-7e65-4447-9637-575d7da34499 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Updated VIF entry in instance network info cache for port dbc66205-3b88-45f4-93d3-e55042c4e27a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.827 2 DEBUG nova.network.neutron [req-90865bc6-e192-4d06-a7be-b127fe4ff536 req-0d919abd-7e65-4447-9637-575d7da34499 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Updating instance_info_cache with network_info: [{"id": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "address": "fa:16:3e:b7:2b:8c", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbc66205-3b", "ovs_interfaceid": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:09 np0005466012 nova_compute[192063]: 2025-10-02 12:20:09.847 2 DEBUG oslo_concurrency.lockutils [req-90865bc6-e192-4d06-a7be-b127fe4ff536 req-0d919abd-7e65-4447-9637-575d7da34499 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-84dd40d6-c9ef-4126-9f71-24a269d9f5f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:09 np0005466012 podman[234723]: 2025-10-02 12:20:09.924416575 +0000 UTC m=+0.559130340 container create 20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  2 08:20:09 np0005466012 systemd[1]: Started libpod-conmon-20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4.scope.
Oct  2 08:20:09 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:20:09 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/715031656dee353963dc2c63b6c5f76b092424fc242ec44602ce0033f2df6147/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:20:10 np0005466012 podman[234723]: 2025-10-02 12:20:10.109334247 +0000 UTC m=+0.744048062 container init 20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:20:10 np0005466012 podman[234723]: 2025-10-02 12:20:10.114928407 +0000 UTC m=+0.749642222 container start 20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:20:10 np0005466012 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[234737]: [NOTICE]   (234741) : New worker (234743) forked
Oct  2 08:20:10 np0005466012 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[234737]: [NOTICE]   (234741) : Loading success.
Oct  2 08:20:10 np0005466012 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000066.scope: Deactivated successfully.
Oct  2 08:20:10 np0005466012 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000066.scope: Consumed 12.357s CPU time.
Oct  2 08:20:10 np0005466012 systemd-machined[152114]: Machine qemu-44-instance-00000066 terminated.
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.344 2 DEBUG oslo_concurrency.lockutils [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.345 2 DEBUG oslo_concurrency.lockutils [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.345 2 DEBUG oslo_concurrency.lockutils [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.345 2 DEBUG oslo_concurrency.lockutils [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.345 2 DEBUG oslo_concurrency.lockutils [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.355 2 INFO nova.compute.manager [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Terminating instance#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.379 2 DEBUG nova.compute.manager [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:20:10 np0005466012 kernel: tapdbc66205-3b (unregistering): left promiscuous mode
Oct  2 08:20:10 np0005466012 NetworkManager[51207]: <info>  [1759407610.4624] device (tapdbc66205-3b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:20:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:20:10Z|00376|binding|INFO|Releasing lport dbc66205-3b88-45f4-93d3-e55042c4e27a from this chassis (sb_readonly=0)
Oct  2 08:20:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:20:10Z|00377|binding|INFO|Setting lport dbc66205-3b88-45f4-93d3-e55042c4e27a down in Southbound
Oct  2 08:20:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:20:10Z|00378|binding|INFO|Removing iface tapdbc66205-3b ovn-installed in OVS
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:10.488 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:2b:8c 10.100.0.11'], port_security=['fa:16:3e:b7:2b:8c 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '84dd40d6-c9ef-4126-9f71-24a269d9f5f3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f494075-66bf-4ce0-a765-98fd91c31199', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0c8c8a8631b4721beed577a99f8bdb7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'eb030dcc-72ea-4850-916a-e1df7c4d9a87', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e43b5827-85bf-4b83-b921-ec45e12f1f2e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=dbc66205-3b88-45f4-93d3-e55042c4e27a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:10.489 103246 INFO neutron.agent.ovn.metadata.agent [-] Port dbc66205-3b88-45f4-93d3-e55042c4e27a in datapath 8f494075-66bf-4ce0-a765-98fd91c31199 unbound from our chassis#033[00m
Oct  2 08:20:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:10.491 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8f494075-66bf-4ce0-a765-98fd91c31199, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:20:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:10.492 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2183af6a-4bd0-4cbd-97ef-d83c1b94aaac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:10.492 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 namespace which is not needed anymore#033[00m
Oct  2 08:20:10 np0005466012 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000069.scope: Deactivated successfully.
Oct  2 08:20:10 np0005466012 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000069.scope: Consumed 1.395s CPU time.
Oct  2 08:20:10 np0005466012 systemd-machined[152114]: Machine qemu-45-instance-00000069 terminated.
Oct  2 08:20:10 np0005466012 NetworkManager[51207]: <info>  [1759407610.5934] manager: (tapdbc66205-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/171)
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.632 2 INFO nova.virt.libvirt.driver [-] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Instance destroyed successfully.#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.633 2 DEBUG nova.objects.instance [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lazy-loading 'resources' on Instance uuid 84dd40d6-c9ef-4126-9f71-24a269d9f5f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.681 2 DEBUG nova.virt.libvirt.vif [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:20:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-738753080',display_name='tempest-ServersNegativeTestJSON-server-738753080',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-738753080',id=105,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:20:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f0c8c8a8631b4721beed577a99f8bdb7',ramdisk_id='',reservation_id='r-5s1z3elh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-114354241',owner_user_name='tempest-ServersNegativeTestJSON-114354241-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:20:09Z,user_data=None,user_id='a803afe9939346088252c3b944f124f2',uuid=84dd40d6-c9ef-4126-9f71-24a269d9f5f3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "address": "fa:16:3e:b7:2b:8c", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbc66205-3b", "ovs_interfaceid": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.681 2 DEBUG nova.network.os_vif_util [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converting VIF {"id": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "address": "fa:16:3e:b7:2b:8c", "network": {"id": "8f494075-66bf-4ce0-a765-98fd91c31199", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1553125421-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0c8c8a8631b4721beed577a99f8bdb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbc66205-3b", "ovs_interfaceid": "dbc66205-3b88-45f4-93d3-e55042c4e27a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.682 2 DEBUG nova.network.os_vif_util [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:2b:8c,bridge_name='br-int',has_traffic_filtering=True,id=dbc66205-3b88-45f4-93d3-e55042c4e27a,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbc66205-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.682 2 DEBUG os_vif [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:2b:8c,bridge_name='br-int',has_traffic_filtering=True,id=dbc66205-3b88-45f4-93d3-e55042c4e27a,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbc66205-3b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.684 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdbc66205-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.689 2 INFO os_vif [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:2b:8c,bridge_name='br-int',has_traffic_filtering=True,id=dbc66205-3b88-45f4-93d3-e55042c4e27a,network=Network(8f494075-66bf-4ce0-a765-98fd91c31199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbc66205-3b')#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.689 2 INFO nova.virt.libvirt.driver [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Deleting instance files /var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3_del#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.690 2 INFO nova.virt.libvirt.driver [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Deletion of /var/lib/nova/instances/84dd40d6-c9ef-4126-9f71-24a269d9f5f3_del complete#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.703 2 INFO nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.707 2 INFO nova.virt.libvirt.driver [-] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Instance destroyed successfully.#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.711 2 INFO nova.virt.libvirt.driver [-] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Instance destroyed successfully.#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.712 2 INFO nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Deleting instance files /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff_del#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.712 2 INFO nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Deletion of /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff_del complete#033[00m
Oct  2 08:20:10 np0005466012 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[234737]: [NOTICE]   (234741) : haproxy version is 2.8.14-c23fe91
Oct  2 08:20:10 np0005466012 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[234737]: [NOTICE]   (234741) : path to executable is /usr/sbin/haproxy
Oct  2 08:20:10 np0005466012 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[234737]: [WARNING]  (234741) : Exiting Master process...
Oct  2 08:20:10 np0005466012 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[234737]: [WARNING]  (234741) : Exiting Master process...
Oct  2 08:20:10 np0005466012 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[234737]: [ALERT]    (234741) : Current worker (234743) exited with code 143 (Terminated)
Oct  2 08:20:10 np0005466012 neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199[234737]: [WARNING]  (234741) : All workers exited. Exiting... (0)
Oct  2 08:20:10 np0005466012 systemd[1]: libpod-20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4.scope: Deactivated successfully.
Oct  2 08:20:10 np0005466012 podman[234783]: 2025-10-02 12:20:10.729724685 +0000 UTC m=+0.162342654 container died 20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.853 2 INFO nova.compute.manager [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.854 2 DEBUG oslo.service.loopingcall [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.855 2 DEBUG nova.compute.manager [-] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:20:10 np0005466012 nova_compute[192063]: 2025-10-02 12:20:10.855 2 DEBUG nova.network.neutron [-] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:20:10 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4-userdata-shm.mount: Deactivated successfully.
Oct  2 08:20:10 np0005466012 systemd[1]: var-lib-containers-storage-overlay-715031656dee353963dc2c63b6c5f76b092424fc242ec44602ce0033f2df6147-merged.mount: Deactivated successfully.
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.056 2 DEBUG nova.compute.manager [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Received event network-vif-plugged-dbc66205-3b88-45f4-93d3-e55042c4e27a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.056 2 DEBUG oslo_concurrency.lockutils [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.057 2 DEBUG oslo_concurrency.lockutils [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.057 2 DEBUG oslo_concurrency.lockutils [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.057 2 DEBUG nova.compute.manager [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] No waiting events found dispatching network-vif-plugged-dbc66205-3b88-45f4-93d3-e55042c4e27a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.057 2 WARNING nova.compute.manager [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Received unexpected event network-vif-plugged-dbc66205-3b88-45f4-93d3-e55042c4e27a for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.057 2 DEBUG nova.compute.manager [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Received event network-vif-unplugged-dbc66205-3b88-45f4-93d3-e55042c4e27a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.058 2 DEBUG oslo_concurrency.lockutils [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.058 2 DEBUG oslo_concurrency.lockutils [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.058 2 DEBUG oslo_concurrency.lockutils [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.058 2 DEBUG nova.compute.manager [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] No waiting events found dispatching network-vif-unplugged-dbc66205-3b88-45f4-93d3-e55042c4e27a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.059 2 DEBUG nova.compute.manager [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Received event network-vif-unplugged-dbc66205-3b88-45f4-93d3-e55042c4e27a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.059 2 DEBUG nova.compute.manager [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Received event network-vif-plugged-dbc66205-3b88-45f4-93d3-e55042c4e27a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.059 2 DEBUG oslo_concurrency.lockutils [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.059 2 DEBUG oslo_concurrency.lockutils [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.059 2 DEBUG oslo_concurrency.lockutils [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.060 2 DEBUG nova.compute.manager [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] No waiting events found dispatching network-vif-plugged-dbc66205-3b88-45f4-93d3-e55042c4e27a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.060 2 WARNING nova.compute.manager [req-ccf1d347-774f-4dab-8be2-4f0e2efb6457 req-76da985d-dd7a-4719-bcec-f748f3c281ce 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Received unexpected event network-vif-plugged-dbc66205-3b88-45f4-93d3-e55042c4e27a for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:20:11 np0005466012 podman[234783]: 2025-10-02 12:20:11.146883847 +0000 UTC m=+0.579501856 container cleanup 20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:20:11 np0005466012 systemd[1]: libpod-conmon-20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4.scope: Deactivated successfully.
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.322 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.323 2 INFO nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Creating image(s)#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.324 2 DEBUG oslo_concurrency.lockutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquiring lock "/var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.324 2 DEBUG oslo_concurrency.lockutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "/var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.325 2 DEBUG oslo_concurrency.lockutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "/var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.336 2 DEBUG oslo_concurrency.processutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.392 2 DEBUG oslo_concurrency.processutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.393 2 DEBUG oslo_concurrency.lockutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquiring lock "d7f074efa852dc950deac120296f6eecf48a40d2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.394 2 DEBUG oslo_concurrency.lockutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.420 2 DEBUG oslo_concurrency.processutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.474 2 DEBUG oslo_concurrency.processutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.476 2 DEBUG oslo_concurrency.processutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:11 np0005466012 podman[234830]: 2025-10-02 12:20:11.547219056 +0000 UTC m=+0.366576290 container remove 20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.563 2 DEBUG oslo_concurrency.processutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk 1073741824" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:11.563 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[db6275b6-6fab-4a05-a58c-958f28879a8e]: (4, ('Thu Oct  2 12:20:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 (20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4)\n20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4\nThu Oct  2 12:20:11 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 (20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4)\n20442640426ff09aac5ff130c17ce64e6d118235a0495de3821fd738440313d4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.564 2 DEBUG oslo_concurrency.lockutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.564 2 DEBUG oslo_concurrency.processutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:11.565 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d878180d-0295-48a3-9f73-24bbbbdcaf21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:11.566 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f494075-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:20:11 np0005466012 kernel: tap8f494075-60: left promiscuous mode
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:11.588 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[01f76357-5e86-46e1-a727-5a91f7d8a9fa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:11.613 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[28a363c7-8c51-4f74-889e-dc525dd75e3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:11.614 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d56e910c-4ea0-45e5-9aae-a137ad713bc3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.625 2 DEBUG oslo_concurrency.processutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.626 2 DEBUG nova.virt.disk.api [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Checking if we can resize image /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.626 2 DEBUG oslo_concurrency.processutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:11.632 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7c6ad9ca-a53f-47e3-aae9-f014a7cb8bee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560244, 'reachable_time': 20010, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234856, 'error': None, 'target': 'ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:11 np0005466012 systemd[1]: run-netns-ovnmeta\x2d8f494075\x2d66bf\x2d4ce0\x2da765\x2d98fd91c31199.mount: Deactivated successfully.
Oct  2 08:20:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:11.636 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8f494075-66bf-4ce0-a765-98fd91c31199 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:20:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:11.636 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[42102b0d-3695-4809-b100-03185baf7d9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.679 2 DEBUG oslo_concurrency.processutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.680 2 DEBUG nova.virt.disk.api [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Cannot resize image /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.680 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.680 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Ensure instance console log exists: /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.681 2 DEBUG oslo_concurrency.lockutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.681 2 DEBUG oslo_concurrency.lockutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.681 2 DEBUG oslo_concurrency.lockutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.683 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.688 2 WARNING nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.694 2 DEBUG nova.virt.libvirt.host [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.695 2 DEBUG nova.virt.libvirt.host [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.698 2 DEBUG nova.virt.libvirt.host [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.698 2 DEBUG nova.virt.libvirt.host [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.700 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.700 2 DEBUG nova.virt.hardware [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.700 2 DEBUG nova.virt.hardware [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.701 2 DEBUG nova.virt.hardware [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.701 2 DEBUG nova.virt.hardware [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.701 2 DEBUG nova.virt.hardware [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.701 2 DEBUG nova.virt.hardware [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.702 2 DEBUG nova.virt.hardware [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.702 2 DEBUG nova.virt.hardware [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.702 2 DEBUG nova.virt.hardware [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.702 2 DEBUG nova.virt.hardware [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.702 2 DEBUG nova.virt.hardware [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.703 2 DEBUG nova.objects.instance [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lazy-loading 'vcpu_model' on Instance uuid a782659d-957b-4570-b4ff-74461541a3ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.735 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  <uuid>a782659d-957b-4570-b4ff-74461541a3ff</uuid>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  <name>instance-00000066</name>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerShowV247Test-server-94123705</nova:name>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:20:11</nova:creationTime>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:20:11 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:        <nova:user uuid="e7efd391ff484c8bb99570302eacb8f4">tempest-ServerShowV247Test-1215164495-project-member</nova:user>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:        <nova:project uuid="12599487474040b285ccdd017a8c01b5">tempest-ServerShowV247Test-1215164495</nova:project>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="062d9f80-76b6-42ce-bee7-0fb82a008353"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <nova:ports/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <entry name="serial">a782659d-957b-4570-b4ff-74461541a3ff</entry>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <entry name="uuid">a782659d-957b-4570-b4ff-74461541a3ff</entry>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.config"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/console.log" append="off"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:20:11 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:20:11 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:20:11 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:20:11 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.952 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.953 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:11 np0005466012 nova_compute[192063]: 2025-10-02 12:20:11.954 2 INFO nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Using config drive#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.052 2 DEBUG nova.objects.instance [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lazy-loading 'ec2_ids' on Instance uuid a782659d-957b-4570-b4ff-74461541a3ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.081 2 DEBUG nova.network.neutron [-] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.202 2 DEBUG nova.objects.instance [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lazy-loading 'keypairs' on Instance uuid a782659d-957b-4570-b4ff-74461541a3ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.215 2 INFO nova.compute.manager [-] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Took 1.36 seconds to deallocate network for instance.#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.251 2 DEBUG nova.compute.manager [req-81771cc3-7a4f-4031-b876-ee0bce98b3e6 req-23693359-f9c9-4807-a740-c8316372c6c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Received event network-vif-deleted-dbc66205-3b88-45f4-93d3-e55042c4e27a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.252 2 INFO nova.compute.manager [req-81771cc3-7a4f-4031-b876-ee0bce98b3e6 req-23693359-f9c9-4807-a740-c8316372c6c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Neutron deleted interface dbc66205-3b88-45f4-93d3-e55042c4e27a; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.253 2 DEBUG nova.network.neutron [req-81771cc3-7a4f-4031-b876-ee0bce98b3e6 req-23693359-f9c9-4807-a740-c8316372c6c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.308 2 DEBUG nova.compute.manager [req-81771cc3-7a4f-4031-b876-ee0bce98b3e6 req-23693359-f9c9-4807-a740-c8316372c6c9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Detach interface failed, port_id=dbc66205-3b88-45f4-93d3-e55042c4e27a, reason: Instance 84dd40d6-c9ef-4126-9f71-24a269d9f5f3 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.339 2 DEBUG oslo_concurrency.lockutils [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.340 2 DEBUG oslo_concurrency.lockutils [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.421 2 DEBUG nova.compute.provider_tree [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.435 2 DEBUG nova.scheduler.client.report [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.459 2 DEBUG oslo_concurrency.lockutils [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.484 2 INFO nova.scheduler.client.report [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Deleted allocations for instance 84dd40d6-c9ef-4126-9f71-24a269d9f5f3#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.511 2 INFO nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Creating config drive at /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.config#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.518 2 DEBUG oslo_concurrency.processutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphdgyqmzz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.561 2 DEBUG oslo_concurrency.lockutils [None req-d92e1389-3450-4f5b-9892-08822bf5a209 a803afe9939346088252c3b944f124f2 f0c8c8a8631b4721beed577a99f8bdb7 - - default default] Lock "84dd40d6-c9ef-4126-9f71-24a269d9f5f3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:12 np0005466012 nova_compute[192063]: 2025-10-02 12:20:12.649 2 DEBUG oslo_concurrency.processutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphdgyqmzz" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:12 np0005466012 systemd-machined[152114]: New machine qemu-46-instance-00000066.
Oct  2 08:20:12 np0005466012 systemd[1]: Started Virtual Machine qemu-46-instance-00000066.
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.484 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for a782659d-957b-4570-b4ff-74461541a3ff due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.485 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407613.4840147, a782659d-957b-4570-b4ff-74461541a3ff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.485 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] VM Resumed (Lifecycle Event)
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.487 2 DEBUG nova.compute.manager [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.487 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.491 2 INFO nova.virt.libvirt.driver [-] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Instance spawned successfully.
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.492 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.508 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.512 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.520 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.520 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.521 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.521 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.521 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.522 2 DEBUG nova.virt.libvirt.driver [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.540 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.540 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407613.4849298, a782659d-957b-4570-b4ff-74461541a3ff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.540 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] VM Started (Lifecycle Event)
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.567 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.571 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.591 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.597 2 DEBUG nova.compute.manager [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.683 2 DEBUG oslo_concurrency.lockutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.683 2 DEBUG oslo_concurrency.lockutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.684 2 DEBUG nova.objects.instance [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct  2 08:20:13 np0005466012 nova_compute[192063]: 2025-10-02 12:20:13.766 2 DEBUG oslo_concurrency.lockutils [None req-8ee30f13-ef0e-44f5-b7a8-3d58b5d6b7f3 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:15 np0005466012 nova_compute[192063]: 2025-10-02 12:20:15.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:16 np0005466012 nova_compute[192063]: 2025-10-02 12:20:16.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:16 np0005466012 nova_compute[192063]: 2025-10-02 12:20:16.565 2 DEBUG oslo_concurrency.lockutils [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquiring lock "a782659d-957b-4570-b4ff-74461541a3ff" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:16 np0005466012 nova_compute[192063]: 2025-10-02 12:20:16.565 2 DEBUG oslo_concurrency.lockutils [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "a782659d-957b-4570-b4ff-74461541a3ff" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:16 np0005466012 nova_compute[192063]: 2025-10-02 12:20:16.566 2 DEBUG oslo_concurrency.lockutils [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquiring lock "a782659d-957b-4570-b4ff-74461541a3ff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:16 np0005466012 nova_compute[192063]: 2025-10-02 12:20:16.566 2 DEBUG oslo_concurrency.lockutils [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "a782659d-957b-4570-b4ff-74461541a3ff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:16 np0005466012 nova_compute[192063]: 2025-10-02 12:20:16.566 2 DEBUG oslo_concurrency.lockutils [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "a782659d-957b-4570-b4ff-74461541a3ff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:16 np0005466012 nova_compute[192063]: 2025-10-02 12:20:16.578 2 INFO nova.compute.manager [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Terminating instance
Oct  2 08:20:16 np0005466012 nova_compute[192063]: 2025-10-02 12:20:16.589 2 DEBUG oslo_concurrency.lockutils [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquiring lock "refresh_cache-a782659d-957b-4570-b4ff-74461541a3ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:20:16 np0005466012 nova_compute[192063]: 2025-10-02 12:20:16.589 2 DEBUG oslo_concurrency.lockutils [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquired lock "refresh_cache-a782659d-957b-4570-b4ff-74461541a3ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:20:16 np0005466012 nova_compute[192063]: 2025-10-02 12:20:16.589 2 DEBUG nova.network.neutron [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:20:17 np0005466012 nova_compute[192063]: 2025-10-02 12:20:17.251 2 DEBUG nova.network.neutron [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:20:17 np0005466012 nova_compute[192063]: 2025-10-02 12:20:17.561 2 DEBUG nova.network.neutron [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:20:17 np0005466012 nova_compute[192063]: 2025-10-02 12:20:17.574 2 DEBUG oslo_concurrency.lockutils [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Releasing lock "refresh_cache-a782659d-957b-4570-b4ff-74461541a3ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:20:17 np0005466012 nova_compute[192063]: 2025-10-02 12:20:17.575 2 DEBUG nova.compute.manager [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:20:17 np0005466012 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000066.scope: Deactivated successfully.
Oct  2 08:20:17 np0005466012 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000066.scope: Consumed 4.809s CPU time.
Oct  2 08:20:17 np0005466012 systemd-machined[152114]: Machine qemu-46-instance-00000066 terminated.
Oct  2 08:20:17 np0005466012 nova_compute[192063]: 2025-10-02 12:20:17.827 2 INFO nova.virt.libvirt.driver [-] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Instance destroyed successfully.
Oct  2 08:20:17 np0005466012 nova_compute[192063]: 2025-10-02 12:20:17.828 2 DEBUG nova.objects.instance [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lazy-loading 'resources' on Instance uuid a782659d-957b-4570-b4ff-74461541a3ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:20:17 np0005466012 nova_compute[192063]: 2025-10-02 12:20:17.861 2 INFO nova.virt.libvirt.driver [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Deleting instance files /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff_del
Oct  2 08:20:17 np0005466012 nova_compute[192063]: 2025-10-02 12:20:17.862 2 INFO nova.virt.libvirt.driver [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Deletion of /var/lib/nova/instances/a782659d-957b-4570-b4ff-74461541a3ff_del complete
Oct  2 08:20:17 np0005466012 nova_compute[192063]: 2025-10-02 12:20:17.950 2 INFO nova.compute.manager [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Took 0.38 seconds to destroy the instance on the hypervisor.
Oct  2 08:20:17 np0005466012 nova_compute[192063]: 2025-10-02 12:20:17.951 2 DEBUG oslo.service.loopingcall [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct  2 08:20:17 np0005466012 nova_compute[192063]: 2025-10-02 12:20:17.952 2 DEBUG nova.compute.manager [-] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct  2 08:20:17 np0005466012 nova_compute[192063]: 2025-10-02 12:20:17.952 2 DEBUG nova.network.neutron [-] [instance: a782659d-957b-4570-b4ff-74461541a3ff] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct  2 08:20:18 np0005466012 nova_compute[192063]: 2025-10-02 12:20:18.068 2 DEBUG nova.network.neutron [-] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:20:18 np0005466012 nova_compute[192063]: 2025-10-02 12:20:18.083 2 DEBUG nova.network.neutron [-] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:20:18 np0005466012 nova_compute[192063]: 2025-10-02 12:20:18.096 2 INFO nova.compute.manager [-] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Took 0.14 seconds to deallocate network for instance.
Oct  2 08:20:18 np0005466012 nova_compute[192063]: 2025-10-02 12:20:18.154 2 DEBUG oslo_concurrency.lockutils [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:18 np0005466012 nova_compute[192063]: 2025-10-02 12:20:18.154 2 DEBUG oslo_concurrency.lockutils [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:18 np0005466012 nova_compute[192063]: 2025-10-02 12:20:18.290 2 DEBUG nova.compute.provider_tree [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:20:18 np0005466012 nova_compute[192063]: 2025-10-02 12:20:18.304 2 DEBUG nova.scheduler.client.report [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:20:18 np0005466012 nova_compute[192063]: 2025-10-02 12:20:18.324 2 DEBUG oslo_concurrency.lockutils [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:18 np0005466012 nova_compute[192063]: 2025-10-02 12:20:18.405 2 INFO nova.scheduler.client.report [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Deleted allocations for instance a782659d-957b-4570-b4ff-74461541a3ff
Oct  2 08:20:18 np0005466012 nova_compute[192063]: 2025-10-02 12:20:18.470 2 DEBUG oslo_concurrency.lockutils [None req-47453537-258a-4505-9a9b-10bd20db5f87 e7efd391ff484c8bb99570302eacb8f4 12599487474040b285ccdd017a8c01b5 - - default default] Lock "a782659d-957b-4570-b4ff-74461541a3ff" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.904s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:20 np0005466012 podman[234899]: 2025-10-02 12:20:20.141989185 +0000 UTC m=+0.056670769 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:20:20 np0005466012 podman[234900]: 2025-10-02 12:20:20.163876754 +0000 UTC m=+0.077215489 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, managed_by=edpm_ansible)
Oct  2 08:20:20 np0005466012 nova_compute[192063]: 2025-10-02 12:20:20.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:20 np0005466012 nova_compute[192063]: 2025-10-02 12:20:20.931 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:20:21 np0005466012 nova_compute[192063]: 2025-10-02 12:20:21.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:23 np0005466012 podman[234947]: 2025-10-02 12:20:23.178419217 +0000 UTC m=+0.079904527 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  2 08:20:24 np0005466012 nova_compute[192063]: 2025-10-02 12:20:24.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:20:25 np0005466012 podman[234967]: 2025-10-02 12:20:25.128283902 +0000 UTC m=+0.051385387 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.631 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407610.630457, 84dd40d6-c9ef-4126-9f71-24a269d9f5f3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.631 2 INFO nova.compute.manager [-] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] VM Stopped (Lifecycle Event)
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.670 2 DEBUG nova.compute.manager [None req-d1dabcfd-a019-4ee4-bf63-b0dd1c532981 - - - - - -] [instance: 84dd40d6-c9ef-4126-9f71-24a269d9f5f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.839 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.840 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.840 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.840 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.993 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.995 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5681MB free_disk=73.38777542114258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.996 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:25 np0005466012 nova_compute[192063]: 2025-10-02 12:20:25.996 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:26 np0005466012 nova_compute[192063]: 2025-10-02 12:20:26.059 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:20:26 np0005466012 nova_compute[192063]: 2025-10-02 12:20:26.060 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:20:26 np0005466012 nova_compute[192063]: 2025-10-02 12:20:26.184 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:26 np0005466012 nova_compute[192063]: 2025-10-02 12:20:26.200 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:26 np0005466012 nova_compute[192063]: 2025-10-02 12:20:26.227 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:20:26 np0005466012 nova_compute[192063]: 2025-10-02 12:20:26.227 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:26 np0005466012 nova_compute[192063]: 2025-10-02 12:20:26.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.224 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.244 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.244 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.339 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquiring lock "05f593e3-e62a-445f-8731-4932b97c6211" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.340 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "05f593e3-e62a-445f-8731-4932b97c6211" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.360 2 DEBUG nova.compute.manager [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.503 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.503 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.509 2 DEBUG nova.virt.hardware [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.509 2 INFO nova.compute.claims [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.624 2 DEBUG nova.compute.provider_tree [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.637 2 DEBUG nova.scheduler.client.report [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.656 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.657 2 DEBUG nova.compute.manager [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.734 2 DEBUG nova.compute.manager [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.764 2 INFO nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.784 2 DEBUG nova.compute.manager [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.905 2 DEBUG nova.compute.manager [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.906 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.907 2 INFO nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Creating image(s)#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.907 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquiring lock "/var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.908 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "/var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.908 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "/var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.927 2 DEBUG oslo_concurrency.processutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.986 2 DEBUG oslo_concurrency.processutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.987 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:27 np0005466012 nova_compute[192063]: 2025-10-02 12:20:27.988 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.004 2 DEBUG oslo_concurrency.processutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.062 2 DEBUG oslo_concurrency.processutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.064 2 DEBUG oslo_concurrency.processutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.111 2 DEBUG oslo_concurrency.processutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.112 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.113 2 DEBUG oslo_concurrency.processutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.169 2 DEBUG oslo_concurrency.processutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.171 2 DEBUG nova.virt.disk.api [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Checking if we can resize image /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.171 2 DEBUG oslo_concurrency.processutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.230 2 DEBUG oslo_concurrency.processutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.231 2 DEBUG nova.virt.disk.api [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Cannot resize image /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.231 2 DEBUG nova.objects.instance [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lazy-loading 'migration_context' on Instance uuid 05f593e3-e62a-445f-8731-4932b97c6211 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.243 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.244 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Ensure instance console log exists: /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.244 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.244 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.245 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.246 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.251 2 WARNING nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.257 2 DEBUG nova.virt.libvirt.host [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.258 2 DEBUG nova.virt.libvirt.host [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.266 2 DEBUG nova.virt.libvirt.host [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.266 2 DEBUG nova.virt.libvirt.host [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.268 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.269 2 DEBUG nova.virt.hardware [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.269 2 DEBUG nova.virt.hardware [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.270 2 DEBUG nova.virt.hardware [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.270 2 DEBUG nova.virt.hardware [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.270 2 DEBUG nova.virt.hardware [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.271 2 DEBUG nova.virt.hardware [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.271 2 DEBUG nova.virt.hardware [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.271 2 DEBUG nova.virt.hardware [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.271 2 DEBUG nova.virt.hardware [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.272 2 DEBUG nova.virt.hardware [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.272 2 DEBUG nova.virt.hardware [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.277 2 DEBUG nova.objects.instance [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lazy-loading 'pci_devices' on Instance uuid 05f593e3-e62a-445f-8731-4932b97c6211 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.294 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  <uuid>05f593e3-e62a-445f-8731-4932b97c6211</uuid>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  <name>instance-0000006d</name>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerShowV254Test-server-2087917674</nova:name>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:20:28</nova:creationTime>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:20:28 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:        <nova:user uuid="231825b6134348bdbf790910ea83fbab">tempest-ServerShowV254Test-1548670449-project-member</nova:user>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:        <nova:project uuid="a320040ed8af41c284b826572591eadf">tempest-ServerShowV254Test-1548670449</nova:project>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <nova:ports/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <entry name="serial">05f593e3-e62a-445f-8731-4932b97c6211</entry>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <entry name="uuid">05f593e3-e62a-445f-8731-4932b97c6211</entry>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.config"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/console.log" append="off"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:20:28 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:20:28 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:20:28 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:20:28 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.345 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.346 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.347 2 INFO nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Using config drive#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.637 2 INFO nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Creating config drive at /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.config#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.647 2 DEBUG oslo_concurrency.processutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa4mmk09g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.796 2 DEBUG oslo_concurrency.processutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa4mmk09g" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:28 np0005466012 nova_compute[192063]: 2025-10-02 12:20:28.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:28 np0005466012 systemd-machined[152114]: New machine qemu-47-instance-0000006d.
Oct  2 08:20:28 np0005466012 systemd[1]: Started Virtual Machine qemu-47-instance-0000006d.
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.610 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407629.6103342, 05f593e3-e62a-445f-8731-4932b97c6211 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.612 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.615 2 DEBUG nova.compute.manager [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.615 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.619 2 INFO nova.virt.libvirt.driver [-] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Instance spawned successfully.#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.620 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.639 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.644 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.649 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.649 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.649 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.650 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.650 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.650 2 DEBUG nova.virt.libvirt.driver [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.676 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.677 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407629.611598, 05f593e3-e62a-445f-8731-4932b97c6211 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.677 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] VM Started (Lifecycle Event)#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.710 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.713 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.748 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.764 2 INFO nova.compute.manager [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Took 1.86 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.765 2 DEBUG nova.compute.manager [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.846 2 INFO nova.compute.manager [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Took 2.39 seconds to build instance.#033[00m
Oct  2 08:20:29 np0005466012 nova_compute[192063]: 2025-10-02 12:20:29.863 2 DEBUG oslo_concurrency.lockutils [None req-61a6da08-5fd9-4e42-9835-494ecaf0af0b 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "05f593e3-e62a-445f-8731-4932b97c6211" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:30 np0005466012 nova_compute[192063]: 2025-10-02 12:20:30.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:31 np0005466012 nova_compute[192063]: 2025-10-02 12:20:31.126 2 INFO nova.compute.manager [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Rebuilding instance#033[00m
Oct  2 08:20:31 np0005466012 nova_compute[192063]: 2025-10-02 12:20:31.393 2 DEBUG nova.compute.manager [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:31 np0005466012 nova_compute[192063]: 2025-10-02 12:20:31.504 2 DEBUG nova.objects.instance [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lazy-loading 'pci_requests' on Instance uuid 05f593e3-e62a-445f-8731-4932b97c6211 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:31 np0005466012 nova_compute[192063]: 2025-10-02 12:20:31.531 2 DEBUG nova.objects.instance [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lazy-loading 'pci_devices' on Instance uuid 05f593e3-e62a-445f-8731-4932b97c6211 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:31 np0005466012 nova_compute[192063]: 2025-10-02 12:20:31.559 2 DEBUG nova.objects.instance [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lazy-loading 'resources' on Instance uuid 05f593e3-e62a-445f-8731-4932b97c6211 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:31 np0005466012 nova_compute[192063]: 2025-10-02 12:20:31.578 2 DEBUG nova.objects.instance [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lazy-loading 'migration_context' on Instance uuid 05f593e3-e62a-445f-8731-4932b97c6211 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:31 np0005466012 nova_compute[192063]: 2025-10-02 12:20:31.604 2 DEBUG nova.objects.instance [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:20:31 np0005466012 nova_compute[192063]: 2025-10-02 12:20:31.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:31 np0005466012 nova_compute[192063]: 2025-10-02 12:20:31.608 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:20:32 np0005466012 podman[235030]: 2025-10-02 12:20:32.156473576 +0000 UTC m=+0.069485107 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:20:32 np0005466012 podman[235031]: 2025-10-02 12:20:32.184635945 +0000 UTC m=+0.098085408 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, version=9.6, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Oct  2 08:20:32 np0005466012 nova_compute[192063]: 2025-10-02 12:20:32.823 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407617.8228967, a782659d-957b-4570-b4ff-74461541a3ff => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:32 np0005466012 nova_compute[192063]: 2025-10-02 12:20:32.824 2 INFO nova.compute.manager [-] [instance: a782659d-957b-4570-b4ff-74461541a3ff] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:20:32 np0005466012 nova_compute[192063]: 2025-10-02 12:20:32.922 2 DEBUG nova.compute.manager [None req-12a3c4a4-caaf-4a13-957c-9cc30ed20689 - - - - - -] [instance: a782659d-957b-4570-b4ff-74461541a3ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:35 np0005466012 nova_compute[192063]: 2025-10-02 12:20:35.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:36 np0005466012 nova_compute[192063]: 2025-10-02 12:20:36.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:36 np0005466012 nova_compute[192063]: 2025-10-02 12:20:36.824 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:20:36 np0005466012 nova_compute[192063]: 2025-10-02 12:20:36.825 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:20:36 np0005466012 nova_compute[192063]: 2025-10-02 12:20:36.826 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:20:36 np0005466012 nova_compute[192063]: 2025-10-02 12:20:36.846 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-05f593e3-e62a-445f-8731-4932b97c6211" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:36 np0005466012 nova_compute[192063]: 2025-10-02 12:20:36.847 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-05f593e3-e62a-445f-8731-4932b97c6211" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:36 np0005466012 nova_compute[192063]: 2025-10-02 12:20:36.848 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:20:36 np0005466012 nova_compute[192063]: 2025-10-02 12:20:36.849 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 05f593e3-e62a-445f-8731-4932b97c6211 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:37 np0005466012 podman[235069]: 2025-10-02 12:20:37.145042888 +0000 UTC m=+0.055640869 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:20:37 np0005466012 podman[235070]: 2025-10-02 12:20:37.158153204 +0000 UTC m=+0.063041471 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:20:37 np0005466012 nova_compute[192063]: 2025-10-02 12:20:37.252 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:20:38 np0005466012 nova_compute[192063]: 2025-10-02 12:20:38.267 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:38 np0005466012 nova_compute[192063]: 2025-10-02 12:20:38.292 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-05f593e3-e62a-445f-8731-4932b97c6211" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:38 np0005466012 nova_compute[192063]: 2025-10-02 12:20:38.293 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:20:40 np0005466012 nova_compute[192063]: 2025-10-02 12:20:40.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:41 np0005466012 nova_compute[192063]: 2025-10-02 12:20:41.659 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:20:41 np0005466012 nova_compute[192063]: 2025-10-02 12:20:41.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:45 np0005466012 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006d.scope: Deactivated successfully.
Oct  2 08:20:45 np0005466012 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006d.scope: Consumed 13.512s CPU time.
Oct  2 08:20:45 np0005466012 systemd-machined[152114]: Machine qemu-47-instance-0000006d terminated.
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.677 2 INFO nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Instance shutdown successfully after 14 seconds.#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.682 2 INFO nova.virt.libvirt.driver [-] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Instance destroyed successfully.#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.685 2 INFO nova.virt.libvirt.driver [-] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Instance destroyed successfully.#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.686 2 INFO nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Deleting instance files /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211_del#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.687 2 INFO nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Deletion of /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211_del complete#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.889 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.889 2 INFO nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Creating image(s)#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.890 2 DEBUG oslo_concurrency.lockutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquiring lock "/var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.890 2 DEBUG oslo_concurrency.lockutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "/var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.891 2 DEBUG oslo_concurrency.lockutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "/var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.904 2 DEBUG oslo_concurrency.processutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.961 2 DEBUG oslo_concurrency.processutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.962 2 DEBUG oslo_concurrency.lockutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquiring lock "d7f074efa852dc950deac120296f6eecf48a40d2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.962 2 DEBUG oslo_concurrency.lockutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:45 np0005466012 nova_compute[192063]: 2025-10-02 12:20:45.977 2 DEBUG oslo_concurrency.processutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.033 2 DEBUG oslo_concurrency.processutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.034 2 DEBUG oslo_concurrency.processutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.066 2 DEBUG oslo_concurrency.processutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2,backing_fmt=raw /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.066 2 DEBUG oslo_concurrency.lockutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "d7f074efa852dc950deac120296f6eecf48a40d2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.067 2 DEBUG oslo_concurrency.processutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.123 2 DEBUG oslo_concurrency.processutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.124 2 DEBUG nova.virt.disk.api [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Checking if we can resize image /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.125 2 DEBUG oslo_concurrency.processutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.182 2 DEBUG oslo_concurrency.processutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.183 2 DEBUG nova.virt.disk.api [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Cannot resize image /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.184 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.185 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Ensure instance console log exists: /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.185 2 DEBUG oslo_concurrency.lockutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.186 2 DEBUG oslo_concurrency.lockutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.186 2 DEBUG oslo_concurrency.lockutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.188 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.194 2 WARNING nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.201 2 DEBUG nova.virt.libvirt.host [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.202 2 DEBUG nova.virt.libvirt.host [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.205 2 DEBUG nova.virt.libvirt.host [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.206 2 DEBUG nova.virt.libvirt.host [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.207 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.207 2 DEBUG nova.virt.hardware [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:28Z,direct_url=<?>,disk_format='qcow2',id=062d9f80-76b6-42ce-bee7-0fb82a008353,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.207 2 DEBUG nova.virt.hardware [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.207 2 DEBUG nova.virt.hardware [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.208 2 DEBUG nova.virt.hardware [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.208 2 DEBUG nova.virt.hardware [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.208 2 DEBUG nova.virt.hardware [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.208 2 DEBUG nova.virt.hardware [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.209 2 DEBUG nova.virt.hardware [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.209 2 DEBUG nova.virt.hardware [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.209 2 DEBUG nova.virt.hardware [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.209 2 DEBUG nova.virt.hardware [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.210 2 DEBUG nova.objects.instance [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lazy-loading 'vcpu_model' on Instance uuid 05f593e3-e62a-445f-8731-4932b97c6211 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.230 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  <uuid>05f593e3-e62a-445f-8731-4932b97c6211</uuid>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  <name>instance-0000006d</name>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerShowV254Test-server-2087917674</nova:name>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:20:46</nova:creationTime>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:20:46 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:        <nova:user uuid="231825b6134348bdbf790910ea83fbab">tempest-ServerShowV254Test-1548670449-project-member</nova:user>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:        <nova:project uuid="a320040ed8af41c284b826572591eadf">tempest-ServerShowV254Test-1548670449</nova:project>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="062d9f80-76b6-42ce-bee7-0fb82a008353"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <nova:ports/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <entry name="serial">05f593e3-e62a-445f-8731-4932b97c6211</entry>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <entry name="uuid">05f593e3-e62a-445f-8731-4932b97c6211</entry>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.config"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/console.log" append="off"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:20:46 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:20:46 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:20:46 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:20:46 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.283 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.283 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.284 2 INFO nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Using config drive#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.313 2 DEBUG nova.objects.instance [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lazy-loading 'ec2_ids' on Instance uuid 05f593e3-e62a-445f-8731-4932b97c6211 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.485 2 INFO nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Creating config drive at /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.config#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.489 2 DEBUG oslo_concurrency.processutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw8os_myj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.614 2 DEBUG oslo_concurrency.processutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw8os_myj" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:20:46 np0005466012 nova_compute[192063]: 2025-10-02 12:20:46.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:46 np0005466012 systemd-machined[152114]: New machine qemu-48-instance-0000006d.
Oct  2 08:20:46 np0005466012 systemd[1]: Started Virtual Machine qemu-48-instance-0000006d.
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.616 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for 05f593e3-e62a-445f-8731-4932b97c6211 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.618 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407647.6156185, 05f593e3-e62a-445f-8731-4932b97c6211 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.619 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.624 2 DEBUG nova.compute.manager [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.625 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.629 2 INFO nova.virt.libvirt.driver [-] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Instance spawned successfully.#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.630 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.653 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.656 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.663 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.663 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.663 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.664 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.664 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.665 2 DEBUG nova.virt.libvirt.driver [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.692 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.692 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407647.6167204, 05f593e3-e62a-445f-8731-4932b97c6211 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.693 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] VM Started (Lifecycle Event)#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.724 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.727 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.795 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.808 2 DEBUG nova.compute.manager [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.948 2 DEBUG oslo_concurrency.lockutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.949 2 DEBUG oslo_concurrency.lockutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:47 np0005466012 nova_compute[192063]: 2025-10-02 12:20:47.950 2 DEBUG nova.objects.instance [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:20:48 np0005466012 nova_compute[192063]: 2025-10-02 12:20:48.020 2 DEBUG oslo_concurrency.lockutils [None req-fbc78759-685d-4ce7-8363-a94014b83519 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:49 np0005466012 nova_compute[192063]: 2025-10-02 12:20:49.535 2 DEBUG oslo_concurrency.lockutils [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquiring lock "05f593e3-e62a-445f-8731-4932b97c6211" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:49 np0005466012 nova_compute[192063]: 2025-10-02 12:20:49.535 2 DEBUG oslo_concurrency.lockutils [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "05f593e3-e62a-445f-8731-4932b97c6211" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:49 np0005466012 nova_compute[192063]: 2025-10-02 12:20:49.536 2 DEBUG oslo_concurrency.lockutils [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquiring lock "05f593e3-e62a-445f-8731-4932b97c6211-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:49 np0005466012 nova_compute[192063]: 2025-10-02 12:20:49.536 2 DEBUG oslo_concurrency.lockutils [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "05f593e3-e62a-445f-8731-4932b97c6211-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:49 np0005466012 nova_compute[192063]: 2025-10-02 12:20:49.536 2 DEBUG oslo_concurrency.lockutils [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "05f593e3-e62a-445f-8731-4932b97c6211-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:49 np0005466012 nova_compute[192063]: 2025-10-02 12:20:49.547 2 INFO nova.compute.manager [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Terminating instance#033[00m
Oct  2 08:20:49 np0005466012 nova_compute[192063]: 2025-10-02 12:20:49.558 2 DEBUG oslo_concurrency.lockutils [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquiring lock "refresh_cache-05f593e3-e62a-445f-8731-4932b97c6211" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:20:49 np0005466012 nova_compute[192063]: 2025-10-02 12:20:49.558 2 DEBUG oslo_concurrency.lockutils [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquired lock "refresh_cache-05f593e3-e62a-445f-8731-4932b97c6211" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:20:49 np0005466012 nova_compute[192063]: 2025-10-02 12:20:49.558 2 DEBUG nova.network.neutron [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:20:49 np0005466012 nova_compute[192063]: 2025-10-02 12:20:49.769 2 DEBUG nova.network.neutron [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.251 2 DEBUG nova.network.neutron [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.287 2 DEBUG oslo_concurrency.lockutils [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Releasing lock "refresh_cache-05f593e3-e62a-445f-8731-4932b97c6211" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.288 2 DEBUG nova.compute.manager [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:20:50 np0005466012 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006d.scope: Deactivated successfully.
Oct  2 08:20:50 np0005466012 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006d.scope: Consumed 3.454s CPU time.
Oct  2 08:20:50 np0005466012 systemd-machined[152114]: Machine qemu-48-instance-0000006d terminated.
Oct  2 08:20:50 np0005466012 podman[235179]: 2025-10-02 12:20:50.399145883 +0000 UTC m=+0.067014826 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:20:50 np0005466012 podman[235180]: 2025-10-02 12:20:50.444498766 +0000 UTC m=+0.110233377 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.551 2 INFO nova.virt.libvirt.driver [-] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Instance destroyed successfully.#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.552 2 DEBUG nova.objects.instance [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lazy-loading 'resources' on Instance uuid 05f593e3-e62a-445f-8731-4932b97c6211 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.571 2 INFO nova.virt.libvirt.driver [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Deleting instance files /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211_del#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.572 2 INFO nova.virt.libvirt.driver [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Deletion of /var/lib/nova/instances/05f593e3-e62a-445f-8731-4932b97c6211_del complete#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.637 2 INFO nova.compute.manager [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.638 2 DEBUG oslo.service.loopingcall [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.639 2 DEBUG nova.compute.manager [-] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.639 2 DEBUG nova.network.neutron [-] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.785 2 DEBUG nova.network.neutron [-] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.799 2 DEBUG nova.network.neutron [-] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.814 2 INFO nova.compute.manager [-] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Took 0.18 seconds to deallocate network for instance.#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.888 2 DEBUG oslo_concurrency.lockutils [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.889 2 DEBUG oslo_concurrency.lockutils [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.953 2 DEBUG nova.compute.provider_tree [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:20:50 np0005466012 nova_compute[192063]: 2025-10-02 12:20:50.969 2 DEBUG nova.scheduler.client.report [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:20:51 np0005466012 nova_compute[192063]: 2025-10-02 12:20:51.009 2 DEBUG oslo_concurrency.lockutils [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:51 np0005466012 nova_compute[192063]: 2025-10-02 12:20:51.041 2 INFO nova.scheduler.client.report [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Deleted allocations for instance 05f593e3-e62a-445f-8731-4932b97c6211#033[00m
Oct  2 08:20:51 np0005466012 nova_compute[192063]: 2025-10-02 12:20:51.179 2 DEBUG oslo_concurrency.lockutils [None req-bc55a2ff-285c-47e7-8ca4-0b830784b30c 231825b6134348bdbf790910ea83fbab a320040ed8af41c284b826572591eadf - - default default] Lock "05f593e3-e62a-445f-8731-4932b97c6211" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:20:51 np0005466012 nova_compute[192063]: 2025-10-02 12:20:51.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:53.463 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:20:53 np0005466012 nova_compute[192063]: 2025-10-02 12:20:53.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:20:53.464 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:20:54 np0005466012 podman[235237]: 2025-10-02 12:20:54.151494988 +0000 UTC m=+0.058470961 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:20:55 np0005466012 nova_compute[192063]: 2025-10-02 12:20:55.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:20:56 np0005466012 podman[235256]: 2025-10-02 12:20:56.142672279 +0000 UTC m=+0.060625093 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:20:56 np0005466012 nova_compute[192063]: 2025-10-02 12:20:56.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:00 np0005466012 nova_compute[192063]: 2025-10-02 12:21:00.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:01.466 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:01 np0005466012 nova_compute[192063]: 2025-10-02 12:21:01.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:02.131 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:02.132 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:02.132 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:03 np0005466012 podman[235277]: 2025-10-02 12:21:03.171664766 +0000 UTC m=+0.077579640 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:21:03 np0005466012 podman[235278]: 2025-10-02 12:21:03.204615793 +0000 UTC m=+0.108091886 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container)
Oct  2 08:21:05 np0005466012 nova_compute[192063]: 2025-10-02 12:21:05.551 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407650.5495775, 05f593e3-e62a-445f-8731-4932b97c6211 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:05 np0005466012 nova_compute[192063]: 2025-10-02 12:21:05.552 2 INFO nova.compute.manager [-] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:21:05 np0005466012 nova_compute[192063]: 2025-10-02 12:21:05.590 2 DEBUG nova.compute.manager [None req-63d7a27a-dc57-4fdb-ac68-fd8e2ef7936a - - - - - -] [instance: 05f593e3-e62a-445f-8731-4932b97c6211] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:05 np0005466012 nova_compute[192063]: 2025-10-02 12:21:05.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:06 np0005466012 nova_compute[192063]: 2025-10-02 12:21:06.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:08 np0005466012 podman[235321]: 2025-10-02 12:21:08.129429802 +0000 UTC m=+0.047678960 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:21:08 np0005466012 podman[235320]: 2025-10-02 12:21:08.137777892 +0000 UTC m=+0.056868164 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3)
Oct  2 08:21:10 np0005466012 nova_compute[192063]: 2025-10-02 12:21:10.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:11 np0005466012 nova_compute[192063]: 2025-10-02 12:21:11.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:15 np0005466012 nova_compute[192063]: 2025-10-02 12:21:15.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:16 np0005466012 nova_compute[192063]: 2025-10-02 12:21:16.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.923 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.926 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.926 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.926 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.926 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:21:16.926 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:21:20 np0005466012 nova_compute[192063]: 2025-10-02 12:21:20.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:21 np0005466012 podman[235366]: 2025-10-02 12:21:21.140673002 +0000 UTC m=+0.058731498 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:21:21 np0005466012 podman[235367]: 2025-10-02 12:21:21.179641181 +0000 UTC m=+0.094703280 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.383 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.384 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.401 2 DEBUG nova.compute.manager [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.526 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.527 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.534 2 DEBUG nova.virt.hardware [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.534 2 INFO nova.compute.claims [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.701 2 DEBUG nova.compute.provider_tree [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.739 2 DEBUG nova.scheduler.client.report [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.777 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.778 2 DEBUG nova.compute.manager [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.839 2 DEBUG nova.compute.manager [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.840 2 DEBUG nova.network.neutron [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.860 2 INFO nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:21:21 np0005466012 nova_compute[192063]: 2025-10-02 12:21:21.879 2 DEBUG nova.compute.manager [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.054 2 DEBUG nova.compute.manager [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.056 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.056 2 INFO nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Creating image(s)#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.057 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.057 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.058 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.075 2 DEBUG oslo_concurrency.processutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.107 2 DEBUG nova.policy [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.147 2 DEBUG oslo_concurrency.processutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.148 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.149 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.172 2 DEBUG oslo_concurrency.processutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.263 2 DEBUG oslo_concurrency.processutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.265 2 DEBUG oslo_concurrency.processutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.301 2 DEBUG oslo_concurrency.processutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.302 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.302 2 DEBUG oslo_concurrency.processutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.360 2 DEBUG oslo_concurrency.processutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.361 2 DEBUG nova.virt.disk.api [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Checking if we can resize image /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.362 2 DEBUG oslo_concurrency.processutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.428 2 DEBUG oslo_concurrency.processutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.429 2 DEBUG nova.virt.disk.api [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Cannot resize image /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.430 2 DEBUG nova.objects.instance [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'migration_context' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.445 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.445 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Ensure instance console log exists: /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.446 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.446 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:22 np0005466012 nova_compute[192063]: 2025-10-02 12:21:22.446 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:23 np0005466012 nova_compute[192063]: 2025-10-02 12:21:23.393 2 DEBUG nova.network.neutron [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Successfully created port: d1031883-2135-4183-8a9d-0609c32ad14b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:21:24 np0005466012 nova_compute[192063]: 2025-10-02 12:21:24.658 2 DEBUG nova.network.neutron [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Successfully updated port: d1031883-2135-4183-8a9d-0609c32ad14b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:21:24 np0005466012 nova_compute[192063]: 2025-10-02 12:21:24.694 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:24 np0005466012 nova_compute[192063]: 2025-10-02 12:21:24.695 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquired lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:24 np0005466012 nova_compute[192063]: 2025-10-02 12:21:24.695 2 DEBUG nova.network.neutron [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:21:24 np0005466012 nova_compute[192063]: 2025-10-02 12:21:24.886 2 DEBUG nova.network.neutron [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:21:24 np0005466012 nova_compute[192063]: 2025-10-02 12:21:24.912 2 DEBUG nova.compute.manager [req-c81c6841-866d-483b-8c74-f53db32ca1b3 req-82ee7d1a-9924-47df-88c6-c30a51cfe5eb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-changed-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:21:24 np0005466012 nova_compute[192063]: 2025-10-02 12:21:24.912 2 DEBUG nova.compute.manager [req-c81c6841-866d-483b-8c74-f53db32ca1b3 req-82ee7d1a-9924-47df-88c6-c30a51cfe5eb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Refreshing instance network info cache due to event network-changed-d1031883-2135-4183-8a9d-0609c32ad14b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:21:24 np0005466012 nova_compute[192063]: 2025-10-02 12:21:24.913 2 DEBUG oslo_concurrency.lockutils [req-c81c6841-866d-483b-8c74-f53db32ca1b3 req-82ee7d1a-9924-47df-88c6-c30a51cfe5eb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:25 np0005466012 podman[235427]: 2025-10-02 12:21:25.150774331 +0000 UTC m=+0.066051829 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.846 2 DEBUG nova.network.neutron [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updating instance_info_cache with network_info: [{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.873 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Releasing lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.873 2 DEBUG nova.compute.manager [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance network_info: |[{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.874 2 DEBUG oslo_concurrency.lockutils [req-c81c6841-866d-483b-8c74-f53db32ca1b3 req-82ee7d1a-9924-47df-88c6-c30a51cfe5eb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.874 2 DEBUG nova.network.neutron [req-c81c6841-866d-483b-8c74-f53db32ca1b3 req-82ee7d1a-9924-47df-88c6-c30a51cfe5eb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Refreshing network info cache for port d1031883-2135-4183-8a9d-0609c32ad14b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.877 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Start _get_guest_xml network_info=[{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.881 2 WARNING nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.886 2 DEBUG nova.virt.libvirt.host [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.886 2 DEBUG nova.virt.libvirt.host [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.897 2 DEBUG nova.virt.libvirt.host [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.898 2 DEBUG nova.virt.libvirt.host [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.899 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.899 2 DEBUG nova.virt.hardware [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.900 2 DEBUG nova.virt.hardware [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.900 2 DEBUG nova.virt.hardware [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.900 2 DEBUG nova.virt.hardware [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.900 2 DEBUG nova.virt.hardware [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.900 2 DEBUG nova.virt.hardware [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.901 2 DEBUG nova.virt.hardware [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.901 2 DEBUG nova.virt.hardware [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.901 2 DEBUG nova.virt.hardware [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.901 2 DEBUG nova.virt.hardware [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.902 2 DEBUG nova.virt.hardware [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.905 2 DEBUG nova.virt.libvirt.vif [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-161503604',display_name='tempest-ServerActionsTestJSON-server-161503604',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-161503604',id=111,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJLom+UJzZg9dduKQv+725QaYDZoMXvP/xlpKnb/K05SGc4dkyLwCDweJ3QifTmxLWqK9Sz5A12yMJbzpa36v5C4bUqj8uiWk/vbR1BAjBdKM9d/Ug8M2nT8LwDBGP/9A==',key_name='tempest-keypair-1006285918',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e564a4cad5d443dba81ec04d2a05ced9',ramdisk_id='',reservation_id='r-ntvf7r4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1646745100',owner_user_name='tempest-ServerActionsTestJSON-1646745100-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:21:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d54b1826121b47caba89932a78c06ccd',uuid=ae56113d-001e-4f10-9236-c07fe5146d9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.906 2 DEBUG nova.network.os_vif_util [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converting VIF {"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.906 2 DEBUG nova.network.os_vif_util [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.907 2 DEBUG nova.objects.instance [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'pci_devices' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.968 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  <uuid>ae56113d-001e-4f10-9236-c07fe5146d9c</uuid>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  <name>instance-0000006f</name>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerActionsTestJSON-server-161503604</nova:name>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:21:25</nova:creationTime>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:21:25 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:        <nova:user uuid="d54b1826121b47caba89932a78c06ccd">tempest-ServerActionsTestJSON-1646745100-project-member</nova:user>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:        <nova:project uuid="e564a4cad5d443dba81ec04d2a05ced9">tempest-ServerActionsTestJSON-1646745100</nova:project>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:        <nova:port uuid="d1031883-2135-4183-8a9d-0609c32ad14b">
Oct  2 08:21:25 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <entry name="serial">ae56113d-001e-4f10-9236-c07fe5146d9c</entry>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <entry name="uuid">ae56113d-001e-4f10-9236-c07fe5146d9c</entry>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.config"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:0a:b9:ae"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <target dev="tapd1031883-21"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/console.log" append="off"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:21:25 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:21:25 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:21:25 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:21:25 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.969 2 DEBUG nova.compute.manager [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Preparing to wait for external event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.970 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.970 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.970 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.971 2 DEBUG nova.virt.libvirt.vif [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-161503604',display_name='tempest-ServerActionsTestJSON-server-161503604',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-161503604',id=111,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJLom+UJzZg9dduKQv+725QaYDZoMXvP/xlpKnb/K05SGc4dkyLwCDweJ3QifTmxLWqK9Sz5A12yMJbzpa36v5C4bUqj8uiWk/vbR1BAjBdKM9d/Ug8M2nT8LwDBGP/9A==',key_name='tempest-keypair-1006285918',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e564a4cad5d443dba81ec04d2a05ced9',ramdisk_id='',reservation_id='r-ntvf7r4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1646745100',owner_user_name='tempest-ServerActionsTestJSON-1646745100-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:21:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d54b1826121b47caba89932a78c06ccd',uuid=ae56113d-001e-4f10-9236-c07fe5146d9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.971 2 DEBUG nova.network.os_vif_util [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converting VIF {"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.973 2 DEBUG nova.network.os_vif_util [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.973 2 DEBUG os_vif [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.974 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.975 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.977 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1031883-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.978 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd1031883-21, col_values=(('external_ids', {'iface-id': 'd1031883-2135-4183-8a9d-0609c32ad14b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:b9:ae', 'vm-uuid': 'ae56113d-001e-4f10-9236-c07fe5146d9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:25 np0005466012 NetworkManager[51207]: <info>  [1759407685.9806] manager: (tapd1031883-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:25 np0005466012 nova_compute[192063]: 2025-10-02 12:21:25.987 2 INFO os_vif [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21')#033[00m
Oct  2 08:21:26 np0005466012 nova_compute[192063]: 2025-10-02 12:21:26.206 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:21:26 np0005466012 nova_compute[192063]: 2025-10-02 12:21:26.206 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:21:26 np0005466012 nova_compute[192063]: 2025-10-02 12:21:26.207 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] No VIF found with MAC fa:16:3e:0a:b9:ae, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:21:26 np0005466012 nova_compute[192063]: 2025-10-02 12:21:26.207 2 INFO nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Using config drive#033[00m
Oct  2 08:21:26 np0005466012 nova_compute[192063]: 2025-10-02 12:21:26.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:26 np0005466012 nova_compute[192063]: 2025-10-02 12:21:26.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:26 np0005466012 nova_compute[192063]: 2025-10-02 12:21:26.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:26 np0005466012 nova_compute[192063]: 2025-10-02 12:21:26.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:26 np0005466012 nova_compute[192063]: 2025-10-02 12:21:26.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:26 np0005466012 nova_compute[192063]: 2025-10-02 12:21:26.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.073 2 INFO nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Creating config drive at /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.config#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.082 2 DEBUG oslo_concurrency.processutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphvj1fwhk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:27 np0005466012 podman[235449]: 2025-10-02 12:21:27.163263083 +0000 UTC m=+0.069377583 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.209 2 DEBUG oslo_concurrency.processutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphvj1fwhk" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:27 np0005466012 kernel: tapd1031883-21: entered promiscuous mode
Oct  2 08:21:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:21:27Z|00379|binding|INFO|Claiming lport d1031883-2135-4183-8a9d-0609c32ad14b for this chassis.
Oct  2 08:21:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:21:27Z|00380|binding|INFO|d1031883-2135-4183-8a9d-0609c32ad14b: Claiming fa:16:3e:0a:b9:ae 10.100.0.12
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:27 np0005466012 NetworkManager[51207]: <info>  [1759407687.2803] manager: (tapd1031883-21): new Tun device (/org/freedesktop/NetworkManager/Devices/173)
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.318 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:b9:ae 10.100.0.12'], port_security=['fa:16:3e:0a:b9:ae 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a04f937a-375f-4fb0-90fe-5f514a88668f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c0383701-0ec7-4f3b-8585-5effc4f5ca5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c0aa38-5fd8-41c7-b4bf-85b59722c5c3, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d1031883-2135-4183-8a9d-0609c32ad14b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.319 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d1031883-2135-4183-8a9d-0609c32ad14b in datapath a04f937a-375f-4fb0-90fe-5f514a88668f bound to our chassis#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.320 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a04f937a-375f-4fb0-90fe-5f514a88668f#033[00m
Oct  2 08:21:27 np0005466012 systemd-udevd[235484]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.335 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[494648bc-4845-4d87-b268-0ed85f12195c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 NetworkManager[51207]: <info>  [1759407687.3388] device (tapd1031883-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:21:27 np0005466012 NetworkManager[51207]: <info>  [1759407687.3397] device (tapd1031883-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.337 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa04f937a-31 in ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.338 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa04f937a-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.338 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f3213a40-6a2c-4c8d-893c-d3984693fa07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.341 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b8bdcb9e-68ed-4f50-a83d-f753e7cc1ee8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:27 np0005466012 systemd-machined[152114]: New machine qemu-49-instance-0000006f.
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:21:27Z|00381|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b ovn-installed in OVS
Oct  2 08:21:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:21:27Z|00382|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b up in Southbound
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.356 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[8ee826c2-7795-4e0c-afee-d9c746fc4241]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 systemd[1]: Started Virtual Machine qemu-49-instance-0000006f.
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.373 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[95711b29-1f5f-4e09-9b05-bebf2ca499d6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.404 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[62de11cc-ba67-468a-aa0d-2ba0ef302db8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 NetworkManager[51207]: <info>  [1759407687.4127] manager: (tapa04f937a-30): new Veth device (/org/freedesktop/NetworkManager/Devices/174)
Oct  2 08:21:27 np0005466012 systemd-udevd[235488]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.414 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f9cfa331-f631-4127-84c2-e7fc74163fa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.444 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[57ca0329-732b-4567-bd03-30915e769f69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.446 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[0913cb2f-ec6f-4d98-b5ad-8f53a2c88bf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 NetworkManager[51207]: <info>  [1759407687.4687] device (tapa04f937a-30): carrier: link connected
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.473 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[e30a9018-53c6-4ad8-8a1e-655720571420]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.492 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f749b957-7df2-4e02-b277-fc68c5748466]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa04f937a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:93:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568108, 'reachable_time': 34194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235518, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.506 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7a9d690c-ffd6-43d3-b876-f752a51b9ee1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:9368'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568108, 'tstamp': 568108}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235519, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.528 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1138652e-c427-42e1-b50f-bfa4f1428d4d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa04f937a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:93:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568108, 'reachable_time': 34194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235520, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.561 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[82c979f9-9f44-452b-8ff3-dc2be6500563]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.563 2 DEBUG nova.network.neutron [req-c81c6841-866d-483b-8c74-f53db32ca1b3 req-82ee7d1a-9924-47df-88c6-c30a51cfe5eb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updated VIF entry in instance network info cache for port d1031883-2135-4183-8a9d-0609c32ad14b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.564 2 DEBUG nova.network.neutron [req-c81c6841-866d-483b-8c74-f53db32ca1b3 req-82ee7d1a-9924-47df-88c6-c30a51cfe5eb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updating instance_info_cache with network_info: [{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.635 2 DEBUG oslo_concurrency.lockutils [req-c81c6841-866d-483b-8c74-f53db32ca1b3 req-82ee7d1a-9924-47df-88c6-c30a51cfe5eb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.642 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9d7049-f3ae-45ca-a0e1-d9f44f497371]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.644 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa04f937a-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.644 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.644 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa04f937a-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:27 np0005466012 kernel: tapa04f937a-30: entered promiscuous mode
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.649 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa04f937a-30, col_values=(('external_ids', {'iface-id': '38f1ac16-18c6-4b4a-b769-ebc7dd5181d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:21:27 np0005466012 NetworkManager[51207]: <info>  [1759407687.6499] manager: (tapa04f937a-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Oct  2 08:21:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:21:27Z|00383|binding|INFO|Releasing lport 38f1ac16-18c6-4b4a-b769-ebc7dd5181d4 from this chassis (sb_readonly=0)
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.666 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a04f937a-375f-4fb0-90fe-5f514a88668f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a04f937a-375f-4fb0-90fe-5f514a88668f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.667 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0710a7b1-9732-4788-918c-9db0d6d83991]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.668 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-a04f937a-375f-4fb0-90fe-5f514a88668f
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/a04f937a-375f-4fb0-90fe-5f514a88668f.pid.haproxy
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID a04f937a-375f-4fb0-90fe-5f514a88668f
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:21:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:27.670 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'env', 'PROCESS_TAG=haproxy-a04f937a-375f-4fb0-90fe-5f514a88668f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a04f937a-375f-4fb0-90fe-5f514a88668f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.856 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.856 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.856 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.857 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:21:27 np0005466012 nova_compute[192063]: 2025-10-02 12:21:27.922 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.012 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.013 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.077 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:21:28 np0005466012 podman[235562]: 2025-10-02 12:21:28.004562148 +0000 UTC m=+0.020357637 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.200 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407688.1994483, ae56113d-001e-4f10-9236-c07fe5146d9c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.201 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] VM Started (Lifecycle Event)#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.220 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.223 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407688.2002683, ae56113d-001e-4f10-9236-c07fe5146d9c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.223 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.238 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.240 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.241 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5678MB free_disk=73.3871955871582GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.241 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.242 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.243 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.271 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:21:28 np0005466012 podman[235562]: 2025-10-02 12:21:28.334643578 +0000 UTC m=+0.350439047 container create 56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.334 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance ae56113d-001e-4f10-9236-c07fe5146d9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.335 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.335 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.349 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.371 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.372 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.388 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct  2 08:21:28 np0005466012 systemd[1]: Started libpod-conmon-56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3.scope.
Oct  2 08:21:28 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.416 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct  2 08:21:28 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e8d43df106efba6a522bddd20afc3eea5a54b6b4151badcf9201957d8581065/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.453 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.467 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.494 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:21:28 np0005466012 nova_compute[192063]: 2025-10-02 12:21:28.494 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:28 np0005466012 podman[235562]: 2025-10-02 12:21:28.497339352 +0000 UTC m=+0.513134831 container init 56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:21:28 np0005466012 podman[235562]: 2025-10-02 12:21:28.503088477 +0000 UTC m=+0.518883946 container start 56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:21:28 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[235582]: [NOTICE]   (235586) : New worker (235588) forked
Oct  2 08:21:28 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[235582]: [NOTICE]   (235586) : Loading success.
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.010 2 DEBUG nova.compute.manager [req-f5686586-cfac-40b8-ba84-c4b4c2733001 req-64af310e-4ef2-4fe0-98b7-5ed368964c89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.010 2 DEBUG oslo_concurrency.lockutils [req-f5686586-cfac-40b8-ba84-c4b4c2733001 req-64af310e-4ef2-4fe0-98b7-5ed368964c89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.011 2 DEBUG oslo_concurrency.lockutils [req-f5686586-cfac-40b8-ba84-c4b4c2733001 req-64af310e-4ef2-4fe0-98b7-5ed368964c89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.011 2 DEBUG oslo_concurrency.lockutils [req-f5686586-cfac-40b8-ba84-c4b4c2733001 req-64af310e-4ef2-4fe0-98b7-5ed368964c89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.012 2 DEBUG nova.compute.manager [req-f5686586-cfac-40b8-ba84-c4b4c2733001 req-64af310e-4ef2-4fe0-98b7-5ed368964c89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Processing event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.012 2 DEBUG nova.compute.manager [req-f5686586-cfac-40b8-ba84-c4b4c2733001 req-64af310e-4ef2-4fe0-98b7-5ed368964c89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.012 2 DEBUG oslo_concurrency.lockutils [req-f5686586-cfac-40b8-ba84-c4b4c2733001 req-64af310e-4ef2-4fe0-98b7-5ed368964c89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.013 2 DEBUG oslo_concurrency.lockutils [req-f5686586-cfac-40b8-ba84-c4b4c2733001 req-64af310e-4ef2-4fe0-98b7-5ed368964c89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.013 2 DEBUG oslo_concurrency.lockutils [req-f5686586-cfac-40b8-ba84-c4b4c2733001 req-64af310e-4ef2-4fe0-98b7-5ed368964c89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.013 2 DEBUG nova.compute.manager [req-f5686586-cfac-40b8-ba84-c4b4c2733001 req-64af310e-4ef2-4fe0-98b7-5ed368964c89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.013 2 WARNING nova.compute.manager [req-f5686586-cfac-40b8-ba84-c4b4c2733001 req-64af310e-4ef2-4fe0-98b7-5ed368964c89 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state building and task_state spawning.
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.014 2 DEBUG nova.compute.manager [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.018 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407689.017819, ae56113d-001e-4f10-9236-c07fe5146d9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.019 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] VM Resumed (Lifecycle Event)
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.020 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.022 2 INFO nova.virt.libvirt.driver [-] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance spawned successfully.
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.023 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.048 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.053 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.057 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.058 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.058 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.059 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.060 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.060 2 DEBUG nova.virt.libvirt.driver [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.090 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.158 2 INFO nova.compute.manager [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Took 7.10 seconds to spawn the instance on the hypervisor.
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.159 2 DEBUG nova.compute.manager [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.444 2 INFO nova.compute.manager [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Took 7.97 seconds to build instance.
Oct  2 08:21:29 np0005466012 nova_compute[192063]: 2025-10-02 12:21:29.487 2 DEBUG oslo_concurrency.lockutils [None req-d3983969-8027-4962-9609-582e10129cf8 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:21:30 np0005466012 nova_compute[192063]: 2025-10-02 12:21:30.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:31 np0005466012 nova_compute[192063]: 2025-10-02 12:21:31.495 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:21:31 np0005466012 nova_compute[192063]: 2025-10-02 12:21:31.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:31 np0005466012 NetworkManager[51207]: <info>  [1759407691.6995] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Oct  2 08:21:31 np0005466012 NetworkManager[51207]: <info>  [1759407691.7006] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Oct  2 08:21:31 np0005466012 nova_compute[192063]: 2025-10-02 12:21:31.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:21:31Z|00384|binding|INFO|Releasing lport 38f1ac16-18c6-4b4a-b769-ebc7dd5181d4 from this chassis (sb_readonly=0)
Oct  2 08:21:31 np0005466012 nova_compute[192063]: 2025-10-02 12:21:31.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:32 np0005466012 nova_compute[192063]: 2025-10-02 12:21:32.074 2 DEBUG nova.compute.manager [req-ae2efd54-a212-48b5-8e63-82d8a1cb4b59 req-2daac4dd-515e-4916-b05d-1f5284be612c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-changed-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:21:32 np0005466012 nova_compute[192063]: 2025-10-02 12:21:32.075 2 DEBUG nova.compute.manager [req-ae2efd54-a212-48b5-8e63-82d8a1cb4b59 req-2daac4dd-515e-4916-b05d-1f5284be612c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Refreshing instance network info cache due to event network-changed-d1031883-2135-4183-8a9d-0609c32ad14b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:21:32 np0005466012 nova_compute[192063]: 2025-10-02 12:21:32.075 2 DEBUG oslo_concurrency.lockutils [req-ae2efd54-a212-48b5-8e63-82d8a1cb4b59 req-2daac4dd-515e-4916-b05d-1f5284be612c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:21:32 np0005466012 nova_compute[192063]: 2025-10-02 12:21:32.076 2 DEBUG oslo_concurrency.lockutils [req-ae2efd54-a212-48b5-8e63-82d8a1cb4b59 req-2daac4dd-515e-4916-b05d-1f5284be612c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:21:32 np0005466012 nova_compute[192063]: 2025-10-02 12:21:32.076 2 DEBUG nova.network.neutron [req-ae2efd54-a212-48b5-8e63-82d8a1cb4b59 req-2daac4dd-515e-4916-b05d-1f5284be612c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Refreshing network info cache for port d1031883-2135-4183-8a9d-0609c32ad14b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:21:34 np0005466012 podman[235599]: 2025-10-02 12:21:34.144087007 +0000 UTC m=+0.060006584 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41)
Oct  2 08:21:34 np0005466012 podman[235598]: 2025-10-02 12:21:34.165536814 +0000 UTC m=+0.085430846 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  2 08:21:35 np0005466012 nova_compute[192063]: 2025-10-02 12:21:35.165 2 DEBUG nova.network.neutron [req-ae2efd54-a212-48b5-8e63-82d8a1cb4b59 req-2daac4dd-515e-4916-b05d-1f5284be612c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updated VIF entry in instance network info cache for port d1031883-2135-4183-8a9d-0609c32ad14b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:21:35 np0005466012 nova_compute[192063]: 2025-10-02 12:21:35.166 2 DEBUG nova.network.neutron [req-ae2efd54-a212-48b5-8e63-82d8a1cb4b59 req-2daac4dd-515e-4916-b05d-1f5284be612c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updating instance_info_cache with network_info: [{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:21:35 np0005466012 nova_compute[192063]: 2025-10-02 12:21:35.192 2 DEBUG oslo_concurrency.lockutils [req-ae2efd54-a212-48b5-8e63-82d8a1cb4b59 req-2daac4dd-515e-4916-b05d-1f5284be612c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:21:35 np0005466012 nova_compute[192063]: 2025-10-02 12:21:35.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:36 np0005466012 nova_compute[192063]: 2025-10-02 12:21:36.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:21:38 np0005466012 nova_compute[192063]: 2025-10-02 12:21:38.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:21:38 np0005466012 nova_compute[192063]: 2025-10-02 12:21:38.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:21:38 np0005466012 nova_compute[192063]: 2025-10-02 12:21:38.824 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 08:21:39 np0005466012 podman[235639]: 2025-10-02 12:21:39.175724656 +0000 UTC m=+0.086480135 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:21:39 np0005466012 podman[235640]: 2025-10-02 12:21:39.191432068 +0000 UTC m=+0.090775898 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:21:39 np0005466012 nova_compute[192063]: 2025-10-02 12:21:39.283 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:21:39 np0005466012 nova_compute[192063]: 2025-10-02 12:21:39.283 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:21:39 np0005466012 nova_compute[192063]: 2025-10-02 12:21:39.283 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:21:39 np0005466012 nova_compute[192063]: 2025-10-02 12:21:39.283 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:21:41 np0005466012 nova_compute[192063]: 2025-10-02 12:21:41.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:41 np0005466012 nova_compute[192063]: 2025-10-02 12:21:41.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:41 np0005466012 nova_compute[192063]: 2025-10-02 12:21:41.938 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updating instance_info_cache with network_info: [{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:21:41 np0005466012 nova_compute[192063]: 2025-10-02 12:21:41.962 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:21:41 np0005466012 nova_compute[192063]: 2025-10-02 12:21:41.963 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:21:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:21:43Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0a:b9:ae 10.100.0.12
Oct  2 08:21:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:21:43Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0a:b9:ae 10.100.0.12
Oct  2 08:21:46 np0005466012 nova_compute[192063]: 2025-10-02 12:21:46.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:46 np0005466012 nova_compute[192063]: 2025-10-02 12:21:46.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:51 np0005466012 nova_compute[192063]: 2025-10-02 12:21:51.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:51 np0005466012 nova_compute[192063]: 2025-10-02 12:21:51.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:52 np0005466012 podman[235701]: 2025-10-02 12:21:52.170436901 +0000 UTC m=+0.075155220 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:21:52 np0005466012 podman[235702]: 2025-10-02 12:21:52.235781758 +0000 UTC m=+0.135716549 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2)
Oct  2 08:21:56 np0005466012 nova_compute[192063]: 2025-10-02 12:21:56.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:56 np0005466012 podman[235747]: 2025-10-02 12:21:56.182597078 +0000 UTC m=+0.094736042 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:21:56 np0005466012 nova_compute[192063]: 2025-10-02 12:21:56.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:21:58 np0005466012 podman[235766]: 2025-10-02 12:21:58.150613174 +0000 UTC m=+0.066148621 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm, 
org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:21:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:58.550 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:21:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:21:58.551 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:21:58 np0005466012 nova_compute[192063]: 2025-10-02 12:21:58.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:01 np0005466012 nova_compute[192063]: 2025-10-02 12:22:01.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:01 np0005466012 nova_compute[192063]: 2025-10-02 12:22:01.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:02.132 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:02.133 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:02.134 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:02 np0005466012 nova_compute[192063]: 2025-10-02 12:22:02.686 2 DEBUG nova.compute.manager [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Oct  2 08:22:02 np0005466012 nova_compute[192063]: 2025-10-02 12:22:02.799 2 DEBUG oslo_concurrency.lockutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:02 np0005466012 nova_compute[192063]: 2025-10-02 12:22:02.800 2 DEBUG oslo_concurrency.lockutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:02 np0005466012 nova_compute[192063]: 2025-10-02 12:22:02.820 2 DEBUG nova.objects.instance [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lazy-loading 'pci_requests' on Instance uuid 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:02 np0005466012 nova_compute[192063]: 2025-10-02 12:22:02.835 2 DEBUG nova.virt.hardware [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:22:02 np0005466012 nova_compute[192063]: 2025-10-02 12:22:02.835 2 INFO nova.compute.claims [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:22:02 np0005466012 nova_compute[192063]: 2025-10-02 12:22:02.835 2 DEBUG nova.objects.instance [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lazy-loading 'resources' on Instance uuid 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:02 np0005466012 nova_compute[192063]: 2025-10-02 12:22:02.850 2 DEBUG nova.objects.instance [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:02 np0005466012 nova_compute[192063]: 2025-10-02 12:22:02.863 2 DEBUG nova.objects.instance [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:02 np0005466012 nova_compute[192063]: 2025-10-02 12:22:02.907 2 INFO nova.compute.resource_tracker [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Updating resource usage from migration f797b564-4367-4f96-bd5b-f7913b36cb65#033[00m
Oct  2 08:22:02 np0005466012 nova_compute[192063]: 2025-10-02 12:22:02.907 2 DEBUG nova.compute.resource_tracker [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Starting to track incoming migration f797b564-4367-4f96-bd5b-f7913b36cb65 with flavor 9ac83da7-f31e-4467-8569-d28002f6aeed _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Oct  2 08:22:02 np0005466012 nova_compute[192063]: 2025-10-02 12:22:02.968 2 DEBUG nova.compute.provider_tree [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:22:02 np0005466012 nova_compute[192063]: 2025-10-02 12:22:02.981 2 DEBUG nova.scheduler.client.report [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:22:03 np0005466012 nova_compute[192063]: 2025-10-02 12:22:03.002 2 DEBUG oslo_concurrency.lockutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:03 np0005466012 nova_compute[192063]: 2025-10-02 12:22:03.003 2 INFO nova.compute.manager [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Migrating#033[00m
Oct  2 08:22:05 np0005466012 podman[235787]: 2025-10-02 12:22:05.141445934 +0000 UTC m=+0.057247814 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Oct  2 08:22:05 np0005466012 podman[235786]: 2025-10-02 12:22:05.142713211 +0000 UTC m=+0.057357599 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:22:05 np0005466012 systemd[1]: Created slice User Slice of UID 42436.
Oct  2 08:22:05 np0005466012 nova_compute[192063]: 2025-10-02 12:22:05.678 2 DEBUG oslo_concurrency.lockutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:05 np0005466012 nova_compute[192063]: 2025-10-02 12:22:05.678 2 DEBUG oslo_concurrency.lockutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:05 np0005466012 nova_compute[192063]: 2025-10-02 12:22:05.678 2 INFO nova.compute.manager [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Rebooting instance#033[00m
Oct  2 08:22:05 np0005466012 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  2 08:22:05 np0005466012 systemd-logind[827]: New session 40 of user nova.
Oct  2 08:22:05 np0005466012 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  2 08:22:05 np0005466012 systemd[1]: Starting User Manager for UID 42436...
Oct  2 08:22:05 np0005466012 nova_compute[192063]: 2025-10-02 12:22:05.708 2 DEBUG oslo_concurrency.lockutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:22:05 np0005466012 nova_compute[192063]: 2025-10-02 12:22:05.708 2 DEBUG oslo_concurrency.lockutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquired lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:22:05 np0005466012 nova_compute[192063]: 2025-10-02 12:22:05.708 2 DEBUG nova.network.neutron [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:22:05 np0005466012 systemd[235833]: Queued start job for default target Main User Target.
Oct  2 08:22:05 np0005466012 systemd[235833]: Created slice User Application Slice.
Oct  2 08:22:05 np0005466012 systemd[235833]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:22:05 np0005466012 systemd[235833]: Started Daily Cleanup of User's Temporary Directories.
Oct  2 08:22:05 np0005466012 systemd[235833]: Reached target Paths.
Oct  2 08:22:05 np0005466012 systemd[235833]: Reached target Timers.
Oct  2 08:22:05 np0005466012 systemd[235833]: Starting D-Bus User Message Bus Socket...
Oct  2 08:22:05 np0005466012 systemd[235833]: Starting Create User's Volatile Files and Directories...
Oct  2 08:22:05 np0005466012 systemd[235833]: Finished Create User's Volatile Files and Directories.
Oct  2 08:22:05 np0005466012 systemd[235833]: Listening on D-Bus User Message Bus Socket.
Oct  2 08:22:05 np0005466012 systemd[235833]: Reached target Sockets.
Oct  2 08:22:05 np0005466012 systemd[235833]: Reached target Basic System.
Oct  2 08:22:05 np0005466012 systemd[235833]: Reached target Main User Target.
Oct  2 08:22:05 np0005466012 systemd[235833]: Startup finished in 137ms.
Oct  2 08:22:05 np0005466012 systemd[1]: Started User Manager for UID 42436.
Oct  2 08:22:05 np0005466012 systemd[1]: Started Session 40 of User nova.
Oct  2 08:22:05 np0005466012 systemd[1]: session-40.scope: Deactivated successfully.
Oct  2 08:22:05 np0005466012 systemd-logind[827]: Session 40 logged out. Waiting for processes to exit.
Oct  2 08:22:05 np0005466012 systemd-logind[827]: Removed session 40.
Oct  2 08:22:06 np0005466012 systemd-logind[827]: New session 42 of user nova.
Oct  2 08:22:06 np0005466012 systemd[1]: Started Session 42 of User nova.
Oct  2 08:22:06 np0005466012 systemd[1]: session-42.scope: Deactivated successfully.
Oct  2 08:22:06 np0005466012 systemd-logind[827]: Session 42 logged out. Waiting for processes to exit.
Oct  2 08:22:06 np0005466012 systemd-logind[827]: Removed session 42.
Oct  2 08:22:06 np0005466012 nova_compute[192063]: 2025-10-02 12:22:06.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:06 np0005466012 nova_compute[192063]: 2025-10-02 12:22:06.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.245 2 DEBUG nova.network.neutron [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updating instance_info_cache with network_info: [{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.262 2 DEBUG oslo_concurrency.lockutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Releasing lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.272 2 DEBUG nova.compute.manager [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:22:07 np0005466012 kernel: tapd1031883-21 (unregistering): left promiscuous mode
Oct  2 08:22:07 np0005466012 NetworkManager[51207]: <info>  [1759407727.4475] device (tapd1031883-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:22:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:07Z|00385|binding|INFO|Releasing lport d1031883-2135-4183-8a9d-0609c32ad14b from this chassis (sb_readonly=0)
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:07Z|00386|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b down in Southbound
Oct  2 08:22:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:07Z|00387|binding|INFO|Removing iface tapd1031883-21 ovn-installed in OVS
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.512 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:b9:ae 10.100.0.12'], port_security=['fa:16:3e:0a:b9:ae 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a04f937a-375f-4fb0-90fe-5f514a88668f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c0383701-0ec7-4f3b-8585-5effc4f5ca5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.248'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c0aa38-5fd8-41c7-b4bf-85b59722c5c3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d1031883-2135-4183-8a9d-0609c32ad14b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.514 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d1031883-2135-4183-8a9d-0609c32ad14b in datapath a04f937a-375f-4fb0-90fe-5f514a88668f unbound from our chassis
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.515 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a04f937a-375f-4fb0-90fe-5f514a88668f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.517 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d48bdb84-da18-4b27-87d4-d30984850d08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.518 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f namespace which is not needed anymore
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:07 np0005466012 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Oct  2 08:22:07 np0005466012 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000006f.scope: Consumed 14.015s CPU time.
Oct  2 08:22:07 np0005466012 systemd-machined[152114]: Machine qemu-49-instance-0000006f terminated.
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.552 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:22:07 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[235582]: [NOTICE]   (235586) : haproxy version is 2.8.14-c23fe91
Oct  2 08:22:07 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[235582]: [NOTICE]   (235586) : path to executable is /usr/sbin/haproxy
Oct  2 08:22:07 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[235582]: [WARNING]  (235586) : Exiting Master process...
Oct  2 08:22:07 np0005466012 kernel: tapd1031883-21: entered promiscuous mode
Oct  2 08:22:07 np0005466012 NetworkManager[51207]: <info>  [1759407727.6448] manager: (tapd1031883-21): new Tun device (/org/freedesktop/NetworkManager/Devices/178)
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:07Z|00388|binding|INFO|Claiming lport d1031883-2135-4183-8a9d-0609c32ad14b for this chassis.
Oct  2 08:22:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:07Z|00389|binding|INFO|d1031883-2135-4183-8a9d-0609c32ad14b: Claiming fa:16:3e:0a:b9:ae 10.100.0.12
Oct  2 08:22:07 np0005466012 kernel: tapd1031883-21 (unregistering): left promiscuous mode
Oct  2 08:22:07 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[235582]: [ALERT]    (235586) : Current worker (235588) exited with code 143 (Terminated)
Oct  2 08:22:07 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[235582]: [WARNING]  (235586) : All workers exited. Exiting... (0)
Oct  2 08:22:07 np0005466012 systemd[1]: libpod-56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3.scope: Deactivated successfully.
Oct  2 08:22:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:07Z|00390|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b ovn-installed in OVS
Oct  2 08:22:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:07Z|00391|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b up in Southbound
Oct  2 08:22:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:07Z|00392|binding|INFO|Releasing lport d1031883-2135-4183-8a9d-0609c32ad14b from this chassis (sb_readonly=1)
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:07Z|00393|binding|INFO|Removing iface tapd1031883-21 ovn-installed in OVS
Oct  2 08:22:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:07Z|00394|if_status|INFO|Dropped 7 log messages in last 735 seconds (most recently, 735 seconds ago) due to excessive rate
Oct  2 08:22:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:07Z|00395|if_status|INFO|Not setting lport d1031883-2135-4183-8a9d-0609c32ad14b down as sb is readonly
Oct  2 08:22:07 np0005466012 podman[235879]: 2025-10-02 12:22:07.670325809 +0000 UTC m=+0.063932157 container died 56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.672 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:b9:ae 10.100.0.12'], port_security=['fa:16:3e:0a:b9:ae 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a04f937a-375f-4fb0-90fe-5f514a88668f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c0383701-0ec7-4f3b-8585-5effc4f5ca5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.248'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c0aa38-5fd8-41c7-b4bf-85b59722c5c3, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d1031883-2135-4183-8a9d-0609c32ad14b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:07Z|00396|binding|INFO|Releasing lport d1031883-2135-4183-8a9d-0609c32ad14b from this chassis (sb_readonly=0)
Oct  2 08:22:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:07Z|00397|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b down in Southbound
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.694 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:b9:ae 10.100.0.12'], port_security=['fa:16:3e:0a:b9:ae 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a04f937a-375f-4fb0-90fe-5f514a88668f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c0383701-0ec7-4f3b-8585-5effc4f5ca5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.248'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c0aa38-5fd8-41c7-b4bf-85b59722c5c3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d1031883-2135-4183-8a9d-0609c32ad14b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.699 2 INFO nova.virt.libvirt.driver [-] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance destroyed successfully.
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.699 2 DEBUG nova.objects.instance [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'resources' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:22:07 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3-userdata-shm.mount: Deactivated successfully.
Oct  2 08:22:07 np0005466012 systemd[1]: var-lib-containers-storage-overlay-6e8d43df106efba6a522bddd20afc3eea5a54b6b4151badcf9201957d8581065-merged.mount: Deactivated successfully.
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.724 2 DEBUG nova.virt.libvirt.vif [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-161503604',display_name='tempest-ServerActionsTestJSON-server-161503604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-161503604',id=111,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJLom+UJzZg9dduKQv+725QaYDZoMXvP/xlpKnb/K05SGc4dkyLwCDweJ3QifTmxLWqK9Sz5A12yMJbzpa36v5C4bUqj8uiWk/vbR1BAjBdKM9d/Ug8M2nT8LwDBGP/9A==',key_name='tempest-keypair-1006285918',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e564a4cad5d443dba81ec04d2a05ced9',ramdisk_id='',reservation_id='r-ntvf7r4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1646745100',owner_user_name='tempest-ServerActionsTestJSON-1646745100-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:22:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d54b1826121b47caba89932a78c06ccd',uuid=ae56113d-001e-4f10-9236-c07fe5146d9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.725 2 DEBUG nova.network.os_vif_util [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converting VIF {"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.725 2 DEBUG nova.network.os_vif_util [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.726 2 DEBUG os_vif [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.728 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1031883-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.734 2 INFO os_vif [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21')
Oct  2 08:22:07 np0005466012 podman[235879]: 2025-10-02 12:22:07.737299893 +0000 UTC m=+0.130906211 container cleanup 56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.741 2 DEBUG nova.virt.libvirt.driver [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Start _get_guest_xml network_info=[{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:22:07 np0005466012 systemd[1]: libpod-conmon-56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3.scope: Deactivated successfully.
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.745 2 WARNING nova.virt.libvirt.driver [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.750 2 DEBUG nova.virt.libvirt.host [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.751 2 DEBUG nova.virt.libvirt.host [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.754 2 DEBUG nova.virt.libvirt.host [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.756 2 DEBUG nova.virt.libvirt.host [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.757 2 DEBUG nova.virt.libvirt.driver [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.758 2 DEBUG nova.virt.hardware [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.758 2 DEBUG nova.virt.hardware [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.758 2 DEBUG nova.virt.hardware [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.758 2 DEBUG nova.virt.hardware [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.759 2 DEBUG nova.virt.hardware [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.759 2 DEBUG nova.virt.hardware [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.759 2 DEBUG nova.virt.hardware [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.759 2 DEBUG nova.virt.hardware [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.759 2 DEBUG nova.virt.hardware [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.759 2 DEBUG nova.virt.hardware [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.760 2 DEBUG nova.virt.hardware [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.760 2 DEBUG nova.objects.instance [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'vcpu_model' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.783 2 DEBUG oslo_concurrency.processutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.811 2 DEBUG nova.compute.manager [req-0b4f25ae-845a-40b7-ac48-1d183bbcc91a req-be353821-39a5-4b2a-b562-073bceb2a22f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.811 2 DEBUG oslo_concurrency.lockutils [req-0b4f25ae-845a-40b7-ac48-1d183bbcc91a req-be353821-39a5-4b2a-b562-073bceb2a22f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.811 2 DEBUG oslo_concurrency.lockutils [req-0b4f25ae-845a-40b7-ac48-1d183bbcc91a req-be353821-39a5-4b2a-b562-073bceb2a22f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.812 2 DEBUG oslo_concurrency.lockutils [req-0b4f25ae-845a-40b7-ac48-1d183bbcc91a req-be353821-39a5-4b2a-b562-073bceb2a22f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.812 2 DEBUG nova.compute.manager [req-0b4f25ae-845a-40b7-ac48-1d183bbcc91a req-be353821-39a5-4b2a-b562-073bceb2a22f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.812 2 WARNING nova.compute.manager [req-0b4f25ae-845a-40b7-ac48-1d183bbcc91a req-be353821-39a5-4b2a-b562-073bceb2a22f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state reboot_started_hard.#033[00m
Oct  2 08:22:07 np0005466012 podman[235923]: 2025-10-02 12:22:07.81659192 +0000 UTC m=+0.054960179 container remove 56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.824 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d833e23b-38ab-48df-9b92-18198fd0f2e9]: (4, ('Thu Oct  2 12:22:07 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f (56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3)\n56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3\nThu Oct  2 12:22:07 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f (56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3)\n56a52911aec75549d39ea8c9185a4c5108f7d8c069212d133f74e75c993a1fa3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.825 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ca3da292-05bc-4f8b-9fa4-48758bc6c6d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.826 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa04f937a-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:07 np0005466012 kernel: tapa04f937a-30: left promiscuous mode
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.835 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ab4ab40d-4320-4fb2-bac6-9e89fb9220bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.855 2 DEBUG oslo_concurrency.processutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.config --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.855 2 DEBUG oslo_concurrency.lockutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.855 2 DEBUG oslo_concurrency.lockutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.856 2 DEBUG oslo_concurrency.lockutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.857 2 DEBUG nova.virt.libvirt.vif [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-161503604',display_name='tempest-ServerActionsTestJSON-server-161503604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-161503604',id=111,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJLom+UJzZg9dduKQv+725QaYDZoMXvP/xlpKnb/K05SGc4dkyLwCDweJ3QifTmxLWqK9Sz5A12yMJbzpa36v5C4bUqj8uiWk/vbR1BAjBdKM9d/Ug8M2nT8LwDBGP/9A==',key_name='tempest-keypair-1006285918',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e564a4cad5d443dba81ec04d2a05ced9',ramdisk_id='',reservation_id='r-ntvf7r4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1646745100',owner_user_name='tempest-ServerActionsTestJSON-1646745100-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:22:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d54b1826121b47caba89932a78c06ccd',uuid=ae56113d-001e-4f10-9236-c07fe5146d9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.857 2 DEBUG nova.network.os_vif_util [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converting VIF {"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.858 2 DEBUG nova.network.os_vif_util [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.859 2 DEBUG nova.objects.instance [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'pci_devices' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.872 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[33164226-dc74-42af-af79-1cbab426353a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.873 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1bbf10fa-691d-483e-9737-22dc6bbee8a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.880 2 DEBUG nova.virt.libvirt.driver [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  <uuid>ae56113d-001e-4f10-9236-c07fe5146d9c</uuid>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  <name>instance-0000006f</name>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerActionsTestJSON-server-161503604</nova:name>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:22:07</nova:creationTime>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:22:07 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:        <nova:user uuid="d54b1826121b47caba89932a78c06ccd">tempest-ServerActionsTestJSON-1646745100-project-member</nova:user>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:        <nova:project uuid="e564a4cad5d443dba81ec04d2a05ced9">tempest-ServerActionsTestJSON-1646745100</nova:project>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:        <nova:port uuid="d1031883-2135-4183-8a9d-0609c32ad14b">
Oct  2 08:22:07 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <entry name="serial">ae56113d-001e-4f10-9236-c07fe5146d9c</entry>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <entry name="uuid">ae56113d-001e-4f10-9236-c07fe5146d9c</entry>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.config"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:0a:b9:ae"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <target dev="tapd1031883-21"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/console.log" append="off"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <input type="keyboard" bus="usb"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:22:07 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:22:07 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:22:07 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:22:07 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.881 2 DEBUG oslo_concurrency.processutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.893 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[705f2acb-9fff-489f-85d9-472f551345cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568101, 'reachable_time': 20407, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235941, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:07 np0005466012 systemd[1]: run-netns-ovnmeta\x2da04f937a\x2d375f\x2d4fb0\x2d90fe\x2d5f514a88668f.mount: Deactivated successfully.
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.895 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.896 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[293e56e2-a14a-4260-8f0b-53a9203ce323]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.896 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d1031883-2135-4183-8a9d-0609c32ad14b in datapath a04f937a-375f-4fb0-90fe-5f514a88668f unbound from our chassis#033[00m
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.898 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a04f937a-375f-4fb0-90fe-5f514a88668f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.898 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d446b1b3-ded6-4400-ab24-659afd947bf8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.899 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d1031883-2135-4183-8a9d-0609c32ad14b in datapath a04f937a-375f-4fb0-90fe-5f514a88668f unbound from our chassis#033[00m
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.900 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a04f937a-375f-4fb0-90fe-5f514a88668f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:22:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:07.900 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[257284ef-1e93-4d2f-963a-e7f893b13fe9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.951 2 DEBUG oslo_concurrency.processutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:07 np0005466012 nova_compute[192063]: 2025-10-02 12:22:07.952 2 DEBUG oslo_concurrency.processutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.010 2 DEBUG oslo_concurrency.processutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.012 2 DEBUG nova.objects.instance [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.026 2 DEBUG oslo_concurrency.processutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.084 2 DEBUG oslo_concurrency.processutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.085 2 DEBUG nova.virt.disk.api [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Checking if we can resize image /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.085 2 DEBUG oslo_concurrency.processutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.143 2 DEBUG oslo_concurrency.processutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.144 2 DEBUG nova.virt.disk.api [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Cannot resize image /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.144 2 DEBUG nova.objects.instance [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'migration_context' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.157 2 DEBUG nova.virt.libvirt.vif [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-161503604',display_name='tempest-ServerActionsTestJSON-server-161503604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-161503604',id=111,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJLom+UJzZg9dduKQv+725QaYDZoMXvP/xlpKnb/K05SGc4dkyLwCDweJ3QifTmxLWqK9Sz5A12yMJbzpa36v5C4bUqj8uiWk/vbR1BAjBdKM9d/Ug8M2nT8LwDBGP/9A==',key_name='tempest-keypair-1006285918',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='e564a4cad5d443dba81ec04d2a05ced9',ramdisk_id='',reservation_id='r-ntvf7r4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1646745100',owner_user_name='tempest-ServerActionsTestJSON-1646745100-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:22:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d54b1826121b47caba89932a78c06ccd',uuid=ae56113d-001e-4f10-9236-c07fe5146d9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.157 2 DEBUG nova.network.os_vif_util [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converting VIF {"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.158 2 DEBUG nova.network.os_vif_util [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.158 2 DEBUG os_vif [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.159 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.160 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.162 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1031883-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.163 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd1031883-21, col_values=(('external_ids', {'iface-id': 'd1031883-2135-4183-8a9d-0609c32ad14b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:b9:ae', 'vm-uuid': 'ae56113d-001e-4f10-9236-c07fe5146d9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:08 np0005466012 NetworkManager[51207]: <info>  [1759407728.1653] manager: (tapd1031883-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.170 2 INFO os_vif [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21')#033[00m
Oct  2 08:22:08 np0005466012 kernel: tapd1031883-21: entered promiscuous mode
Oct  2 08:22:08 np0005466012 NetworkManager[51207]: <info>  [1759407728.2711] manager: (tapd1031883-21): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Oct  2 08:22:08 np0005466012 systemd-udevd[235860]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:08Z|00398|binding|INFO|Claiming lport d1031883-2135-4183-8a9d-0609c32ad14b for this chassis.
Oct  2 08:22:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:08Z|00399|binding|INFO|d1031883-2135-4183-8a9d-0609c32ad14b: Claiming fa:16:3e:0a:b9:ae 10.100.0.12
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.282 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:b9:ae 10.100.0.12'], port_security=['fa:16:3e:0a:b9:ae 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a04f937a-375f-4fb0-90fe-5f514a88668f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c0383701-0ec7-4f3b-8585-5effc4f5ca5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.248'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c0aa38-5fd8-41c7-b4bf-85b59722c5c3, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d1031883-2135-4183-8a9d-0609c32ad14b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.284 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d1031883-2135-4183-8a9d-0609c32ad14b in datapath a04f937a-375f-4fb0-90fe-5f514a88668f bound to our chassis#033[00m
Oct  2 08:22:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:08Z|00400|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b ovn-installed in OVS
Oct  2 08:22:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:08Z|00401|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b up in Southbound
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.286 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a04f937a-375f-4fb0-90fe-5f514a88668f#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:08 np0005466012 NetworkManager[51207]: <info>  [1759407728.2971] device (tapd1031883-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:22:08 np0005466012 NetworkManager[51207]: <info>  [1759407728.2978] device (tapd1031883-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.303 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[02206e52-f875-47e3-9fdd-8d742b84a751]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.305 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa04f937a-31 in ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.307 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa04f937a-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.307 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c44427db-1a4a-4fe5-a0e3-535c43640894]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.308 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc2e602-abff-411c-b6c5-c91c2de4c8cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 systemd-machined[152114]: New machine qemu-50-instance-0000006f.
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.320 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[2e14d44e-b53e-4d5e-8d4a-c0bfebec00f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 systemd[1]: Started Virtual Machine qemu-50-instance-0000006f.
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.350 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bb7b5020-a6f3-4e2d-9421-81d8554d2487]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.378 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[8e13c93f-6664-40a2-8042-cd89125fa6ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.384 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b368ac77-ce72-42fb-a19b-309d82c4af14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 NetworkManager[51207]: <info>  [1759407728.3855] manager: (tapa04f937a-30): new Veth device (/org/freedesktop/NetworkManager/Devices/181)
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.418 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba1a172-ef07-43c8-83d0-768301bbf0b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.424 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[3a31f206-a38e-413f-bd88-3d225e91db28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 NetworkManager[51207]: <info>  [1759407728.4470] device (tapa04f937a-30): carrier: link connected
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.451 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[59974178-80d8-4a96-80d4-54591242a21b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.470 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a786c764-d254-4cf3-90bb-e9d07d4716a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa04f937a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:93:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572205, 'reachable_time': 27438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236000, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.485 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d3e46ca4-572f-4f51-a4c2-f37f4e1383b6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:9368'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 572205, 'tstamp': 572205}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236001, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.502 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ca095098-f593-43b2-b639-3f51994f4e9f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa04f937a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:93:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572205, 'reachable_time': 27438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236004, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.533 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7a818687-70ad-423f-9315-606eb8abef39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.594 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a0c9b2cf-4301-45d1-a649-b6dea774c571]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.596 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa04f937a-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.597 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.597 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa04f937a-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:08 np0005466012 NetworkManager[51207]: <info>  [1759407728.6373] manager: (tapa04f937a-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Oct  2 08:22:08 np0005466012 kernel: tapa04f937a-30: entered promiscuous mode
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.639 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa04f937a-30, col_values=(('external_ids', {'iface-id': '38f1ac16-18c6-4b4a-b769-ebc7dd5181d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:08Z|00402|binding|INFO|Releasing lport 38f1ac16-18c6-4b4a-b769-ebc7dd5181d4 from this chassis (sb_readonly=0)
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.658 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a04f937a-375f-4fb0-90fe-5f514a88668f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a04f937a-375f-4fb0-90fe-5f514a88668f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.659 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[404f3e4a-94e4-4c13-a96b-3382968f42d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.660 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-a04f937a-375f-4fb0-90fe-5f514a88668f
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/a04f937a-375f-4fb0-90fe-5f514a88668f.pid.haproxy
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID a04f937a-375f-4fb0-90fe-5f514a88668f
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:22:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:08.661 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'env', 'PROCESS_TAG=haproxy-a04f937a-375f-4fb0-90fe-5f514a88668f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a04f937a-375f-4fb0-90fe-5f514a88668f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.939 2 DEBUG nova.compute.manager [req-3f92c72d-98e6-4610-9983-ab48f180ca80 req-3cd30478-b8fd-4a39-aa1a-c6b4583f7a45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received event network-vif-unplugged-fd508257-51ca-4c61-9340-029f9a9e7a75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.939 2 DEBUG oslo_concurrency.lockutils [req-3f92c72d-98e6-4610-9983-ab48f180ca80 req-3cd30478-b8fd-4a39-aa1a-c6b4583f7a45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.940 2 DEBUG oslo_concurrency.lockutils [req-3f92c72d-98e6-4610-9983-ab48f180ca80 req-3cd30478-b8fd-4a39-aa1a-c6b4583f7a45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.940 2 DEBUG oslo_concurrency.lockutils [req-3f92c72d-98e6-4610-9983-ab48f180ca80 req-3cd30478-b8fd-4a39-aa1a-c6b4583f7a45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.940 2 DEBUG nova.compute.manager [req-3f92c72d-98e6-4610-9983-ab48f180ca80 req-3cd30478-b8fd-4a39-aa1a-c6b4583f7a45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] No waiting events found dispatching network-vif-unplugged-fd508257-51ca-4c61-9340-029f9a9e7a75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.940 2 WARNING nova.compute.manager [req-3f92c72d-98e6-4610-9983-ab48f180ca80 req-3cd30478-b8fd-4a39-aa1a-c6b4583f7a45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received unexpected event network-vif-unplugged-fd508257-51ca-4c61-9340-029f9a9e7a75 for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.987 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for ae56113d-001e-4f10-9236-c07fe5146d9c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.988 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407728.9868963, ae56113d-001e-4f10-9236-c07fe5146d9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.988 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.991 2 DEBUG nova.compute.manager [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.998 2 INFO nova.virt.libvirt.driver [-] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance rebooted successfully.#033[00m
Oct  2 08:22:08 np0005466012 nova_compute[192063]: 2025-10-02 12:22:08.998 2 DEBUG nova.compute.manager [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.009 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.011 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:22:09 np0005466012 podman[236041]: 2025-10-02 12:22:09.017766371 +0000 UTC m=+0.062130826 container create 2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.030 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.030 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407728.9902284, ae56113d-001e-4f10-9236-c07fe5146d9c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.031 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] VM Started (Lifecycle Event)#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.054 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.057 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:22:09 np0005466012 systemd[1]: Started libpod-conmon-2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462.scope.
Oct  2 08:22:09 np0005466012 podman[236041]: 2025-10-02 12:22:08.979002507 +0000 UTC m=+0.023366982 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:22:09 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:22:09 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/057360803f0d995b80312af77b88ab5d4b0e41b5410aaa1a975c20e4a2bf2f0d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:22:09 np0005466012 podman[236041]: 2025-10-02 12:22:09.107270681 +0000 UTC m=+0.151635156 container init 2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:22:09 np0005466012 podman[236041]: 2025-10-02 12:22:09.112607305 +0000 UTC m=+0.156971760 container start 2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:22:09 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236056]: [NOTICE]   (236060) : New worker (236062) forked
Oct  2 08:22:09 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236056]: [NOTICE]   (236060) : Loading success.
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.164 2 DEBUG oslo_concurrency.lockutils [None req-ef7bff6b-4092-44f5-b404-7e0099e69eae d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 3.486s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:09 np0005466012 systemd-logind[827]: New session 43 of user nova.
Oct  2 08:22:09 np0005466012 systemd[1]: Started Session 43 of User nova.
Oct  2 08:22:09 np0005466012 podman[236073]: 2025-10-02 12:22:09.522495297 +0000 UTC m=+0.056964446 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, container_name=iscsid)
Oct  2 08:22:09 np0005466012 podman[236075]: 2025-10-02 12:22:09.524615098 +0000 UTC m=+0.057006308 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.930 2 DEBUG nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.932 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.932 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.932 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.933 2 DEBUG nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.933 2 WARNING nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.933 2 DEBUG nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.934 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.934 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.935 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.935 2 DEBUG nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.935 2 WARNING nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.936 2 DEBUG nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.936 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.936 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.937 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.937 2 DEBUG nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.937 2 WARNING nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.937 2 DEBUG nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.938 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.938 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.938 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.939 2 DEBUG nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.939 2 WARNING nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.939 2 DEBUG nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.940 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.940 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.941 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.941 2 DEBUG nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.941 2 WARNING nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.942 2 DEBUG nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.942 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.943 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.943 2 DEBUG oslo_concurrency.lockutils [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.944 2 DEBUG nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:09 np0005466012 nova_compute[192063]: 2025-10-02 12:22:09.944 2 WARNING nova.compute.manager [req-38050df8-9854-4c72-83ef-45562cccc401 req-7dc5b7e9-d3f3-4514-90e1-6044e0bc2f23 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:22:10 np0005466012 systemd[1]: session-43.scope: Deactivated successfully.
Oct  2 08:22:10 np0005466012 systemd-logind[827]: Session 43 logged out. Waiting for processes to exit.
Oct  2 08:22:10 np0005466012 systemd-logind[827]: Removed session 43.
Oct  2 08:22:10 np0005466012 systemd-logind[827]: New session 44 of user nova.
Oct  2 08:22:10 np0005466012 systemd[1]: Started Session 44 of User nova.
Oct  2 08:22:10 np0005466012 systemd[1]: session-44.scope: Deactivated successfully.
Oct  2 08:22:10 np0005466012 systemd-logind[827]: Session 44 logged out. Waiting for processes to exit.
Oct  2 08:22:10 np0005466012 systemd-logind[827]: Removed session 44.
Oct  2 08:22:10 np0005466012 systemd-logind[827]: New session 45 of user nova.
Oct  2 08:22:10 np0005466012 systemd[1]: Started Session 45 of User nova.
Oct  2 08:22:10 np0005466012 systemd[1]: session-45.scope: Deactivated successfully.
Oct  2 08:22:10 np0005466012 systemd-logind[827]: Session 45 logged out. Waiting for processes to exit.
Oct  2 08:22:10 np0005466012 systemd-logind[827]: Removed session 45.
Oct  2 08:22:11 np0005466012 nova_compute[192063]: 2025-10-02 12:22:11.185 2 DEBUG nova.compute.manager [req-98e498b8-c3e5-4a33-ab5d-2ea8583e4528 req-c620dde8-50cb-44be-a5e6-1bf88305e933 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received event network-vif-plugged-fd508257-51ca-4c61-9340-029f9a9e7a75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:11 np0005466012 nova_compute[192063]: 2025-10-02 12:22:11.185 2 DEBUG oslo_concurrency.lockutils [req-98e498b8-c3e5-4a33-ab5d-2ea8583e4528 req-c620dde8-50cb-44be-a5e6-1bf88305e933 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:11 np0005466012 nova_compute[192063]: 2025-10-02 12:22:11.185 2 DEBUG oslo_concurrency.lockutils [req-98e498b8-c3e5-4a33-ab5d-2ea8583e4528 req-c620dde8-50cb-44be-a5e6-1bf88305e933 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:11 np0005466012 nova_compute[192063]: 2025-10-02 12:22:11.186 2 DEBUG oslo_concurrency.lockutils [req-98e498b8-c3e5-4a33-ab5d-2ea8583e4528 req-c620dde8-50cb-44be-a5e6-1bf88305e933 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:11 np0005466012 nova_compute[192063]: 2025-10-02 12:22:11.186 2 DEBUG nova.compute.manager [req-98e498b8-c3e5-4a33-ab5d-2ea8583e4528 req-c620dde8-50cb-44be-a5e6-1bf88305e933 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] No waiting events found dispatching network-vif-plugged-fd508257-51ca-4c61-9340-029f9a9e7a75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:11 np0005466012 nova_compute[192063]: 2025-10-02 12:22:11.186 2 WARNING nova.compute.manager [req-98e498b8-c3e5-4a33-ab5d-2ea8583e4528 req-c620dde8-50cb-44be-a5e6-1bf88305e933 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received unexpected event network-vif-plugged-fd508257-51ca-4c61-9340-029f9a9e7a75 for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  2 08:22:11 np0005466012 nova_compute[192063]: 2025-10-02 12:22:11.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:12 np0005466012 nova_compute[192063]: 2025-10-02 12:22:12.017 2 DEBUG nova.compute.manager [req-d464ba3c-1fed-4711-bfda-480dcf2c047d req-e1b8dda5-88cb-4782-80d7-ae89d6df46ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:12 np0005466012 nova_compute[192063]: 2025-10-02 12:22:12.017 2 DEBUG oslo_concurrency.lockutils [req-d464ba3c-1fed-4711-bfda-480dcf2c047d req-e1b8dda5-88cb-4782-80d7-ae89d6df46ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:12 np0005466012 nova_compute[192063]: 2025-10-02 12:22:12.017 2 DEBUG oslo_concurrency.lockutils [req-d464ba3c-1fed-4711-bfda-480dcf2c047d req-e1b8dda5-88cb-4782-80d7-ae89d6df46ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:12 np0005466012 nova_compute[192063]: 2025-10-02 12:22:12.017 2 DEBUG oslo_concurrency.lockutils [req-d464ba3c-1fed-4711-bfda-480dcf2c047d req-e1b8dda5-88cb-4782-80d7-ae89d6df46ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:12 np0005466012 nova_compute[192063]: 2025-10-02 12:22:12.017 2 DEBUG nova.compute.manager [req-d464ba3c-1fed-4711-bfda-480dcf2c047d req-e1b8dda5-88cb-4782-80d7-ae89d6df46ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:12 np0005466012 nova_compute[192063]: 2025-10-02 12:22:12.018 2 WARNING nova.compute.manager [req-d464ba3c-1fed-4711-bfda-480dcf2c047d req-e1b8dda5-88cb-4782-80d7-ae89d6df46ea 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:22:12 np0005466012 nova_compute[192063]: 2025-10-02 12:22:12.047 2 INFO nova.compute.manager [None req-40f6c946-b1c1-4fba-a810-c35ce3890909 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Get console output#033[00m
Oct  2 08:22:12 np0005466012 nova_compute[192063]: 2025-10-02 12:22:12.285 2 INFO nova.network.neutron [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Updating port fd508257-51ca-4c61-9340-029f9a9e7a75 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Oct  2 08:22:13 np0005466012 nova_compute[192063]: 2025-10-02 12:22:13.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:13 np0005466012 nova_compute[192063]: 2025-10-02 12:22:13.960 2 DEBUG oslo_concurrency.lockutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "refresh_cache-2eb08e64-4af9-4c5f-9817-b24d5e5ccce2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:22:13 np0005466012 nova_compute[192063]: 2025-10-02 12:22:13.961 2 DEBUG oslo_concurrency.lockutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquired lock "refresh_cache-2eb08e64-4af9-4c5f-9817-b24d5e5ccce2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:22:13 np0005466012 nova_compute[192063]: 2025-10-02 12:22:13.961 2 DEBUG nova.network.neutron [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:22:14 np0005466012 nova_compute[192063]: 2025-10-02 12:22:14.075 2 DEBUG nova.compute.manager [req-bd77332c-e66e-4baa-9c97-938ff65fdcdc req-2d328774-423b-467f-89ea-f8eac94d4b06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received event network-changed-fd508257-51ca-4c61-9340-029f9a9e7a75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:14 np0005466012 nova_compute[192063]: 2025-10-02 12:22:14.076 2 DEBUG nova.compute.manager [req-bd77332c-e66e-4baa-9c97-938ff65fdcdc req-2d328774-423b-467f-89ea-f8eac94d4b06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Refreshing instance network info cache due to event network-changed-fd508257-51ca-4c61-9340-029f9a9e7a75. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:22:14 np0005466012 nova_compute[192063]: 2025-10-02 12:22:14.076 2 DEBUG oslo_concurrency.lockutils [req-bd77332c-e66e-4baa-9c97-938ff65fdcdc req-2d328774-423b-467f-89ea-f8eac94d4b06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-2eb08e64-4af9-4c5f-9817-b24d5e5ccce2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.323 2 DEBUG nova.network.neutron [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Updating instance_info_cache with network_info: [{"id": "fd508257-51ca-4c61-9340-029f9a9e7a75", "address": "fa:16:3e:5e:8b:77", "network": {"id": "043fc82b-ca25-47f8-a78d-d7118d064ecd", "bridge": "br-int", "label": "tempest-network-smoke--1375280567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd508257-51", "ovs_interfaceid": "fd508257-51ca-4c61-9340-029f9a9e7a75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.372 2 DEBUG oslo_concurrency.lockutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Releasing lock "refresh_cache-2eb08e64-4af9-4c5f-9817-b24d5e5ccce2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.376 2 DEBUG oslo_concurrency.lockutils [req-bd77332c-e66e-4baa-9c97-938ff65fdcdc req-2d328774-423b-467f-89ea-f8eac94d4b06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-2eb08e64-4af9-4c5f-9817-b24d5e5ccce2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.376 2 DEBUG nova.network.neutron [req-bd77332c-e66e-4baa-9c97-938ff65fdcdc req-2d328774-423b-467f-89ea-f8eac94d4b06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Refreshing network info cache for port fd508257-51ca-4c61-9340-029f9a9e7a75 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.527 2 DEBUG nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.528 2 DEBUG nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.528 2 INFO nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Creating image(s)#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.529 2 DEBUG nova.objects.instance [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.686 2 DEBUG oslo_concurrency.processutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.780 2 DEBUG oslo_concurrency.processutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.782 2 DEBUG nova.virt.disk.api [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Checking if we can resize image /var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.783 2 DEBUG oslo_concurrency.processutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.877 2 DEBUG oslo_concurrency.processutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.878 2 DEBUG nova.virt.disk.api [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Cannot resize image /var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.899 2 DEBUG nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.900 2 DEBUG nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Ensure instance console log exists: /var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.900 2 DEBUG oslo_concurrency.lockutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.901 2 DEBUG oslo_concurrency.lockutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.901 2 DEBUG oslo_concurrency.lockutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.904 2 DEBUG nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Start _get_guest_xml network_info=[{"id": "fd508257-51ca-4c61-9340-029f9a9e7a75", "address": "fa:16:3e:5e:8b:77", "network": {"id": "043fc82b-ca25-47f8-a78d-d7118d064ecd", "bridge": "br-int", "label": "tempest-network-smoke--1375280567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1375280567", "vif_mac": "fa:16:3e:5e:8b:77"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd508257-51", "ovs_interfaceid": "fd508257-51ca-4c61-9340-029f9a9e7a75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.910 2 WARNING nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.915 2 DEBUG nova.virt.libvirt.host [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.916 2 DEBUG nova.virt.libvirt.host [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.919 2 DEBUG nova.virt.libvirt.host [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.920 2 DEBUG nova.virt.libvirt.host [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.921 2 DEBUG nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.921 2 DEBUG nova.virt.hardware [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.922 2 DEBUG nova.virt.hardware [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.922 2 DEBUG nova.virt.hardware [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.922 2 DEBUG nova.virt.hardware [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.923 2 DEBUG nova.virt.hardware [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.923 2 DEBUG nova.virt.hardware [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.923 2 DEBUG nova.virt.hardware [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.923 2 DEBUG nova.virt.hardware [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.924 2 DEBUG nova.virt.hardware [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.924 2 DEBUG nova.virt.hardware [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.924 2 DEBUG nova.virt.hardware [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.925 2 DEBUG nova.objects.instance [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:16 np0005466012 nova_compute[192063]: 2025-10-02 12:22:16.973 2 DEBUG oslo_concurrency.processutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.069 2 DEBUG oslo_concurrency.processutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk.config --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.070 2 DEBUG oslo_concurrency.lockutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "/var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.071 2 DEBUG oslo_concurrency.lockutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "/var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.072 2 DEBUG oslo_concurrency.lockutils [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "/var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.073 2 DEBUG nova.virt.libvirt.vif [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1275898317',display_name='tempest-TestNetworkAdvancedServerOps-server-1275898317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1275898317',id=112,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOCz+7JmNyQy7JdP1IjSwu02/HePNAJvzHsZBcv8XH13dMGPNzBUuwrRU02GRGGFMvEIz5Lu1u/RVTlkdJCGXW3q1BcgXBVQzMFZYW+dEdgXTOuU2vWkRuKj+JzgzmR88A==',key_name='tempest-TestNetworkAdvancedServerOps-217202803',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-x0vz0bnp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:22:11Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=2eb08e64-4af9-4c5f-9817-b24d5e5ccce2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fd508257-51ca-4c61-9340-029f9a9e7a75", "address": "fa:16:3e:5e:8b:77", "network": {"id": "043fc82b-ca25-47f8-a78d-d7118d064ecd", "bridge": "br-int", "label": "tempest-network-smoke--1375280567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1375280567", "vif_mac": "fa:16:3e:5e:8b:77"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd508257-51", "ovs_interfaceid": "fd508257-51ca-4c61-9340-029f9a9e7a75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.073 2 DEBUG nova.network.os_vif_util [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Converting VIF {"id": "fd508257-51ca-4c61-9340-029f9a9e7a75", "address": "fa:16:3e:5e:8b:77", "network": {"id": "043fc82b-ca25-47f8-a78d-d7118d064ecd", "bridge": "br-int", "label": "tempest-network-smoke--1375280567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1375280567", "vif_mac": "fa:16:3e:5e:8b:77"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd508257-51", "ovs_interfaceid": "fd508257-51ca-4c61-9340-029f9a9e7a75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.074 2 DEBUG nova.network.os_vif_util [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:8b:77,bridge_name='br-int',has_traffic_filtering=True,id=fd508257-51ca-4c61-9340-029f9a9e7a75,network=Network(043fc82b-ca25-47f8-a78d-d7118d064ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd508257-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.077 2 DEBUG nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  <uuid>2eb08e64-4af9-4c5f-9817-b24d5e5ccce2</uuid>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  <name>instance-00000070</name>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1275898317</nova:name>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:22:16</nova:creationTime>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:22:17 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:        <nova:user uuid="1faa7e121a0e43ad8cb4ae5b2cfcc6a2">tempest-TestNetworkAdvancedServerOps-597114071-project-member</nova:user>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:        <nova:project uuid="76c7dd40d83e4e3ca71abbebf57921b6">tempest-TestNetworkAdvancedServerOps-597114071</nova:project>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:        <nova:port uuid="fd508257-51ca-4c61-9340-029f9a9e7a75">
Oct  2 08:22:17 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <entry name="serial">2eb08e64-4af9-4c5f-9817-b24d5e5ccce2</entry>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <entry name="uuid">2eb08e64-4af9-4c5f-9817-b24d5e5ccce2</entry>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk.config"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:5e:8b:77"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <target dev="tapfd508257-51"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/console.log" append="off"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:22:17 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:22:17 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:22:17 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:22:17 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.078 2 DEBUG nova.virt.libvirt.vif [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1275898317',display_name='tempest-TestNetworkAdvancedServerOps-server-1275898317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1275898317',id=112,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOCz+7JmNyQy7JdP1IjSwu02/HePNAJvzHsZBcv8XH13dMGPNzBUuwrRU02GRGGFMvEIz5Lu1u/RVTlkdJCGXW3q1BcgXBVQzMFZYW+dEdgXTOuU2vWkRuKj+JzgzmR88A==',key_name='tempest-TestNetworkAdvancedServerOps-217202803',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-x0vz0bnp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:22:11Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=2eb08e64-4af9-4c5f-9817-b24d5e5ccce2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fd508257-51ca-4c61-9340-029f9a9e7a75", "address": "fa:16:3e:5e:8b:77", "network": {"id": "043fc82b-ca25-47f8-a78d-d7118d064ecd", "bridge": "br-int", "label": "tempest-network-smoke--1375280567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1375280567", "vif_mac": "fa:16:3e:5e:8b:77"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd508257-51", "ovs_interfaceid": "fd508257-51ca-4c61-9340-029f9a9e7a75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.078 2 DEBUG nova.network.os_vif_util [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Converting VIF {"id": "fd508257-51ca-4c61-9340-029f9a9e7a75", "address": "fa:16:3e:5e:8b:77", "network": {"id": "043fc82b-ca25-47f8-a78d-d7118d064ecd", "bridge": "br-int", "label": "tempest-network-smoke--1375280567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1375280567", "vif_mac": "fa:16:3e:5e:8b:77"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd508257-51", "ovs_interfaceid": "fd508257-51ca-4c61-9340-029f9a9e7a75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.079 2 DEBUG nova.network.os_vif_util [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:8b:77,bridge_name='br-int',has_traffic_filtering=True,id=fd508257-51ca-4c61-9340-029f9a9e7a75,network=Network(043fc82b-ca25-47f8-a78d-d7118d064ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd508257-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.079 2 DEBUG os_vif [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:8b:77,bridge_name='br-int',has_traffic_filtering=True,id=fd508257-51ca-4c61-9340-029f9a9e7a75,network=Network(043fc82b-ca25-47f8-a78d-d7118d064ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd508257-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.080 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.081 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.085 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd508257-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.085 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfd508257-51, col_values=(('external_ids', {'iface-id': 'fd508257-51ca-4c61-9340-029f9a9e7a75', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:8b:77', 'vm-uuid': '2eb08e64-4af9-4c5f-9817-b24d5e5ccce2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:17 np0005466012 NetworkManager[51207]: <info>  [1759407737.0878] manager: (tapfd508257-51): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.095 2 INFO os_vif [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:8b:77,bridge_name='br-int',has_traffic_filtering=True,id=fd508257-51ca-4c61-9340-029f9a9e7a75,network=Network(043fc82b-ca25-47f8-a78d-d7118d064ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd508257-51')#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.192 2 DEBUG nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.193 2 DEBUG nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.193 2 DEBUG nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] No VIF found with MAC fa:16:3e:5e:8b:77, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.194 2 INFO nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Using config drive#033[00m
Oct  2 08:22:17 np0005466012 kernel: tapfd508257-51: entered promiscuous mode
Oct  2 08:22:17 np0005466012 NetworkManager[51207]: <info>  [1759407737.2615] manager: (tapfd508257-51): new Tun device (/org/freedesktop/NetworkManager/Devices/184)
Oct  2 08:22:17 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:17Z|00403|binding|INFO|Claiming lport fd508257-51ca-4c61-9340-029f9a9e7a75 for this chassis.
Oct  2 08:22:17 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:17Z|00404|binding|INFO|fd508257-51ca-4c61-9340-029f9a9e7a75: Claiming fa:16:3e:5e:8b:77 10.100.0.10
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:17 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:17Z|00405|binding|INFO|Setting lport fd508257-51ca-4c61-9340-029f9a9e7a75 ovn-installed in OVS
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:17 np0005466012 systemd-machined[152114]: New machine qemu-51-instance-00000070.
Oct  2 08:22:17 np0005466012 systemd-udevd[236157]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:22:17 np0005466012 systemd[1]: Started Virtual Machine qemu-51-instance-00000070.
Oct  2 08:22:17 np0005466012 NetworkManager[51207]: <info>  [1759407737.3328] device (tapfd508257-51): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:22:17 np0005466012 NetworkManager[51207]: <info>  [1759407737.3342] device (tapfd508257-51): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:22:17 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:17Z|00406|binding|INFO|Setting lport fd508257-51ca-4c61-9340-029f9a9e7a75 up in Southbound
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.352 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:8b:77 10.100.0.10'], port_security=['fa:16:3e:5e:8b:77 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2eb08e64-4af9-4c5f-9817-b24d5e5ccce2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-043fc82b-ca25-47f8-a78d-d7118d064ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'cc787597-8604-4a47-984f-e7d594779894', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.242'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=007edb9e-bf02-4e5b-b812-8540d6b44a38, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=fd508257-51ca-4c61-9340-029f9a9e7a75) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.354 103246 INFO neutron.agent.ovn.metadata.agent [-] Port fd508257-51ca-4c61-9340-029f9a9e7a75 in datapath 043fc82b-ca25-47f8-a78d-d7118d064ecd bound to our chassis#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.355 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 043fc82b-ca25-47f8-a78d-d7118d064ecd#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.372 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[75f38aa8-fd39-47fe-91fe-2e4c601eb3ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.373 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap043fc82b-c1 in ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.380 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap043fc82b-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.380 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a54ca8-7409-45b9-8601-34d237c0c96f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.381 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[98deb8df-2b4d-4e09-964a-cbd5ab5cbc92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.402 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[f351255c-353e-437f-8e34-98858d827c99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.423 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[294f6450-eb38-40f4-8aac-591dcc2755a3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.473 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[6e5a5e31-1b43-4aad-a76e-8579f0cc587c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.481 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[31a45eb5-5c32-4702-88ce-bc05606b4812]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:17 np0005466012 NetworkManager[51207]: <info>  [1759407737.4824] manager: (tap043fc82b-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/185)
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.524 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[8665d34b-1a9b-4994-998e-c965ccca4cc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.533 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[22c3c358-99af-4538-8074-038ac19cfdd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:17 np0005466012 NetworkManager[51207]: <info>  [1759407737.5576] device (tap043fc82b-c0): carrier: link connected
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.564 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[810139b0-6b3a-4eb8-b6f1-f015fbcee6a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.580 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ab3395e9-d9cf-4731-8521-b56daac41256]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap043fc82b-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:ff:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573116, 'reachable_time': 35023, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236190, 'error': None, 'target': 'ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.600 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[86eb136f-6dcc-4533-8bde-96b4c48341df]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedb:fff1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 573116, 'tstamp': 573116}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236191, 'error': None, 'target': 'ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.619 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8cd44f19-d2d4-4105-93a5-bd93234e74a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap043fc82b-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:ff:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573116, 'reachable_time': 35023, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236192, 'error': None, 'target': 'ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.660 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fec3a8bc-6575-4bc2-8805-cd2629a38848]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.715 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[eefac22c-1b9e-4988-a709-e50392f8f382]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.717 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap043fc82b-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.718 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.718 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap043fc82b-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:17 np0005466012 NetworkManager[51207]: <info>  [1759407737.7208] manager: (tap043fc82b-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Oct  2 08:22:17 np0005466012 kernel: tap043fc82b-c0: entered promiscuous mode
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.724 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap043fc82b-c0, col_values=(('external_ids', {'iface-id': 'b26dbb45-d584-4e58-871b-0b97c246a793'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:17 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:17Z|00407|binding|INFO|Releasing lport b26dbb45-d584-4e58-871b-0b97c246a793 from this chassis (sb_readonly=0)
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.726 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/043fc82b-ca25-47f8-a78d-d7118d064ecd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/043fc82b-ca25-47f8-a78d-d7118d064ecd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.727 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[74aa5325-4f4b-4264-bf1c-e5569d8bbc37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.728 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-043fc82b-ca25-47f8-a78d-d7118d064ecd
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/043fc82b-ca25-47f8-a78d-d7118d064ecd.pid.haproxy
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 043fc82b-ca25-47f8-a78d-d7118d064ecd
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  2 08:22:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:17.730 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd', 'env', 'PROCESS_TAG=haproxy-043fc82b-ca25-47f8-a78d-d7118d064ecd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/043fc82b-ca25-47f8-a78d-d7118d064ecd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.984 2 DEBUG oslo_concurrency.lockutils [None req-aa251d9e-d632-4312-a59d-ec90525e1fd6 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.985 2 DEBUG oslo_concurrency.lockutils [None req-aa251d9e-d632-4312-a59d-ec90525e1fd6 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.985 2 DEBUG nova.compute.manager [None req-aa251d9e-d632-4312-a59d-ec90525e1fd6 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.990 2 DEBUG nova.compute.manager [None req-aa251d9e-d632-4312-a59d-ec90525e1fd6 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Oct  2 08:22:17 np0005466012 nova_compute[192063]: 2025-10-02 12:22:17.991 2 DEBUG nova.objects.instance [None req-aa251d9e-d632-4312-a59d-ec90525e1fd6 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'flavor' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.013 2 DEBUG nova.objects.instance [None req-aa251d9e-d632-4312-a59d-ec90525e1fd6 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'info_cache' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.046 2 DEBUG nova.virt.libvirt.driver [None req-aa251d9e-d632-4312-a59d-ec90525e1fd6 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Oct  2 08:22:18 np0005466012 podman[236230]: 2025-10-02 12:22:18.12953312 +0000 UTC m=+0.081210424 container create 803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:22:18 np0005466012 podman[236230]: 2025-10-02 12:22:18.077605348 +0000 UTC m=+0.029282692 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:22:18 np0005466012 systemd[1]: Started libpod-conmon-803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b.scope.
Oct  2 08:22:18 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:22:18 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d379051499e39f1645485b18a1774dfab3bbbbe14108eddeb1a95ae5b7688432/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.224 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407738.2244072, 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.225 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] VM Resumed (Lifecycle Event)
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.227 2 DEBUG nova.compute.manager [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.230 2 INFO nova.virt.libvirt.driver [-] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Instance running successfully.
Oct  2 08:22:18 np0005466012 virtqemud[191783]: argument unsupported: QEMU guest agent is not configured
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.233 2 DEBUG nova.virt.libvirt.guest [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.234 2 DEBUG nova.virt.libvirt.driver [None req-a68f2946-0a48-4eec-9d14-00714ea30922 cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Oct  2 08:22:18 np0005466012 podman[236230]: 2025-10-02 12:22:18.283516772 +0000 UTC m=+0.235194066 container init 803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:22:18 np0005466012 podman[236230]: 2025-10-02 12:22:18.290332008 +0000 UTC m=+0.242009302 container start 803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:22:18 np0005466012 neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd[236245]: [NOTICE]   (236249) : New worker (236251) forked
Oct  2 08:22:18 np0005466012 neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd[236245]: [NOTICE]   (236249) : Loading success.
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.349 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.354 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.390 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] During sync_power_state the instance has a pending task (resize_finish). Skip.
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.391 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407738.2269404, 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.391 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] VM Started (Lifecycle Event)
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.449 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.453 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.516 2 DEBUG nova.compute.manager [req-a175e590-5865-4302-b18d-1f463f233927 req-c7e392aa-efc4-450e-8761-096b6d121a14 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received event network-vif-plugged-fd508257-51ca-4c61-9340-029f9a9e7a75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.516 2 DEBUG oslo_concurrency.lockutils [req-a175e590-5865-4302-b18d-1f463f233927 req-c7e392aa-efc4-450e-8761-096b6d121a14 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.517 2 DEBUG oslo_concurrency.lockutils [req-a175e590-5865-4302-b18d-1f463f233927 req-c7e392aa-efc4-450e-8761-096b6d121a14 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.522 2 DEBUG oslo_concurrency.lockutils [req-a175e590-5865-4302-b18d-1f463f233927 req-c7e392aa-efc4-450e-8761-096b6d121a14 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.522 2 DEBUG nova.compute.manager [req-a175e590-5865-4302-b18d-1f463f233927 req-c7e392aa-efc4-450e-8761-096b6d121a14 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] No waiting events found dispatching network-vif-plugged-fd508257-51ca-4c61-9340-029f9a9e7a75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:22:18 np0005466012 nova_compute[192063]: 2025-10-02 12:22:18.523 2 WARNING nova.compute.manager [req-a175e590-5865-4302-b18d-1f463f233927 req-c7e392aa-efc4-450e-8761-096b6d121a14 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received unexpected event network-vif-plugged-fd508257-51ca-4c61-9340-029f9a9e7a75 for instance with vm_state active and task_state resize_finish.
Oct  2 08:22:20 np0005466012 nova_compute[192063]: 2025-10-02 12:22:20.007 2 DEBUG nova.network.neutron [req-bd77332c-e66e-4baa-9c97-938ff65fdcdc req-2d328774-423b-467f-89ea-f8eac94d4b06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Updated VIF entry in instance network info cache for port fd508257-51ca-4c61-9340-029f9a9e7a75. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:22:20 np0005466012 nova_compute[192063]: 2025-10-02 12:22:20.008 2 DEBUG nova.network.neutron [req-bd77332c-e66e-4baa-9c97-938ff65fdcdc req-2d328774-423b-467f-89ea-f8eac94d4b06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Updating instance_info_cache with network_info: [{"id": "fd508257-51ca-4c61-9340-029f9a9e7a75", "address": "fa:16:3e:5e:8b:77", "network": {"id": "043fc82b-ca25-47f8-a78d-d7118d064ecd", "bridge": "br-int", "label": "tempest-network-smoke--1375280567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd508257-51", "ovs_interfaceid": "fd508257-51ca-4c61-9340-029f9a9e7a75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:22:20 np0005466012 nova_compute[192063]: 2025-10-02 12:22:20.024 2 DEBUG oslo_concurrency.lockutils [req-bd77332c-e66e-4baa-9c97-938ff65fdcdc req-2d328774-423b-467f-89ea-f8eac94d4b06 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-2eb08e64-4af9-4c5f-9817-b24d5e5ccce2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:22:20 np0005466012 systemd[1]: Stopping User Manager for UID 42436...
Oct  2 08:22:20 np0005466012 systemd[235833]: Activating special unit Exit the Session...
Oct  2 08:22:20 np0005466012 systemd[235833]: Stopped target Main User Target.
Oct  2 08:22:20 np0005466012 systemd[235833]: Stopped target Basic System.
Oct  2 08:22:20 np0005466012 systemd[235833]: Stopped target Paths.
Oct  2 08:22:20 np0005466012 systemd[235833]: Stopped target Sockets.
Oct  2 08:22:20 np0005466012 systemd[235833]: Stopped target Timers.
Oct  2 08:22:20 np0005466012 systemd[235833]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  2 08:22:20 np0005466012 systemd[235833]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  2 08:22:20 np0005466012 systemd[235833]: Closed D-Bus User Message Bus Socket.
Oct  2 08:22:20 np0005466012 systemd[235833]: Stopped Create User's Volatile Files and Directories.
Oct  2 08:22:20 np0005466012 systemd[235833]: Removed slice User Application Slice.
Oct  2 08:22:20 np0005466012 systemd[235833]: Reached target Shutdown.
Oct  2 08:22:20 np0005466012 systemd[235833]: Finished Exit the Session.
Oct  2 08:22:20 np0005466012 systemd[235833]: Reached target Exit the Session.
Oct  2 08:22:20 np0005466012 systemd[1]: user@42436.service: Deactivated successfully.
Oct  2 08:22:20 np0005466012 systemd[1]: Stopped User Manager for UID 42436.
Oct  2 08:22:20 np0005466012 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  2 08:22:20 np0005466012 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  2 08:22:20 np0005466012 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  2 08:22:20 np0005466012 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  2 08:22:20 np0005466012 systemd[1]: Removed slice User Slice of UID 42436.
Oct  2 08:22:20 np0005466012 nova_compute[192063]: 2025-10-02 12:22:20.752 2 DEBUG nova.compute.manager [req-b64f7b32-ff29-4631-992d-5cd9742bafc7 req-6d10e4a0-7508-4187-8a53-dfb10c4a695e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received event network-vif-plugged-fd508257-51ca-4c61-9340-029f9a9e7a75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:22:20 np0005466012 nova_compute[192063]: 2025-10-02 12:22:20.753 2 DEBUG oslo_concurrency.lockutils [req-b64f7b32-ff29-4631-992d-5cd9742bafc7 req-6d10e4a0-7508-4187-8a53-dfb10c4a695e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:22:20 np0005466012 nova_compute[192063]: 2025-10-02 12:22:20.753 2 DEBUG oslo_concurrency.lockutils [req-b64f7b32-ff29-4631-992d-5cd9742bafc7 req-6d10e4a0-7508-4187-8a53-dfb10c4a695e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:22:20 np0005466012 nova_compute[192063]: 2025-10-02 12:22:20.753 2 DEBUG oslo_concurrency.lockutils [req-b64f7b32-ff29-4631-992d-5cd9742bafc7 req-6d10e4a0-7508-4187-8a53-dfb10c4a695e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:22:20 np0005466012 nova_compute[192063]: 2025-10-02 12:22:20.754 2 DEBUG nova.compute.manager [req-b64f7b32-ff29-4631-992d-5cd9742bafc7 req-6d10e4a0-7508-4187-8a53-dfb10c4a695e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] No waiting events found dispatching network-vif-plugged-fd508257-51ca-4c61-9340-029f9a9e7a75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:22:20 np0005466012 nova_compute[192063]: 2025-10-02 12:22:20.754 2 WARNING nova.compute.manager [req-b64f7b32-ff29-4631-992d-5cd9742bafc7 req-6d10e4a0-7508-4187-8a53-dfb10c4a695e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received unexpected event network-vif-plugged-fd508257-51ca-4c61-9340-029f9a9e7a75 for instance with vm_state resized and task_state None.
Oct  2 08:22:21 np0005466012 nova_compute[192063]: 2025-10-02 12:22:21.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:22 np0005466012 nova_compute[192063]: 2025-10-02 12:22:22.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:22 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:22Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0a:b9:ae 10.100.0.12
Oct  2 08:22:22 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:22Z|00408|binding|INFO|Releasing lport b26dbb45-d584-4e58-871b-0b97c246a793 from this chassis (sb_readonly=0)
Oct  2 08:22:22 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:22Z|00409|binding|INFO|Releasing lport 38f1ac16-18c6-4b4a-b769-ebc7dd5181d4 from this chassis (sb_readonly=0)
Oct  2 08:22:22 np0005466012 nova_compute[192063]: 2025-10-02 12:22:22.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:23 np0005466012 podman[236270]: 2025-10-02 12:22:23.129978774 +0000 UTC m=+0.046013134 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:22:23 np0005466012 podman[236271]: 2025-10-02 12:22:23.157990183 +0000 UTC m=+0.073210930 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:22:23 np0005466012 nova_compute[192063]: 2025-10-02 12:22:23.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:26 np0005466012 nova_compute[192063]: 2025-10-02 12:22:26.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:26 np0005466012 nova_compute[192063]: 2025-10-02 12:22:26.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:27 np0005466012 nova_compute[192063]: 2025-10-02 12:22:27.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:27 np0005466012 podman[236318]: 2025-10-02 12:22:27.144478723 +0000 UTC m=+0.056649797 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:22:27 np0005466012 nova_compute[192063]: 2025-10-02 12:22:27.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:27 np0005466012 nova_compute[192063]: 2025-10-02 12:22:27.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:27 np0005466012 nova_compute[192063]: 2025-10-02 12:22:27.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:27 np0005466012 nova_compute[192063]: 2025-10-02 12:22:27.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:22:28 np0005466012 nova_compute[192063]: 2025-10-02 12:22:28.093 2 DEBUG nova.virt.libvirt.driver [None req-aa251d9e-d632-4312-a59d-ec90525e1fd6 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Oct  2 08:22:28 np0005466012 nova_compute[192063]: 2025-10-02 12:22:28.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:29 np0005466012 podman[236338]: 2025-10-02 12:22:29.188718257 +0000 UTC m=+0.092835058 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:22:29 np0005466012 nova_compute[192063]: 2025-10-02 12:22:29.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:29 np0005466012 nova_compute[192063]: 2025-10-02 12:22:29.844 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:29 np0005466012 nova_compute[192063]: 2025-10-02 12:22:29.844 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:29 np0005466012 nova_compute[192063]: 2025-10-02 12:22:29.844 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:29 np0005466012 nova_compute[192063]: 2025-10-02 12:22:29.844 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:22:29 np0005466012 nova_compute[192063]: 2025-10-02 12:22:29.910 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.007 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.009 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.066 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.072 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.150 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.151 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.212 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:22:30 np0005466012 kernel: tapd1031883-21 (unregistering): left promiscuous mode
Oct  2 08:22:30 np0005466012 NetworkManager[51207]: <info>  [1759407750.2373] device (tapd1031883-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:22:30 np0005466012 virtqemud[191783]: An error occurred, but the cause is unknown
Oct  2 08:22:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:30Z|00410|binding|INFO|Releasing lport d1031883-2135-4183-8a9d-0609c32ad14b from this chassis (sb_readonly=0)
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:30Z|00411|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b down in Southbound
Oct  2 08:22:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:30Z|00412|binding|INFO|Removing iface tapd1031883-21 ovn-installed in OVS
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.296 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:b9:ae 10.100.0.12'], port_security=['fa:16:3e:0a:b9:ae 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a04f937a-375f-4fb0-90fe-5f514a88668f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c0383701-0ec7-4f3b-8585-5effc4f5ca5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.248', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c0aa38-5fd8-41c7-b4bf-85b59722c5c3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d1031883-2135-4183-8a9d-0609c32ad14b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.298 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d1031883-2135-4183-8a9d-0609c32ad14b in datapath a04f937a-375f-4fb0-90fe-5f514a88668f unbound from our chassis#033[00m
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.299 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a04f937a-375f-4fb0-90fe-5f514a88668f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.300 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c97d5002-e50c-499e-9828-f7bc1a8990db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.300 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f namespace which is not needed anymore#033[00m
Oct  2 08:22:30 np0005466012 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Oct  2 08:22:30 np0005466012 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000006f.scope: Consumed 13.955s CPU time.
Oct  2 08:22:30 np0005466012 systemd-machined[152114]: Machine qemu-50-instance-0000006f terminated.
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.390 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.391 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5351MB free_disk=73.33013153076172GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.391 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.392 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:30 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236056]: [NOTICE]   (236060) : haproxy version is 2.8.14-c23fe91
Oct  2 08:22:30 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236056]: [NOTICE]   (236060) : path to executable is /usr/sbin/haproxy
Oct  2 08:22:30 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236056]: [WARNING]  (236060) : Exiting Master process...
Oct  2 08:22:30 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236056]: [ALERT]    (236060) : Current worker (236062) exited with code 143 (Terminated)
Oct  2 08:22:30 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236056]: [WARNING]  (236060) : All workers exited. Exiting... (0)
Oct  2 08:22:30 np0005466012 systemd[1]: libpod-2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462.scope: Deactivated successfully.
Oct  2 08:22:30 np0005466012 podman[236393]: 2025-10-02 12:22:30.433240607 +0000 UTC m=+0.045496469 container died 2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:22:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462-userdata-shm.mount: Deactivated successfully.
Oct  2 08:22:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay-057360803f0d995b80312af77b88ab5d4b0e41b5410aaa1a975c20e4a2bf2f0d-merged.mount: Deactivated successfully.
Oct  2 08:22:30 np0005466012 podman[236393]: 2025-10-02 12:22:30.47018138 +0000 UTC m=+0.082437262 container cleanup 2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:22:30 np0005466012 systemd[1]: libpod-conmon-2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462.scope: Deactivated successfully.
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.521 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance ae56113d-001e-4f10-9236-c07fe5146d9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.521 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.521 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.521 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:22:30 np0005466012 podman[236424]: 2025-10-02 12:22:30.540158706 +0000 UTC m=+0.046386284 container remove 2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.548 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[15fdef92-b6e9-46bf-b567-97286110f1ae]: (4, ('Thu Oct  2 12:22:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f (2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462)\n2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462\nThu Oct  2 12:22:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f (2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462)\n2027aaca97c7ff523c0727380cb91955eb089644284dc49477e9c2281f723462\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.550 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d211c0fc-9345-4210-8065-64ceb8ad7a18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.553 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa04f937a-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:30 np0005466012 kernel: tapa04f937a-30: left promiscuous mode
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.604 2 DEBUG nova.compute.manager [req-b661e9ec-6af4-47b1-8faa-eb79a8a5a33c req-51a83121-db3b-4175-9f6c-6a9fcd44c1a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.604 2 DEBUG oslo_concurrency.lockutils [req-b661e9ec-6af4-47b1-8faa-eb79a8a5a33c req-51a83121-db3b-4175-9f6c-6a9fcd44c1a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.605 2 DEBUG oslo_concurrency.lockutils [req-b661e9ec-6af4-47b1-8faa-eb79a8a5a33c req-51a83121-db3b-4175-9f6c-6a9fcd44c1a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.605 2 DEBUG oslo_concurrency.lockutils [req-b661e9ec-6af4-47b1-8faa-eb79a8a5a33c req-51a83121-db3b-4175-9f6c-6a9fcd44c1a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.605 2 DEBUG nova.compute.manager [req-b661e9ec-6af4-47b1-8faa-eb79a8a5a33c req-51a83121-db3b-4175-9f6c-6a9fcd44c1a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.605 2 WARNING nova.compute.manager [req-b661e9ec-6af4-47b1-8faa-eb79a8a5a33c req-51a83121-db3b-4175-9f6c-6a9fcd44c1a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state powering-off.#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.613 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dba3a725-2e69-47fc-bb61-afd4e5b7a525]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.652 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[de7f56eb-3292-4cf7-92da-fca17574e584]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.654 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9f97a594-dc5c-45dd-aa30-200fa4088a96]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.670 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6d9f22d6-eea4-40cd-98c9-d692109ce3ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572198, 'reachable_time': 30687, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236460, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.672 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:22:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:30.673 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[98801a30-5587-4fd1-913a-b1a03ad19c78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:30 np0005466012 systemd[1]: run-netns-ovnmeta\x2da04f937a\x2d375f\x2d4fb0\x2d90fe\x2d5f514a88668f.mount: Deactivated successfully.
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.678 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.702 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.736 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.736 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:30 np0005466012 nova_compute[192063]: 2025-10-02 12:22:30.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:31 np0005466012 nova_compute[192063]: 2025-10-02 12:22:31.107 2 INFO nova.virt.libvirt.driver [None req-aa251d9e-d632-4312-a59d-ec90525e1fd6 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance shutdown successfully after 13 seconds.#033[00m
Oct  2 08:22:31 np0005466012 nova_compute[192063]: 2025-10-02 12:22:31.112 2 INFO nova.virt.libvirt.driver [-] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance destroyed successfully.#033[00m
Oct  2 08:22:31 np0005466012 nova_compute[192063]: 2025-10-02 12:22:31.112 2 DEBUG nova.objects.instance [None req-aa251d9e-d632-4312-a59d-ec90525e1fd6 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'numa_topology' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:31 np0005466012 nova_compute[192063]: 2025-10-02 12:22:31.132 2 DEBUG nova.compute.manager [None req-aa251d9e-d632-4312-a59d-ec90525e1fd6 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:31 np0005466012 nova_compute[192063]: 2025-10-02 12:22:31.283 2 DEBUG oslo_concurrency.lockutils [None req-aa251d9e-d632-4312-a59d-ec90525e1fd6 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:31 np0005466012 nova_compute[192063]: 2025-10-02 12:22:31.732 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:31 np0005466012 nova_compute[192063]: 2025-10-02 12:22:31.820 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:22:31 np0005466012 nova_compute[192063]: 2025-10-02 12:22:31.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:32 np0005466012 nova_compute[192063]: 2025-10-02 12:22:32.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:32 np0005466012 nova_compute[192063]: 2025-10-02 12:22:32.130 2 DEBUG nova.objects.instance [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'flavor' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:32 np0005466012 nova_compute[192063]: 2025-10-02 12:22:32.185 2 DEBUG nova.objects.instance [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'info_cache' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:32 np0005466012 nova_compute[192063]: 2025-10-02 12:22:32.225 2 DEBUG oslo_concurrency.lockutils [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:22:32 np0005466012 nova_compute[192063]: 2025-10-02 12:22:32.226 2 DEBUG oslo_concurrency.lockutils [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquired lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:22:32 np0005466012 nova_compute[192063]: 2025-10-02 12:22:32.226 2 DEBUG nova.network.neutron [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:22:32 np0005466012 nova_compute[192063]: 2025-10-02 12:22:32.711 2 DEBUG nova.compute.manager [req-e11626e5-b90e-4df7-bd95-b097787f3b9d req-054dee3d-3e07-4318-9990-d8cd6a49724a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:32 np0005466012 nova_compute[192063]: 2025-10-02 12:22:32.711 2 DEBUG oslo_concurrency.lockutils [req-e11626e5-b90e-4df7-bd95-b097787f3b9d req-054dee3d-3e07-4318-9990-d8cd6a49724a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:32 np0005466012 nova_compute[192063]: 2025-10-02 12:22:32.711 2 DEBUG oslo_concurrency.lockutils [req-e11626e5-b90e-4df7-bd95-b097787f3b9d req-054dee3d-3e07-4318-9990-d8cd6a49724a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:32 np0005466012 nova_compute[192063]: 2025-10-02 12:22:32.712 2 DEBUG oslo_concurrency.lockutils [req-e11626e5-b90e-4df7-bd95-b097787f3b9d req-054dee3d-3e07-4318-9990-d8cd6a49724a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:32 np0005466012 nova_compute[192063]: 2025-10-02 12:22:32.712 2 DEBUG nova.compute.manager [req-e11626e5-b90e-4df7-bd95-b097787f3b9d req-054dee3d-3e07-4318-9990-d8cd6a49724a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:32 np0005466012 nova_compute[192063]: 2025-10-02 12:22:32.712 2 WARNING nova.compute.manager [req-e11626e5-b90e-4df7-bd95-b097787f3b9d req-054dee3d-3e07-4318-9990-d8cd6a49724a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state stopped and task_state powering-on.#033[00m
Oct  2 08:22:32 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:32Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5e:8b:77 10.100.0.10
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.301 2 DEBUG nova.network.neutron [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updating instance_info_cache with network_info: [{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.326 2 DEBUG oslo_concurrency.lockutils [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Releasing lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.354 2 INFO nova.virt.libvirt.driver [-] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance destroyed successfully.#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.354 2 DEBUG nova.objects.instance [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'numa_topology' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.364 2 DEBUG nova.objects.instance [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'resources' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.379 2 DEBUG nova.virt.libvirt.vif [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-161503604',display_name='tempest-ServerActionsTestJSON-server-161503604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-161503604',id=111,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJLom+UJzZg9dduKQv+725QaYDZoMXvP/xlpKnb/K05SGc4dkyLwCDweJ3QifTmxLWqK9Sz5A12yMJbzpa36v5C4bUqj8uiWk/vbR1BAjBdKM9d/Ug8M2nT8LwDBGP/9A==',key_name='tempest-keypair-1006285918',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='e564a4cad5d443dba81ec04d2a05ced9',ramdisk_id='',reservation_id='r-ntvf7r4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1646745100',owner_user_name='tempest-ServerActionsTestJSON-1646745100-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:22:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d54b1826121b47caba89932a78c06ccd',uuid=ae56113d-001e-4f10-9236-c07fe5146d9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.379 2 DEBUG nova.network.os_vif_util [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converting VIF {"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.380 2 DEBUG nova.network.os_vif_util [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.380 2 DEBUG os_vif [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.382 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1031883-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.388 2 INFO os_vif [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21')#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.393 2 DEBUG nova.virt.libvirt.driver [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Start _get_guest_xml network_info=[{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.397 2 WARNING nova.virt.libvirt.driver [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.400 2 DEBUG nova.virt.libvirt.host [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.401 2 DEBUG nova.virt.libvirt.host [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.404 2 DEBUG nova.virt.libvirt.host [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.404 2 DEBUG nova.virt.libvirt.host [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.405 2 DEBUG nova.virt.libvirt.driver [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.405 2 DEBUG nova.virt.hardware [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.406 2 DEBUG nova.virt.hardware [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.406 2 DEBUG nova.virt.hardware [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.406 2 DEBUG nova.virt.hardware [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.406 2 DEBUG nova.virt.hardware [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.407 2 DEBUG nova.virt.hardware [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.407 2 DEBUG nova.virt.hardware [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.407 2 DEBUG nova.virt.hardware [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.407 2 DEBUG nova.virt.hardware [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.408 2 DEBUG nova.virt.hardware [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.408 2 DEBUG nova.virt.hardware [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.408 2 DEBUG nova.objects.instance [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'vcpu_model' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.429 2 DEBUG nova.virt.libvirt.vif [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-161503604',display_name='tempest-ServerActionsTestJSON-server-161503604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-161503604',id=111,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJLom+UJzZg9dduKQv+725QaYDZoMXvP/xlpKnb/K05SGc4dkyLwCDweJ3QifTmxLWqK9Sz5A12yMJbzpa36v5C4bUqj8uiWk/vbR1BAjBdKM9d/Ug8M2nT8LwDBGP/9A==',key_name='tempest-keypair-1006285918',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='e564a4cad5d443dba81ec04d2a05ced9',ramdisk_id='',reservation_id='r-ntvf7r4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1646745100',owner_user_name='tempest-ServerActionsTestJSON-1646745100-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:22:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d54b1826121b47caba89932a78c06ccd',uuid=ae56113d-001e-4f10-9236-c07fe5146d9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.429 2 DEBUG nova.network.os_vif_util [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converting VIF {"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.431 2 DEBUG nova.network.os_vif_util [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.431 2 DEBUG nova.objects.instance [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'pci_devices' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.458 2 DEBUG nova.virt.libvirt.driver [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  <uuid>ae56113d-001e-4f10-9236-c07fe5146d9c</uuid>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  <name>instance-0000006f</name>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerActionsTestJSON-server-161503604</nova:name>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:22:34</nova:creationTime>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:22:34 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:        <nova:user uuid="d54b1826121b47caba89932a78c06ccd">tempest-ServerActionsTestJSON-1646745100-project-member</nova:user>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:        <nova:project uuid="e564a4cad5d443dba81ec04d2a05ced9">tempest-ServerActionsTestJSON-1646745100</nova:project>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:        <nova:port uuid="d1031883-2135-4183-8a9d-0609c32ad14b">
Oct  2 08:22:34 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <entry name="serial">ae56113d-001e-4f10-9236-c07fe5146d9c</entry>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <entry name="uuid">ae56113d-001e-4f10-9236-c07fe5146d9c</entry>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.config"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:0a:b9:ae"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <target dev="tapd1031883-21"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/console.log" append="off"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <input type="keyboard" bus="usb"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:22:34 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:22:34 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:22:34 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:22:34 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.459 2 DEBUG oslo_concurrency.processutils [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.520 2 DEBUG oslo_concurrency.processutils [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.521 2 DEBUG oslo_concurrency.processutils [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.582 2 DEBUG oslo_concurrency.processutils [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.584 2 DEBUG nova.objects.instance [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.597 2 DEBUG oslo_concurrency.processutils [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.656 2 DEBUG oslo_concurrency.processutils [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.658 2 DEBUG nova.virt.disk.api [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Checking if we can resize image /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.658 2 DEBUG oslo_concurrency.processutils [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.717 2 DEBUG oslo_concurrency.processutils [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.718 2 DEBUG nova.virt.disk.api [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Cannot resize image /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.718 2 DEBUG nova.objects.instance [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'migration_context' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.736 2 DEBUG nova.virt.libvirt.vif [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-161503604',display_name='tempest-ServerActionsTestJSON-server-161503604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-161503604',id=111,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJLom+UJzZg9dduKQv+725QaYDZoMXvP/xlpKnb/K05SGc4dkyLwCDweJ3QifTmxLWqK9Sz5A12yMJbzpa36v5C4bUqj8uiWk/vbR1BAjBdKM9d/Ug8M2nT8LwDBGP/9A==',key_name='tempest-keypair-1006285918',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='e564a4cad5d443dba81ec04d2a05ced9',ramdisk_id='',reservation_id='r-ntvf7r4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1646745100',owner_user_name='tempest-ServerActionsTestJSON-1646745100-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:22:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d54b1826121b47caba89932a78c06ccd',uuid=ae56113d-001e-4f10-9236-c07fe5146d9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.737 2 DEBUG nova.network.os_vif_util [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converting VIF {"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.738 2 DEBUG nova.network.os_vif_util [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.738 2 DEBUG os_vif [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.739 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.740 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.743 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1031883-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.743 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd1031883-21, col_values=(('external_ids', {'iface-id': 'd1031883-2135-4183-8a9d-0609c32ad14b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:b9:ae', 'vm-uuid': 'ae56113d-001e-4f10-9236-c07fe5146d9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:34 np0005466012 NetworkManager[51207]: <info>  [1759407754.7461] manager: (tapd1031883-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.750 2 INFO os_vif [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21')#033[00m
Oct  2 08:22:34 np0005466012 kernel: tapd1031883-21: entered promiscuous mode
Oct  2 08:22:34 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:34Z|00413|binding|INFO|Claiming lport d1031883-2135-4183-8a9d-0609c32ad14b for this chassis.
Oct  2 08:22:34 np0005466012 NetworkManager[51207]: <info>  [1759407754.8314] manager: (tapd1031883-21): new Tun device (/org/freedesktop/NetworkManager/Devices/188)
Oct  2 08:22:34 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:34Z|00414|binding|INFO|d1031883-2135-4183-8a9d-0609c32ad14b: Claiming fa:16:3e:0a:b9:ae 10.100.0.12
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:34.840 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:b9:ae 10.100.0.12'], port_security=['fa:16:3e:0a:b9:ae 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a04f937a-375f-4fb0-90fe-5f514a88668f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'c0383701-0ec7-4f3b-8585-5effc4f5ca5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.248'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c0aa38-5fd8-41c7-b4bf-85b59722c5c3, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d1031883-2135-4183-8a9d-0609c32ad14b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:22:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:34.841 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d1031883-2135-4183-8a9d-0609c32ad14b in datapath a04f937a-375f-4fb0-90fe-5f514a88668f bound to our chassis#033[00m
Oct  2 08:22:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:34.843 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a04f937a-375f-4fb0-90fe-5f514a88668f#033[00m
Oct  2 08:22:34 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:34Z|00415|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b ovn-installed in OVS
Oct  2 08:22:34 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:34Z|00416|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b up in Southbound
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:34 np0005466012 nova_compute[192063]: 2025-10-02 12:22:34.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:34.867 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e008893c-af78-4eee-b51f-aeaa021c04be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:34.868 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa04f937a-31 in ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:22:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:34.870 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa04f937a-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:22:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:34.870 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1fdeec07-0a9d-4b3a-a165-a8473f363be1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:34.871 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[abc1020a-f39b-45b8-87cf-d74b871a129d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:34 np0005466012 systemd-machined[152114]: New machine qemu-52-instance-0000006f.
Oct  2 08:22:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:34.888 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[7d8de7a3-e3c2-413f-89d2-34d9877fdd44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:34 np0005466012 systemd[1]: Started Virtual Machine qemu-52-instance-0000006f.
Oct  2 08:22:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:34.914 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[46bb849c-0542-4621-8063-6cf804917748]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:34 np0005466012 systemd-udevd[236499]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:22:34 np0005466012 NetworkManager[51207]: <info>  [1759407754.9447] device (tapd1031883-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:22:34 np0005466012 NetworkManager[51207]: <info>  [1759407754.9454] device (tapd1031883-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:22:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:34.950 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[c21cd0cc-6b1c-4402-961b-8bacf0906072]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:34 np0005466012 NetworkManager[51207]: <info>  [1759407754.9585] manager: (tapa04f937a-30): new Veth device (/org/freedesktop/NetworkManager/Devices/189)
Oct  2 08:22:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:34.957 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b60b966a-1c04-4921-8c5c-7eede057ac84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:34.997 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[10f51557-8c07-4964-acae-5bab7ae239a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.001 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b3b3de-2918-4183-8845-1827e1301af7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:35 np0005466012 NetworkManager[51207]: <info>  [1759407755.0252] device (tapa04f937a-30): carrier: link connected
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.030 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[f03ec82a-48ad-422b-b30c-ba5553aa22b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.047 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2226ca05-79db-4fc6-84e0-3f7f52097f77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa04f937a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:93:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574863, 'reachable_time': 36536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236529, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.064 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[705edf99-69c4-4f4a-8737-36d4b9095f6f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:9368'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574863, 'tstamp': 574863}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236530, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.096 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e755f652-2472-4e9e-a345-5bef5953f7ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa04f937a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:93:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574863, 'reachable_time': 36536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236531, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.133 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d0f9b357-0554-4d64-bed8-3ed1ea6556fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.219 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d8d4d4a0-8fdb-4ae4-ab97-19c4fe116698]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.220 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa04f937a-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.221 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.221 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa04f937a-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:35 np0005466012 nova_compute[192063]: 2025-10-02 12:22:35.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:35 np0005466012 NetworkManager[51207]: <info>  [1759407755.2248] manager: (tapa04f937a-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Oct  2 08:22:35 np0005466012 kernel: tapa04f937a-30: entered promiscuous mode
Oct  2 08:22:35 np0005466012 nova_compute[192063]: 2025-10-02 12:22:35.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.230 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa04f937a-30, col_values=(('external_ids', {'iface-id': '38f1ac16-18c6-4b4a-b769-ebc7dd5181d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:35 np0005466012 nova_compute[192063]: 2025-10-02 12:22:35.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:35 np0005466012 nova_compute[192063]: 2025-10-02 12:22:35.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.234 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a04f937a-375f-4fb0-90fe-5f514a88668f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a04f937a-375f-4fb0-90fe-5f514a88668f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.235 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f89f50b9-887c-45e1-81d1-b8936228635f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.236 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-a04f937a-375f-4fb0-90fe-5f514a88668f
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/a04f937a-375f-4fb0-90fe-5f514a88668f.pid.haproxy
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID a04f937a-375f-4fb0-90fe-5f514a88668f
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  2 08:22:35 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:35.237 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'env', 'PROCESS_TAG=haproxy-a04f937a-375f-4fb0-90fe-5f514a88668f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a04f937a-375f-4fb0-90fe-5f514a88668f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  2 08:22:35 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:35Z|00417|binding|INFO|Releasing lport 38f1ac16-18c6-4b4a-b769-ebc7dd5181d4 from this chassis (sb_readonly=0)
Oct  2 08:22:35 np0005466012 nova_compute[192063]: 2025-10-02 12:22:35.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:35 np0005466012 podman[236563]: 2025-10-02 12:22:35.692621916 +0000 UTC m=+0.062584315 container create 11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:22:35 np0005466012 podman[236563]: 2025-10-02 12:22:35.660027786 +0000 UTC m=+0.029990255 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:22:35 np0005466012 systemd[1]: Started libpod-conmon-11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145.scope.
Oct  2 08:22:35 np0005466012 podman[236577]: 2025-10-02 12:22:35.80180069 +0000 UTC m=+0.068724252 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350)
Oct  2 08:22:35 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:22:35 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2dd58116323318ec123f316c706664610a30789b7c9596b2048b855ae155b34/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:22:35 np0005466012 podman[236563]: 2025-10-02 12:22:35.826259167 +0000 UTC m=+0.196221596 container init 11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:22:35 np0005466012 podman[236576]: 2025-10-02 12:22:35.82707142 +0000 UTC m=+0.097414299 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:22:35 np0005466012 podman[236563]: 2025-10-02 12:22:35.834944205 +0000 UTC m=+0.204906594 container start 11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:22:35 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236613]: [NOTICE]   (236630) : New worker (236632) forked
Oct  2 08:22:35 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236613]: [NOTICE]   (236630) : Loading success.
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.216 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for ae56113d-001e-4f10-9236-c07fe5146d9c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.217 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407756.2164817, ae56113d-001e-4f10-9236-c07fe5146d9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.218 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] VM Resumed (Lifecycle Event)
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.222 2 DEBUG nova.compute.manager [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.230 2 INFO nova.virt.libvirt.driver [-] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance rebooted successfully.
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.230 2 DEBUG nova.compute.manager [None req-45af0b8f-ad04-4daf-8a22-ebc8bb40914e d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.242 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.248 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.272 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] During sync_power_state the instance has a pending task (powering-on). Skip.
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.273 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407756.2173593, ae56113d-001e-4f10-9236-c07fe5146d9c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.274 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] VM Started (Lifecycle Event)
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.301 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.305 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.971 2 DEBUG nova.compute.manager [req-9aaee499-8865-47b5-90f6-b71575ecd132 req-3de5cca5-b084-4b2f-8287-bcb27db4b8d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.972 2 DEBUG oslo_concurrency.lockutils [req-9aaee499-8865-47b5-90f6-b71575ecd132 req-3de5cca5-b084-4b2f-8287-bcb27db4b8d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.973 2 DEBUG oslo_concurrency.lockutils [req-9aaee499-8865-47b5-90f6-b71575ecd132 req-3de5cca5-b084-4b2f-8287-bcb27db4b8d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.973 2 DEBUG oslo_concurrency.lockutils [req-9aaee499-8865-47b5-90f6-b71575ecd132 req-3de5cca5-b084-4b2f-8287-bcb27db4b8d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.973 2 DEBUG nova.compute.manager [req-9aaee499-8865-47b5-90f6-b71575ecd132 req-3de5cca5-b084-4b2f-8287-bcb27db4b8d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.973 2 WARNING nova.compute.manager [req-9aaee499-8865-47b5-90f6-b71575ecd132 req-3de5cca5-b084-4b2f-8287-bcb27db4b8d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state None.
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.973 2 DEBUG nova.compute.manager [req-9aaee499-8865-47b5-90f6-b71575ecd132 req-3de5cca5-b084-4b2f-8287-bcb27db4b8d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.974 2 DEBUG oslo_concurrency.lockutils [req-9aaee499-8865-47b5-90f6-b71575ecd132 req-3de5cca5-b084-4b2f-8287-bcb27db4b8d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.974 2 DEBUG oslo_concurrency.lockutils [req-9aaee499-8865-47b5-90f6-b71575ecd132 req-3de5cca5-b084-4b2f-8287-bcb27db4b8d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.974 2 DEBUG oslo_concurrency.lockutils [req-9aaee499-8865-47b5-90f6-b71575ecd132 req-3de5cca5-b084-4b2f-8287-bcb27db4b8d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.974 2 DEBUG nova.compute.manager [req-9aaee499-8865-47b5-90f6-b71575ecd132 req-3de5cca5-b084-4b2f-8287-bcb27db4b8d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:22:36 np0005466012 nova_compute[192063]: 2025-10-02 12:22:36.974 2 WARNING nova.compute.manager [req-9aaee499-8865-47b5-90f6-b71575ecd132 req-3de5cca5-b084-4b2f-8287-bcb27db4b8d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state None.
Oct  2 08:22:39 np0005466012 nova_compute[192063]: 2025-10-02 12:22:39.165 2 INFO nova.compute.manager [None req-4fc8aed0-db70-45fa-9711-2055810758da 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Get console output
Oct  2 08:22:39 np0005466012 nova_compute[192063]: 2025-10-02 12:22:39.169 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  2 08:22:39 np0005466012 nova_compute[192063]: 2025-10-02 12:22:39.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:39 np0005466012 nova_compute[192063]: 2025-10-02 12:22:39.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:22:39 np0005466012 nova_compute[192063]: 2025-10-02 12:22:39.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:22:39 np0005466012 nova_compute[192063]: 2025-10-02 12:22:39.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.018 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.019 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.020 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.020 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:22:40 np0005466012 podman[236642]: 2025-10-02 12:22:40.146610618 +0000 UTC m=+0.052828277 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, tcib_managed=true, container_name=iscsid)
Oct  2 08:22:40 np0005466012 podman[236643]: 2025-10-02 12:22:40.153564066 +0000 UTC m=+0.052507617 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.600 2 DEBUG nova.compute.manager [req-1622a741-c258-45a6-a7e1-035f5ed1fd86 req-b60a2db2-6751-4b76-baaf-7311dc394ea9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received event network-changed-fd508257-51ca-4c61-9340-029f9a9e7a75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.601 2 DEBUG nova.compute.manager [req-1622a741-c258-45a6-a7e1-035f5ed1fd86 req-b60a2db2-6751-4b76-baaf-7311dc394ea9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Refreshing instance network info cache due to event network-changed-fd508257-51ca-4c61-9340-029f9a9e7a75. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.601 2 DEBUG oslo_concurrency.lockutils [req-1622a741-c258-45a6-a7e1-035f5ed1fd86 req-b60a2db2-6751-4b76-baaf-7311dc394ea9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-2eb08e64-4af9-4c5f-9817-b24d5e5ccce2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.602 2 DEBUG oslo_concurrency.lockutils [req-1622a741-c258-45a6-a7e1-035f5ed1fd86 req-b60a2db2-6751-4b76-baaf-7311dc394ea9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-2eb08e64-4af9-4c5f-9817-b24d5e5ccce2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.602 2 DEBUG nova.network.neutron [req-1622a741-c258-45a6-a7e1-035f5ed1fd86 req-b60a2db2-6751-4b76-baaf-7311dc394ea9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Refreshing network info cache for port fd508257-51ca-4c61-9340-029f9a9e7a75 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.852 2 DEBUG oslo_concurrency.lockutils [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.853 2 DEBUG oslo_concurrency.lockutils [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.854 2 DEBUG oslo_concurrency.lockutils [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.854 2 DEBUG oslo_concurrency.lockutils [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.855 2 DEBUG oslo_concurrency.lockutils [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.880 2 INFO nova.compute.manager [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Terminating instance
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.911 2 DEBUG nova.compute.manager [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:22:40 np0005466012 kernel: tapfd508257-51 (unregistering): left promiscuous mode
Oct  2 08:22:40 np0005466012 NetworkManager[51207]: <info>  [1759407760.9425] device (tapfd508257-51): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:22:40 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:40Z|00418|binding|INFO|Releasing lport fd508257-51ca-4c61-9340-029f9a9e7a75 from this chassis (sb_readonly=0)
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:22:40 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:40Z|00419|binding|INFO|Setting lport fd508257-51ca-4c61-9340-029f9a9e7a75 down in Southbound
Oct  2 08:22:40 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:40Z|00420|binding|INFO|Removing iface tapfd508257-51 ovn-installed in OVS
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:40.982 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:8b:77 10.100.0.10'], port_security=['fa:16:3e:5e:8b:77 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2eb08e64-4af9-4c5f-9817-b24d5e5ccce2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-043fc82b-ca25-47f8-a78d-d7118d064ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'cc787597-8604-4a47-984f-e7d594779894', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=007edb9e-bf02-4e5b-b812-8540d6b44a38, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=fd508257-51ca-4c61-9340-029f9a9e7a75) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:22:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:40.984 103246 INFO neutron.agent.ovn.metadata.agent [-] Port fd508257-51ca-4c61-9340-029f9a9e7a75 in datapath 043fc82b-ca25-47f8-a78d-d7118d064ecd unbound from our chassis#033[00m
Oct  2 08:22:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:40.985 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 043fc82b-ca25-47f8-a78d-d7118d064ecd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:22:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:40.987 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f7c251d9-d2f2-45eb-a2df-5d724ef5ca09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:40.987 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd namespace which is not needed anymore#033[00m
Oct  2 08:22:40 np0005466012 nova_compute[192063]: 2025-10-02 12:22:40.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:41 np0005466012 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000070.scope: Deactivated successfully.
Oct  2 08:22:41 np0005466012 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000070.scope: Consumed 14.448s CPU time.
Oct  2 08:22:41 np0005466012 systemd-machined[152114]: Machine qemu-51-instance-00000070 terminated.
Oct  2 08:22:41 np0005466012 neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd[236245]: [NOTICE]   (236249) : haproxy version is 2.8.14-c23fe91
Oct  2 08:22:41 np0005466012 neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd[236245]: [NOTICE]   (236249) : path to executable is /usr/sbin/haproxy
Oct  2 08:22:41 np0005466012 neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd[236245]: [WARNING]  (236249) : Exiting Master process...
Oct  2 08:22:41 np0005466012 neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd[236245]: [ALERT]    (236249) : Current worker (236251) exited with code 143 (Terminated)
Oct  2 08:22:41 np0005466012 neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd[236245]: [WARNING]  (236249) : All workers exited. Exiting... (0)
Oct  2 08:22:41 np0005466012 systemd[1]: libpod-803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b.scope: Deactivated successfully.
Oct  2 08:22:41 np0005466012 podman[236710]: 2025-10-02 12:22:41.111411731 +0000 UTC m=+0.042942786 container died 803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:41 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b-userdata-shm.mount: Deactivated successfully.
Oct  2 08:22:41 np0005466012 systemd[1]: var-lib-containers-storage-overlay-d379051499e39f1645485b18a1774dfab3bbbbe14108eddeb1a95ae5b7688432-merged.mount: Deactivated successfully.
Oct  2 08:22:41 np0005466012 podman[236710]: 2025-10-02 12:22:41.194431198 +0000 UTC m=+0.125962253 container cleanup 803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.196 2 INFO nova.virt.libvirt.driver [-] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Instance destroyed successfully.#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.196 2 DEBUG nova.objects.instance [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'resources' on Instance uuid 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:41 np0005466012 systemd[1]: libpod-conmon-803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b.scope: Deactivated successfully.
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.217 2 DEBUG nova.virt.libvirt.vif [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1275898317',display_name='tempest-TestNetworkAdvancedServerOps-server-1275898317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1275898317',id=112,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOCz+7JmNyQy7JdP1IjSwu02/HePNAJvzHsZBcv8XH13dMGPNzBUuwrRU02GRGGFMvEIz5Lu1u/RVTlkdJCGXW3q1BcgXBVQzMFZYW+dEdgXTOuU2vWkRuKj+JzgzmR88A==',key_name='tempest-TestNetworkAdvancedServerOps-217202803',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:22:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-x0vz0bnp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:22:23Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=2eb08e64-4af9-4c5f-9817-b24d5e5ccce2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fd508257-51ca-4c61-9340-029f9a9e7a75", "address": "fa:16:3e:5e:8b:77", "network": {"id": "043fc82b-ca25-47f8-a78d-d7118d064ecd", "bridge": "br-int", "label": "tempest-network-smoke--1375280567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd508257-51", "ovs_interfaceid": "fd508257-51ca-4c61-9340-029f9a9e7a75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.218 2 DEBUG nova.network.os_vif_util [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "fd508257-51ca-4c61-9340-029f9a9e7a75", "address": "fa:16:3e:5e:8b:77", "network": {"id": "043fc82b-ca25-47f8-a78d-d7118d064ecd", "bridge": "br-int", "label": "tempest-network-smoke--1375280567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd508257-51", "ovs_interfaceid": "fd508257-51ca-4c61-9340-029f9a9e7a75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.218 2 DEBUG nova.network.os_vif_util [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5e:8b:77,bridge_name='br-int',has_traffic_filtering=True,id=fd508257-51ca-4c61-9340-029f9a9e7a75,network=Network(043fc82b-ca25-47f8-a78d-d7118d064ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd508257-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.219 2 DEBUG os_vif [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:8b:77,bridge_name='br-int',has_traffic_filtering=True,id=fd508257-51ca-4c61-9340-029f9a9e7a75,network=Network(043fc82b-ca25-47f8-a78d-d7118d064ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd508257-51') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.220 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd508257-51, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.226 2 INFO os_vif [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:8b:77,bridge_name='br-int',has_traffic_filtering=True,id=fd508257-51ca-4c61-9340-029f9a9e7a75,network=Network(043fc82b-ca25-47f8-a78d-d7118d064ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd508257-51')#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.227 2 INFO nova.virt.libvirt.driver [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Deleting instance files /var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2_del#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.232 2 INFO nova.virt.libvirt.driver [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Deletion of /var/lib/nova/instances/2eb08e64-4af9-4c5f-9817-b24d5e5ccce2_del complete#033[00m
Oct  2 08:22:41 np0005466012 podman[236756]: 2025-10-02 12:22:41.255083268 +0000 UTC m=+0.040195788 container remove 803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:22:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:41.259 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3c743d46-abf1-4649-806d-a47d61ea4d87]: (4, ('Thu Oct  2 12:22:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd (803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b)\n803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b\nThu Oct  2 12:22:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd (803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b)\n803ad189bf4ec0fc4d4d61234da648a35027a4bd72eb886bffb1cc8dd47b7a8b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:41.261 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5e3dc9c7-54ba-40e1-9575-ba1ad3739bae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:41.261 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap043fc82b-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:41 np0005466012 kernel: tap043fc82b-c0: left promiscuous mode
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:41.276 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[01c87120-bfa5-441e-9ece-be1c800b8832]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:41.311 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c55d1f02-3d10-4390-b2f4-b495089b8a8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:41.312 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[30f8d000-5a1a-48c3-b52e-d40e202a40d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:41.327 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[80a41b55-7ace-4a19-8d6f-41e80248cc21]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573107, 'reachable_time': 34684, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236771, 'error': None, 'target': 'ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:41.329 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-043fc82b-ca25-47f8-a78d-d7118d064ecd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:22:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:22:41.329 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[bec391c3-2398-4e3e-a975-f743779a724c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:22:41 np0005466012 systemd[1]: run-netns-ovnmeta\x2d043fc82b\x2dca25\x2d47f8\x2da78d\x2dd7118d064ecd.mount: Deactivated successfully.
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.343 2 INFO nova.compute.manager [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.344 2 DEBUG oslo.service.loopingcall [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.345 2 DEBUG nova.compute.manager [-] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.345 2 DEBUG nova.network.neutron [-] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.384 2 DEBUG nova.compute.manager [req-88a7c1a2-625a-4d42-b9be-d479a7b679f7 req-2dca3712-dca4-4891-9d1b-a62354ec6c92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received event network-vif-unplugged-fd508257-51ca-4c61-9340-029f9a9e7a75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.386 2 DEBUG oslo_concurrency.lockutils [req-88a7c1a2-625a-4d42-b9be-d479a7b679f7 req-2dca3712-dca4-4891-9d1b-a62354ec6c92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.387 2 DEBUG oslo_concurrency.lockutils [req-88a7c1a2-625a-4d42-b9be-d479a7b679f7 req-2dca3712-dca4-4891-9d1b-a62354ec6c92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.387 2 DEBUG oslo_concurrency.lockutils [req-88a7c1a2-625a-4d42-b9be-d479a7b679f7 req-2dca3712-dca4-4891-9d1b-a62354ec6c92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.388 2 DEBUG nova.compute.manager [req-88a7c1a2-625a-4d42-b9be-d479a7b679f7 req-2dca3712-dca4-4891-9d1b-a62354ec6c92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] No waiting events found dispatching network-vif-unplugged-fd508257-51ca-4c61-9340-029f9a9e7a75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.388 2 DEBUG nova.compute.manager [req-88a7c1a2-625a-4d42-b9be-d479a7b679f7 req-2dca3712-dca4-4891-9d1b-a62354ec6c92 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received event network-vif-unplugged-fd508257-51ca-4c61-9340-029f9a9e7a75 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.420 2 INFO nova.compute.manager [None req-5b887b13-4743-4324-88f8-6bc531ad3445 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Pausing#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.420 2 DEBUG nova.objects.instance [None req-5b887b13-4743-4324-88f8-6bc531ad3445 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'flavor' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.480 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407761.479643, ae56113d-001e-4f10-9236-c07fe5146d9c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.481 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.483 2 DEBUG nova.compute.manager [None req-5b887b13-4743-4324-88f8-6bc531ad3445 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.507 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.510 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.565 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Oct  2 08:22:41 np0005466012 nova_compute[192063]: 2025-10-02 12:22:41.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:42 np0005466012 nova_compute[192063]: 2025-10-02 12:22:42.224 2 DEBUG nova.network.neutron [req-1622a741-c258-45a6-a7e1-035f5ed1fd86 req-b60a2db2-6751-4b76-baaf-7311dc394ea9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Updated VIF entry in instance network info cache for port fd508257-51ca-4c61-9340-029f9a9e7a75. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:22:42 np0005466012 nova_compute[192063]: 2025-10-02 12:22:42.225 2 DEBUG nova.network.neutron [req-1622a741-c258-45a6-a7e1-035f5ed1fd86 req-b60a2db2-6751-4b76-baaf-7311dc394ea9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Updating instance_info_cache with network_info: [{"id": "fd508257-51ca-4c61-9340-029f9a9e7a75", "address": "fa:16:3e:5e:8b:77", "network": {"id": "043fc82b-ca25-47f8-a78d-d7118d064ecd", "bridge": "br-int", "label": "tempest-network-smoke--1375280567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd508257-51", "ovs_interfaceid": "fd508257-51ca-4c61-9340-029f9a9e7a75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:42 np0005466012 nova_compute[192063]: 2025-10-02 12:22:42.256 2 DEBUG oslo_concurrency.lockutils [req-1622a741-c258-45a6-a7e1-035f5ed1fd86 req-b60a2db2-6751-4b76-baaf-7311dc394ea9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-2eb08e64-4af9-4c5f-9817-b24d5e5ccce2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:22:42 np0005466012 nova_compute[192063]: 2025-10-02 12:22:42.577 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updating instance_info_cache with network_info: [{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:42 np0005466012 nova_compute[192063]: 2025-10-02 12:22:42.607 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:22:42 np0005466012 nova_compute[192063]: 2025-10-02 12:22:42.607 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:22:42 np0005466012 nova_compute[192063]: 2025-10-02 12:22:42.742 2 DEBUG nova.network.neutron [-] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:22:42 np0005466012 nova_compute[192063]: 2025-10-02 12:22:42.828 2 INFO nova.compute.manager [-] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Took 1.48 seconds to deallocate network for instance.#033[00m
Oct  2 08:22:42 np0005466012 nova_compute[192063]: 2025-10-02 12:22:42.842 2 DEBUG nova.compute.manager [req-dd8c6b42-89d4-46af-9856-69691ece6d98 req-cfab9459-600f-4d40-be15-66251a001c65 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received event network-vif-deleted-fd508257-51ca-4c61-9340-029f9a9e7a75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:42 np0005466012 nova_compute[192063]: 2025-10-02 12:22:42.916 2 DEBUG oslo_concurrency.lockutils [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:42 np0005466012 nova_compute[192063]: 2025-10-02 12:22:42.916 2 DEBUG oslo_concurrency.lockutils [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:42 np0005466012 nova_compute[192063]: 2025-10-02 12:22:42.982 2 DEBUG nova.compute.provider_tree [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:22:42 np0005466012 nova_compute[192063]: 2025-10-02 12:22:42.996 2 DEBUG nova.scheduler.client.report [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.016 2 DEBUG oslo_concurrency.lockutils [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.042 2 INFO nova.scheduler.client.report [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Deleted allocations for instance 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.135 2 DEBUG oslo_concurrency.lockutils [None req-b582851e-1d77-4842-a90d-dbb440ea47ca 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.504 2 DEBUG nova.compute.manager [req-5f9f245b-1744-4d34-97f0-65c52bdb87f3 req-657b21ec-32ae-4650-a394-20fa8fa19f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received event network-vif-plugged-fd508257-51ca-4c61-9340-029f9a9e7a75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.504 2 DEBUG oslo_concurrency.lockutils [req-5f9f245b-1744-4d34-97f0-65c52bdb87f3 req-657b21ec-32ae-4650-a394-20fa8fa19f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.505 2 DEBUG oslo_concurrency.lockutils [req-5f9f245b-1744-4d34-97f0-65c52bdb87f3 req-657b21ec-32ae-4650-a394-20fa8fa19f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.505 2 DEBUG oslo_concurrency.lockutils [req-5f9f245b-1744-4d34-97f0-65c52bdb87f3 req-657b21ec-32ae-4650-a394-20fa8fa19f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2eb08e64-4af9-4c5f-9817-b24d5e5ccce2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.505 2 DEBUG nova.compute.manager [req-5f9f245b-1744-4d34-97f0-65c52bdb87f3 req-657b21ec-32ae-4650-a394-20fa8fa19f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] No waiting events found dispatching network-vif-plugged-fd508257-51ca-4c61-9340-029f9a9e7a75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.505 2 WARNING nova.compute.manager [req-5f9f245b-1744-4d34-97f0-65c52bdb87f3 req-657b21ec-32ae-4650-a394-20fa8fa19f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Received unexpected event network-vif-plugged-fd508257-51ca-4c61-9340-029f9a9e7a75 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.797 2 INFO nova.compute.manager [None req-ab17a875-71a1-4b70-ae3a-293292549c09 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Unpausing#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.798 2 DEBUG nova.objects.instance [None req-ab17a875-71a1-4b70-ae3a-293292549c09 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'flavor' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.831 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407763.831001, ae56113d-001e-4f10-9236-c07fe5146d9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.831 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:22:43 np0005466012 virtqemud[191783]: argument unsupported: QEMU guest agent is not configured
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.835 2 DEBUG nova.virt.libvirt.guest [None req-ab17a875-71a1-4b70-ae3a-293292549c09 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.836 2 DEBUG nova.compute.manager [None req-ab17a875-71a1-4b70-ae3a-293292549c09 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.856 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.859 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:22:43 np0005466012 nova_compute[192063]: 2025-10-02 12:22:43.883 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Oct  2 08:22:44 np0005466012 nova_compute[192063]: 2025-10-02 12:22:44.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:46 np0005466012 nova_compute[192063]: 2025-10-02 12:22:46.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:46 np0005466012 nova_compute[192063]: 2025-10-02 12:22:46.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:47Z|00421|binding|INFO|Releasing lport 38f1ac16-18c6-4b4a-b769-ebc7dd5181d4 from this chassis (sb_readonly=0)
Oct  2 08:22:47 np0005466012 nova_compute[192063]: 2025-10-02 12:22:47.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:51 np0005466012 nova_compute[192063]: 2025-10-02 12:22:51.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:22:51Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0a:b9:ae 10.100.0.12
Oct  2 08:22:51 np0005466012 nova_compute[192063]: 2025-10-02 12:22:51.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:54 np0005466012 podman[236783]: 2025-10-02 12:22:54.158099886 +0000 UTC m=+0.073051814 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:22:54 np0005466012 podman[236784]: 2025-10-02 12:22:54.169412869 +0000 UTC m=+0.082037451 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 08:22:55 np0005466012 nova_compute[192063]: 2025-10-02 12:22:55.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:56 np0005466012 nova_compute[192063]: 2025-10-02 12:22:56.194 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407761.1933327, 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:22:56 np0005466012 nova_compute[192063]: 2025-10-02 12:22:56.194 2 INFO nova.compute.manager [-] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:22:56 np0005466012 nova_compute[192063]: 2025-10-02 12:22:56.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:56 np0005466012 nova_compute[192063]: 2025-10-02 12:22:56.358 2 DEBUG nova.compute.manager [None req-ee33958d-0d57-4806-a57c-02cfa0853695 - - - - - -] [instance: 2eb08e64-4af9-4c5f-9817-b24d5e5ccce2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:22:56 np0005466012 nova_compute[192063]: 2025-10-02 12:22:56.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:22:58 np0005466012 podman[236833]: 2025-10-02 12:22:58.137443303 +0000 UTC m=+0.054244957 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:23:00 np0005466012 podman[236852]: 2025-10-02 12:23:00.138412934 +0000 UTC m=+0.052202479 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:23:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:01.170 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:01.171 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:23:01 np0005466012 nova_compute[192063]: 2025-10-02 12:23:01.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:01 np0005466012 nova_compute[192063]: 2025-10-02 12:23:01.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:01 np0005466012 nova_compute[192063]: 2025-10-02 12:23:01.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:02.134 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:02.135 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:02.135 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:03.174 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:06 np0005466012 podman[236873]: 2025-10-02 12:23:06.17376609 +0000 UTC m=+0.080119366 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal)
Oct  2 08:23:06 np0005466012 podman[236872]: 2025-10-02 12:23:06.190913719 +0000 UTC m=+0.096340829 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  2 08:23:06 np0005466012 nova_compute[192063]: 2025-10-02 12:23:06.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:06 np0005466012 nova_compute[192063]: 2025-10-02 12:23:06.250 2 DEBUG oslo_concurrency.lockutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:06 np0005466012 nova_compute[192063]: 2025-10-02 12:23:06.251 2 DEBUG oslo_concurrency.lockutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:06 np0005466012 nova_compute[192063]: 2025-10-02 12:23:06.251 2 INFO nova.compute.manager [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Rebooting instance#033[00m
Oct  2 08:23:06 np0005466012 nova_compute[192063]: 2025-10-02 12:23:06.371 2 DEBUG oslo_concurrency.lockutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:23:06 np0005466012 nova_compute[192063]: 2025-10-02 12:23:06.371 2 DEBUG oslo_concurrency.lockutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquired lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:23:06 np0005466012 nova_compute[192063]: 2025-10-02 12:23:06.372 2 DEBUG nova.network.neutron [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:23:06 np0005466012 nova_compute[192063]: 2025-10-02 12:23:06.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:08 np0005466012 nova_compute[192063]: 2025-10-02 12:23:08.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:09 np0005466012 nova_compute[192063]: 2025-10-02 12:23:09.833 2 DEBUG nova.network.neutron [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updating instance_info_cache with network_info: [{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:23:09 np0005466012 nova_compute[192063]: 2025-10-02 12:23:09.891 2 DEBUG oslo_concurrency.lockutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Releasing lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.079 2 DEBUG nova.compute.manager [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:10 np0005466012 kernel: tapd1031883-21 (unregistering): left promiscuous mode
Oct  2 08:23:10 np0005466012 NetworkManager[51207]: <info>  [1759407790.5995] device (tapd1031883-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:23:10Z|00422|binding|INFO|Releasing lport d1031883-2135-4183-8a9d-0609c32ad14b from this chassis (sb_readonly=0)
Oct  2 08:23:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:23:10Z|00423|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b down in Southbound
Oct  2 08:23:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:23:10Z|00424|binding|INFO|Removing iface tapd1031883-21 ovn-installed in OVS
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:10.633 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:b9:ae 10.100.0.12'], port_security=['fa:16:3e:0a:b9:ae 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a04f937a-375f-4fb0-90fe-5f514a88668f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'c0383701-0ec7-4f3b-8585-5effc4f5ca5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.248', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c0aa38-5fd8-41c7-b4bf-85b59722c5c3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d1031883-2135-4183-8a9d-0609c32ad14b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:10.634 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d1031883-2135-4183-8a9d-0609c32ad14b in datapath a04f937a-375f-4fb0-90fe-5f514a88668f unbound from our chassis#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:10.636 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a04f937a-375f-4fb0-90fe-5f514a88668f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:23:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:10.637 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f34a32d8-f819-4bcf-8fa9-585a82ffc52f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:10.637 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f namespace which is not needed anymore#033[00m
Oct  2 08:23:10 np0005466012 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Oct  2 08:23:10 np0005466012 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006f.scope: Consumed 14.419s CPU time.
Oct  2 08:23:10 np0005466012 systemd-machined[152114]: Machine qemu-52-instance-0000006f terminated.
Oct  2 08:23:10 np0005466012 podman[236915]: 2025-10-02 12:23:10.698173239 +0000 UTC m=+0.056558713 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:23:10 np0005466012 podman[236914]: 2025-10-02 12:23:10.703003506 +0000 UTC m=+0.074166395 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:23:10 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236613]: [NOTICE]   (236630) : haproxy version is 2.8.14-c23fe91
Oct  2 08:23:10 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236613]: [NOTICE]   (236630) : path to executable is /usr/sbin/haproxy
Oct  2 08:23:10 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236613]: [WARNING]  (236630) : Exiting Master process...
Oct  2 08:23:10 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236613]: [ALERT]    (236630) : Current worker (236632) exited with code 143 (Terminated)
Oct  2 08:23:10 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[236613]: [WARNING]  (236630) : All workers exited. Exiting... (0)
Oct  2 08:23:10 np0005466012 systemd[1]: libpod-11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145.scope: Deactivated successfully.
Oct  2 08:23:10 np0005466012 podman[236975]: 2025-10-02 12:23:10.778116829 +0000 UTC m=+0.043664936 container died 11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:23:10 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145-userdata-shm.mount: Deactivated successfully.
Oct  2 08:23:10 np0005466012 systemd[1]: var-lib-containers-storage-overlay-a2dd58116323318ec123f316c706664610a30789b7c9596b2048b855ae155b34-merged.mount: Deactivated successfully.
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.856 2 INFO nova.virt.libvirt.driver [-] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance destroyed successfully.#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.857 2 DEBUG nova.objects.instance [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'resources' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.909 2 DEBUG nova.virt.libvirt.vif [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-161503604',display_name='tempest-ServerActionsTestJSON-server-161503604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-161503604',id=111,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJLom+UJzZg9dduKQv+725QaYDZoMXvP/xlpKnb/K05SGc4dkyLwCDweJ3QifTmxLWqK9Sz5A12yMJbzpa36v5C4bUqj8uiWk/vbR1BAjBdKM9d/Ug8M2nT8LwDBGP/9A==',key_name='tempest-keypair-1006285918',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e564a4cad5d443dba81ec04d2a05ced9',ramdisk_id='',reservation_id='r-ntvf7r4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1646745100',owner_user_name='tempest-ServerActionsTestJSON-1646745100-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:23:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d54b1826121b47caba89932a78c06ccd',uuid=ae56113d-001e-4f10-9236-c07fe5146d9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.910 2 DEBUG nova.network.os_vif_util [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converting VIF {"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.911 2 DEBUG nova.network.os_vif_util [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.911 2 DEBUG os_vif [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.913 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1031883-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:10 np0005466012 podman[236975]: 2025-10-02 12:23:10.920903461 +0000 UTC m=+0.186451568 container cleanup 11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:23:10 np0005466012 systemd[1]: libpod-conmon-11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145.scope: Deactivated successfully.
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.961 2 INFO os_vif [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21')#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.969 2 DEBUG nova.virt.libvirt.driver [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Start _get_guest_xml network_info=[{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.972 2 WARNING nova.virt.libvirt.driver [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.977 2 DEBUG nova.virt.libvirt.host [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.977 2 DEBUG nova.virt.libvirt.host [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.980 2 DEBUG nova.virt.libvirt.host [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.981 2 DEBUG nova.virt.libvirt.host [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.982 2 DEBUG nova.virt.libvirt.driver [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.982 2 DEBUG nova.virt.hardware [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.982 2 DEBUG nova.virt.hardware [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.983 2 DEBUG nova.virt.hardware [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.983 2 DEBUG nova.virt.hardware [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.983 2 DEBUG nova.virt.hardware [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.983 2 DEBUG nova.virt.hardware [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.984 2 DEBUG nova.virt.hardware [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.984 2 DEBUG nova.virt.hardware [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.984 2 DEBUG nova.virt.hardware [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.984 2 DEBUG nova.virt.hardware [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.985 2 DEBUG nova.virt.hardware [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:23:10 np0005466012 nova_compute[192063]: 2025-10-02 12:23:10.985 2 DEBUG nova.objects.instance [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'vcpu_model' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.034 2 DEBUG nova.virt.libvirt.vif [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-161503604',display_name='tempest-ServerActionsTestJSON-server-161503604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-161503604',id=111,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJLom+UJzZg9dduKQv+725QaYDZoMXvP/xlpKnb/K05SGc4dkyLwCDweJ3QifTmxLWqK9Sz5A12yMJbzpa36v5C4bUqj8uiWk/vbR1BAjBdKM9d/Ug8M2nT8LwDBGP/9A==',key_name='tempest-keypair-1006285918',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e564a4cad5d443dba81ec04d2a05ced9',ramdisk_id='',reservation_id='r-ntvf7r4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1646745100',owner_user_name='tempest-ServerActionsTestJSON-1646745100-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:23:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d54b1826121b47caba89932a78c06ccd',uuid=ae56113d-001e-4f10-9236-c07fe5146d9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.035 2 DEBUG nova.network.os_vif_util [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converting VIF {"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.035 2 DEBUG nova.network.os_vif_util [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.037 2 DEBUG nova.objects.instance [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'pci_devices' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:11 np0005466012 podman[237020]: 2025-10-02 12:23:11.054916432 +0000 UTC m=+0.067347271 container remove 11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.065 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c662d51f-7edb-49ce-bdaf-d75583ab0700]: (4, ('Thu Oct  2 12:23:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f (11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145)\n11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145\nThu Oct  2 12:23:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f (11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145)\n11d9cfa98aca41b25836a9f1ac29d6341d59134ee91918aabb536fce711a4145\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.067 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6f99333c-3696-4d4b-838d-b88915ca55e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.068 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa04f937a-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:11 np0005466012 kernel: tapa04f937a-30: left promiscuous mode
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.077 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ad157ce2-8cbe-4c02-9e0b-a9bf8ce9d2bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.085 2 DEBUG nova.virt.libvirt.driver [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  <uuid>ae56113d-001e-4f10-9236-c07fe5146d9c</uuid>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  <name>instance-0000006f</name>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerActionsTestJSON-server-161503604</nova:name>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:23:10</nova:creationTime>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:23:11 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:        <nova:user uuid="d54b1826121b47caba89932a78c06ccd">tempest-ServerActionsTestJSON-1646745100-project-member</nova:user>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:        <nova:project uuid="e564a4cad5d443dba81ec04d2a05ced9">tempest-ServerActionsTestJSON-1646745100</nova:project>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:        <nova:port uuid="d1031883-2135-4183-8a9d-0609c32ad14b">
Oct  2 08:23:11 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <entry name="serial">ae56113d-001e-4f10-9236-c07fe5146d9c</entry>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <entry name="uuid">ae56113d-001e-4f10-9236-c07fe5146d9c</entry>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.config"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:0a:b9:ae"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <target dev="tapd1031883-21"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/console.log" append="off"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <input type="keyboard" bus="usb"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:23:11 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:23:11 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:23:11 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:23:11 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.087 2 DEBUG oslo_concurrency.processutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.119 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd927f9-6b07-4973-a2eb-d79f43f88ad2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.121 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5f6071d8-b8d6-4165-b35f-e2ee4f50f7fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.143 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[be39077e-7d71-41b6-ae83-0b801d5c002b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574855, 'reachable_time': 28544, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237036, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.146 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.146 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[4303ccd7-32f8-4660-b5f1-0598549a90eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 systemd[1]: run-netns-ovnmeta\x2da04f937a\x2d375f\x2d4fb0\x2d90fe\x2d5f514a88668f.mount: Deactivated successfully.
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.151 2 DEBUG oslo_concurrency.processutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.152 2 DEBUG oslo_concurrency.processutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.205 2 DEBUG oslo_concurrency.processutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.207 2 DEBUG nova.objects.instance [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.231 2 DEBUG oslo_concurrency.processutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.306 2 DEBUG oslo_concurrency.processutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.307 2 DEBUG nova.virt.disk.api [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Checking if we can resize image /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.308 2 DEBUG oslo_concurrency.processutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.363 2 DEBUG oslo_concurrency.processutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.364 2 DEBUG nova.virt.disk.api [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Cannot resize image /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.365 2 DEBUG nova.objects.instance [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'migration_context' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.411 2 DEBUG nova.virt.libvirt.vif [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-161503604',display_name='tempest-ServerActionsTestJSON-server-161503604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-161503604',id=111,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJLom+UJzZg9dduKQv+725QaYDZoMXvP/xlpKnb/K05SGc4dkyLwCDweJ3QifTmxLWqK9Sz5A12yMJbzpa36v5C4bUqj8uiWk/vbR1BAjBdKM9d/Ug8M2nT8LwDBGP/9A==',key_name='tempest-keypair-1006285918',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='e564a4cad5d443dba81ec04d2a05ced9',ramdisk_id='',reservation_id='r-ntvf7r4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1646745100',owner_user_name='tempest-ServerActionsTestJSON-1646745100-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:23:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d54b1826121b47caba89932a78c06ccd',uuid=ae56113d-001e-4f10-9236-c07fe5146d9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.411 2 DEBUG nova.network.os_vif_util [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converting VIF {"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.412 2 DEBUG nova.network.os_vif_util [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.412 2 DEBUG os_vif [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.413 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.414 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.416 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1031883-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.416 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd1031883-21, col_values=(('external_ids', {'iface-id': 'd1031883-2135-4183-8a9d-0609c32ad14b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:b9:ae', 'vm-uuid': 'ae56113d-001e-4f10-9236-c07fe5146d9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:11 np0005466012 NetworkManager[51207]: <info>  [1759407791.4189] manager: (tapd1031883-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.427 2 INFO os_vif [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21')#033[00m
Oct  2 08:23:11 np0005466012 kernel: tapd1031883-21: entered promiscuous mode
Oct  2 08:23:11 np0005466012 NetworkManager[51207]: <info>  [1759407791.5081] manager: (tapd1031883-21): new Tun device (/org/freedesktop/NetworkManager/Devices/192)
Oct  2 08:23:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:23:11Z|00425|binding|INFO|Claiming lport d1031883-2135-4183-8a9d-0609c32ad14b for this chassis.
Oct  2 08:23:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:23:11Z|00426|binding|INFO|d1031883-2135-4183-8a9d-0609c32ad14b: Claiming fa:16:3e:0a:b9:ae 10.100.0.12
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:11 np0005466012 systemd-udevd[236940]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:23:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:23:11Z|00427|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b ovn-installed in OVS
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:23:11Z|00428|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b up in Southbound
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.532 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:b9:ae 10.100.0.12'], port_security=['fa:16:3e:0a:b9:ae 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a04f937a-375f-4fb0-90fe-5f514a88668f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'c0383701-0ec7-4f3b-8585-5effc4f5ca5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.248', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c0aa38-5fd8-41c7-b4bf-85b59722c5c3, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d1031883-2135-4183-8a9d-0609c32ad14b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.534 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d1031883-2135-4183-8a9d-0609c32ad14b in datapath a04f937a-375f-4fb0-90fe-5f514a88668f bound to our chassis#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.535 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a04f937a-375f-4fb0-90fe-5f514a88668f#033[00m
Oct  2 08:23:11 np0005466012 NetworkManager[51207]: <info>  [1759407791.5415] device (tapd1031883-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:23:11 np0005466012 NetworkManager[51207]: <info>  [1759407791.5429] device (tapd1031883-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.548 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b281c2a2-06d9-43f3-ab73-95c33e662266]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.549 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa04f937a-31 in ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.551 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa04f937a-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.551 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b0618fe8-1e7a-4e4f-900d-7cacea47bf01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.551 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[79e7657f-590e-480f-a535-0feac7517f6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 systemd-machined[152114]: New machine qemu-53-instance-0000006f.
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.565 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[dfdbfc8a-90bd-4d82-8e07-ca1deb147192]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 systemd[1]: Started Virtual Machine qemu-53-instance-0000006f.
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.592 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ef0543c0-9f00-48b7-b75f-e87bd8a2db75]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.621 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[33218d94-d1ba-4ed1-aaae-5498d7ebbbfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.628 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[53b737a0-1dfe-4ff7-b0b6-54c2980e2664]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 NetworkManager[51207]: <info>  [1759407791.6297] manager: (tapa04f937a-30): new Veth device (/org/freedesktop/NetworkManager/Devices/193)
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.665 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[1521891e-4bf0-45c2-9e4b-f4a5360bd59b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.669 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[4b91f1b6-c1f0-4653-b923-bac9b17cc0ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 NetworkManager[51207]: <info>  [1759407791.6932] device (tapa04f937a-30): carrier: link connected
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.697 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[54957289-3618-42ef-85c8-cbf1efbeaecc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.714 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4569eaf4-e7b2-41c5-9138-8338524a9b14]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa04f937a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:93:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578530, 'reachable_time': 18408, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237096, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.730 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[852b598d-24cd-433e-8900-92cccae39513]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:9368'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 578530, 'tstamp': 578530}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237097, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.747 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[70ecf269-618e-4916-8663-9cefcb1fd0d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa04f937a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:93:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578530, 'reachable_time': 18408, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237098, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.776 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cfd9f65a-f8de-4efc-996b-06d33c6688e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.828 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c3c7b609-a487-48f9-becc-0ef80174e4ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.829 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa04f937a-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.830 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.830 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa04f937a-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:11 np0005466012 kernel: tapa04f937a-30: entered promiscuous mode
Oct  2 08:23:11 np0005466012 NetworkManager[51207]: <info>  [1759407791.8327] manager: (tapa04f937a-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.835 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa04f937a-30, col_values=(('external_ids', {'iface-id': '38f1ac16-18c6-4b4a-b769-ebc7dd5181d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:23:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:23:11Z|00429|binding|INFO|Releasing lport 38f1ac16-18c6-4b4a-b769-ebc7dd5181d4 from this chassis (sb_readonly=0)
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.852 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a04f937a-375f-4fb0-90fe-5f514a88668f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a04f937a-375f-4fb0-90fe-5f514a88668f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.852 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[632c8d7e-8f81-480c-9a2f-acf9e2282cdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.853 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-a04f937a-375f-4fb0-90fe-5f514a88668f
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/a04f937a-375f-4fb0-90fe-5f514a88668f.pid.haproxy
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID a04f937a-375f-4fb0-90fe-5f514a88668f
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:23:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:23:11.854 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'env', 'PROCESS_TAG=haproxy-a04f937a-375f-4fb0-90fe-5f514a88668f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a04f937a-375f-4fb0-90fe-5f514a88668f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:23:11 np0005466012 nova_compute[192063]: 2025-10-02 12:23:11.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:12 np0005466012 podman[237138]: 2025-10-02 12:23:12.199957265 +0000 UTC m=+0.037768988 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:23:12 np0005466012 podman[237138]: 2025-10-02 12:23:12.357638952 +0000 UTC m=+0.195450675 container create 78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:23:12 np0005466012 systemd[1]: Started libpod-conmon-78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda.scope.
Oct  2 08:23:12 np0005466012 nova_compute[192063]: 2025-10-02 12:23:12.432 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for ae56113d-001e-4f10-9236-c07fe5146d9c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:23:12 np0005466012 nova_compute[192063]: 2025-10-02 12:23:12.433 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407792.4320023, ae56113d-001e-4f10-9236-c07fe5146d9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:12 np0005466012 nova_compute[192063]: 2025-10-02 12:23:12.433 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:23:12 np0005466012 nova_compute[192063]: 2025-10-02 12:23:12.440 2 DEBUG nova.compute.manager [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:23:12 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:23:12 np0005466012 nova_compute[192063]: 2025-10-02 12:23:12.443 2 INFO nova.virt.libvirt.driver [-] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance rebooted successfully.#033[00m
Oct  2 08:23:12 np0005466012 nova_compute[192063]: 2025-10-02 12:23:12.443 2 DEBUG nova.compute.manager [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:12 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00af8bac3af959994a3c6efe95e21cd88f1b9d5d2b5200cad26210961658384b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:23:12 np0005466012 podman[237138]: 2025-10-02 12:23:12.468619507 +0000 UTC m=+0.306431220 container init 78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:23:12 np0005466012 podman[237138]: 2025-10-02 12:23:12.475604985 +0000 UTC m=+0.313416678 container start 78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0)
Oct  2 08:23:12 np0005466012 nova_compute[192063]: 2025-10-02 12:23:12.477 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:12 np0005466012 nova_compute[192063]: 2025-10-02 12:23:12.481 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:23:12 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[237153]: [NOTICE]   (237157) : New worker (237159) forked
Oct  2 08:23:12 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[237153]: [NOTICE]   (237157) : Loading success.
Oct  2 08:23:12 np0005466012 nova_compute[192063]: 2025-10-02 12:23:12.535 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407792.4394088, ae56113d-001e-4f10-9236-c07fe5146d9c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:23:12 np0005466012 nova_compute[192063]: 2025-10-02 12:23:12.535 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] VM Started (Lifecycle Event)#033[00m
Oct  2 08:23:12 np0005466012 nova_compute[192063]: 2025-10-02 12:23:12.599 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:23:12 np0005466012 nova_compute[192063]: 2025-10-02 12:23:12.602 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:23:12 np0005466012 nova_compute[192063]: 2025-10-02 12:23:12.694 2 DEBUG oslo_concurrency.lockutils [None req-24ab8c65-2ec6-4d8c-8a75-158ad26df9ff d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 6.443s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:14 np0005466012 nova_compute[192063]: 2025-10-02 12:23:14.117 2 DEBUG nova.compute.manager [req-006dc549-78e0-4281-9336-90587c3d6c27 req-74d76292-cc8f-4cf4-877d-2fea96c51c93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:14 np0005466012 nova_compute[192063]: 2025-10-02 12:23:14.118 2 DEBUG oslo_concurrency.lockutils [req-006dc549-78e0-4281-9336-90587c3d6c27 req-74d76292-cc8f-4cf4-877d-2fea96c51c93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:14 np0005466012 nova_compute[192063]: 2025-10-02 12:23:14.118 2 DEBUG oslo_concurrency.lockutils [req-006dc549-78e0-4281-9336-90587c3d6c27 req-74d76292-cc8f-4cf4-877d-2fea96c51c93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:14 np0005466012 nova_compute[192063]: 2025-10-02 12:23:14.119 2 DEBUG oslo_concurrency.lockutils [req-006dc549-78e0-4281-9336-90587c3d6c27 req-74d76292-cc8f-4cf4-877d-2fea96c51c93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:14 np0005466012 nova_compute[192063]: 2025-10-02 12:23:14.119 2 DEBUG nova.compute.manager [req-006dc549-78e0-4281-9336-90587c3d6c27 req-74d76292-cc8f-4cf4-877d-2fea96c51c93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:14 np0005466012 nova_compute[192063]: 2025-10-02 12:23:14.120 2 WARNING nova.compute.manager [req-006dc549-78e0-4281-9336-90587c3d6c27 req-74d76292-cc8f-4cf4-877d-2fea96c51c93 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.244 2 DEBUG nova.compute.manager [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.245 2 DEBUG oslo_concurrency.lockutils [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.245 2 DEBUG oslo_concurrency.lockutils [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.246 2 DEBUG oslo_concurrency.lockutils [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.246 2 DEBUG nova.compute.manager [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.247 2 WARNING nova.compute.manager [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.247 2 DEBUG nova.compute.manager [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.248 2 DEBUG oslo_concurrency.lockutils [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.248 2 DEBUG oslo_concurrency.lockutils [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.249 2 DEBUG oslo_concurrency.lockutils [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.249 2 DEBUG nova.compute.manager [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.250 2 WARNING nova.compute.manager [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.250 2 DEBUG nova.compute.manager [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.250 2 DEBUG oslo_concurrency.lockutils [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.251 2 DEBUG oslo_concurrency.lockutils [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.251 2 DEBUG oslo_concurrency.lockutils [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.252 2 DEBUG nova.compute.manager [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.252 2 WARNING nova.compute.manager [req-572e37bc-64e5-42b1-8cf6-64be0f7e25a7 req-ecda4515-2175-4174-9153-f02d88a288f0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:16 np0005466012 nova_compute[192063]: 2025-10-02 12:23:16.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.926 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'name': 'tempest-ServerActionsTestJSON-server-161503604', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006f', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'hostId': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.927 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.928 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.928 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-161503604>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-161503604>]
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.929 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.929 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-161503604>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-161503604>]
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.933 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ae56113d-001e-4f10-9236-c07fe5146d9c / tapd1031883-21 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.934 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ba26492-0ab6-41c0-a378-7a1ccb4e192a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'instance-0000006f-ae56113d-001e-4f10-9236-c07fe5146d9c-tapd1031883-21', 'timestamp': '2025-10-02T12:23:16.930195', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'tapd1031883-21', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:b9:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1031883-21'}, 'message_id': '9304bf1e-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.606589897, 'message_signature': 'b9412d57adb8804929daaac0aa616ed657bfde2216bc4562040386d458511522'}]}, 'timestamp': '2025-10-02 12:23:16.935375', '_unique_id': 'd8a286a260b4466eb588ba33093a59d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.936 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.939 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.939 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3b1f49c-4021-49d1-808f-54e3f4ba7194', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'instance-0000006f-ae56113d-001e-4f10-9236-c07fe5146d9c-tapd1031883-21', 'timestamp': '2025-10-02T12:23:16.939363', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'tapd1031883-21', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:b9:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1031883-21'}, 'message_id': '93057814-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.606589897, 'message_signature': '2460604a1688419e390305b4511ac564060ec1807f89720c5be3ba1a217090f4'}]}, 'timestamp': '2025-10-02 12:23:16.940029', '_unique_id': '00159b5b9535403097a9b55ac59ba9ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.941 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.943 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.943 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f10d7e35-d43f-4fe9-ac49-3333a0b61e2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'instance-0000006f-ae56113d-001e-4f10-9236-c07fe5146d9c-tapd1031883-21', 'timestamp': '2025-10-02T12:23:16.943360', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'tapd1031883-21', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:b9:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1031883-21'}, 'message_id': '93061710-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.606589897, 'message_signature': 'e00cbd5d5215c731f4f601208ecca875f9eb4858614bdba2b882b247fae46d71'}]}, 'timestamp': '2025-10-02 12:23:16.944106', '_unique_id': '7aa1293f1b8342a686748d21f2ef5743'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.945 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.947 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.947 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13666ab4-a95b-4e25-968f-b45bc58e0f78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'instance-0000006f-ae56113d-001e-4f10-9236-c07fe5146d9c-tapd1031883-21', 'timestamp': '2025-10-02T12:23:16.947540', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'tapd1031883-21', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:b9:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1031883-21'}, 'message_id': '9306b7a6-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.606589897, 'message_signature': '4759efa7b5f38f700171864759f95e4774c2c51c60b06395f8bcfabb90754b2a'}]}, 'timestamp': '2025-10-02 12:23:16.948205', '_unique_id': '5f40fc71685e407e9acf0e508c8beddb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.949 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.951 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.971 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.972 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c4f33db-de93-42b8-9c81-dc64ce4f7066', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-vda', 'timestamp': '2025-10-02T12:23:16.951618', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '930a77c4-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.62807041, 'message_signature': 'caffbdd6a7e5a70da703438d78bc58c7cab2f7fd61c9cfe311387a5bdf8822d7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-sda', 'timestamp': '2025-10-02T12:23:16.951618', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '930a9560-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.62807041, 'message_signature': 'affa94bcf543a7e8d2f6b3cccc4bcbe767ea9a07a030c3899ede36c7639ea04f'}]}, 'timestamp': '2025-10-02 12:23:16.973544', '_unique_id': 'fb0ccba2ac904867a18308231a9e69ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.975 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.977 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.977 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53116b44-4827-4125-9e84-6fa3d366104c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'instance-0000006f-ae56113d-001e-4f10-9236-c07fe5146d9c-tapd1031883-21', 'timestamp': '2025-10-02T12:23:16.977670', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'tapd1031883-21', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:b9:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1031883-21'}, 'message_id': '930b550e-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.606589897, 'message_signature': '877a9eaa5d3eea41651591985137390c4e42f26475aecb075ba835db1af11e8e'}]}, 'timestamp': '2025-10-02 12:23:16.978587', '_unique_id': '01d98c1ff16d4629b142287cff4c3896'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.980 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.982 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.983 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.983 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26548d1a-e1d3-447a-ba69-fb9c9a5d4a96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-vda', 'timestamp': '2025-10-02T12:23:16.982972', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '930c2146-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.62807041, 'message_signature': '14cc6cc52ccd8a6a95ff0afa8831ef35355ab1a8b6d3235c710af73d897b7859'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-sda', 'timestamp': '2025-10-02T12:23:16.982972', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '930c3f96-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.62807041, 'message_signature': '9da7dba9077bb532c52115db8a288968aec935f40028984573bad2a02e397d55'}]}, 'timestamp': '2025-10-02 12:23:16.984519', '_unique_id': '6514ddcb75d6481da4403dad51d5633f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.985 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.988 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.989 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae7715a2-9bfa-4821-ac4f-43cb3cfbaaef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'instance-0000006f-ae56113d-001e-4f10-9236-c07fe5146d9c-tapd1031883-21', 'timestamp': '2025-10-02T12:23:16.988937', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'tapd1031883-21', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:b9:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1031883-21'}, 'message_id': '930d0ac0-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.606589897, 'message_signature': '565851d572a395be8d9bbb12193077b69165dcf3131c70cca29b71d2e8ee33dd'}]}, 'timestamp': '2025-10-02 12:23:16.989789', '_unique_id': 'fddcd35438384227965244b7334e39be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.993 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.994 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.994 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-161503604>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-161503604>]
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:16.995 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.020 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.read.latency volume: 552684110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.020 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.read.latency volume: 609827 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20888d26-450c-4434-8b01-ed220e90ce57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 552684110, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-vda', 'timestamp': '2025-10-02T12:23:16.995368', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9311cfec-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.671811597, 'message_signature': '689ee41e194bc5ae6cf28848f38b59cb65a7a49c5dd41952d3694827d2aef6ec'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 609827, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': 
None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-sda', 'timestamp': '2025-10-02T12:23:16.995368', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9311df64-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.671811597, 'message_signature': '6d6d2913b72d6d6f5ed8cfd52cce529145bb41dcc6ac0cc2c35766cfd561219a'}]}, 'timestamp': '2025-10-02 12:23:17.021211', '_unique_id': 'a061c955cfbf400eb8b9d1c2632e1e08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.023 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.023 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.023 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18c7854e-7756-4932-b44c-9385e590200f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-vda', 'timestamp': '2025-10-02T12:23:17.023445', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93124486-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.671811597, 'message_signature': '7f6c1fe7c880201c4679d0c86e84f36ef896b97ed0617e65382850cea93eb54d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 
'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-sda', 'timestamp': '2025-10-02T12:23:17.023445', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93125278-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.671811597, 'message_signature': 'beb0d328fe80c4c15aa7cca54de6112060d7d0ea7d2b53d9d658f508b02233fe'}]}, 'timestamp': '2025-10-02 12:23:17.024210', '_unique_id': '9fdce2c0232b407295a7f7d8b873d051'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.024 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.026 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.026 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5987f806-a354-4085-84a7-b5b312014da2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'instance-0000006f-ae56113d-001e-4f10-9236-c07fe5146d9c-tapd1031883-21', 'timestamp': '2025-10-02T12:23:17.026230', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'tapd1031883-21', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:b9:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1031883-21'}, 'message_id': '9312b1a0-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.606589897, 'message_signature': '782adb45505b5400f1a185121a817d484382687e2f36df178d8a0da1a8e9616f'}]}, 'timestamp': '2025-10-02 12:23:17.026598', '_unique_id': '7723cf1454544187a44ea21e92ed6b4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.027 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.028 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.028 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17575b35-b3ad-4686-8089-b2042149fce2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'instance-0000006f-ae56113d-001e-4f10-9236-c07fe5146d9c-tapd1031883-21', 'timestamp': '2025-10-02T12:23:17.028663', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'tapd1031883-21', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:b9:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1031883-21'}, 'message_id': '9313117c-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.606589897, 'message_signature': 'abe61b33a6bae027db1f702cc5fdae90403f7895489063795b36934d15f4a8f5'}]}, 'timestamp': '2025-10-02 12:23:17.029052', '_unique_id': '7ff49288d0c04ce09ca9d02ab5586b1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.029 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.030 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.031 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.031 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f1028c5-788b-41fd-8b31-ccdc852acac0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-vda', 'timestamp': '2025-10-02T12:23:17.031013', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93136c62-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.671811597, 'message_signature': '64cb7ac195744d5426c55ec6433f0e01a8b1c124556ca826c1c42fbdd7ed9fc4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-sda', 'timestamp': '2025-10-02T12:23:17.031013', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93137946-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.671811597, 'message_signature': '892555638d8afc6bf75ef8292c764c2b27fe543fea9381ed44aea31c4b44bf52'}]}, 'timestamp': '2025-10-02 12:23:17.031714', '_unique_id': '720af440ba5240cd935ed46b26ffdbf8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.032 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.033 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.033 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.034 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18fa4d65-6a7d-415b-9315-3798095e57b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-vda', 'timestamp': '2025-10-02T12:23:17.033726', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9313d63e-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.671811597, 'message_signature': '706d08637c2b6a1c916c86ccdba3c3d10a615d85c81ad573caab92037de1f83c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-sda', 'timestamp': '2025-10-02T12:23:17.033726', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9313e3d6-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.671811597, 'message_signature': '8f7b8d86e909526a7488e301fdb92e3c4cff9ead6f702ea60984435191c02538'}]}, 'timestamp': '2025-10-02 12:23:17.034418', '_unique_id': '70a8a3ec64374079bfce041889749b6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.035 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.036 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.036 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b60cfd54-c118-4fe1-9566-350bde4ee46b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'instance-0000006f-ae56113d-001e-4f10-9236-c07fe5146d9c-tapd1031883-21', 'timestamp': '2025-10-02T12:23:17.036365', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'tapd1031883-21', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:b9:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1031883-21'}, 'message_id': '93143d68-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.606589897, 'message_signature': 'defe19db62965a15e9323c8fafd4a3d18a40e49dcbbcb05ffe4a972e44467505'}]}, 'timestamp': '2025-10-02 12:23:17.036750', '_unique_id': '4c6a9e27f54940b291e4f5e12e26ad2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.038 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.038 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.039 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1bb4fc9d-c9c8-45d6-a86d-3cd588dd7d13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-vda', 'timestamp': '2025-10-02T12:23:17.038740', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93149a38-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.671811597, 'message_signature': 'eec152d257c53702f721b2f5aed408698c090fc311811226ac1085288a806565'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': 
None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-sda', 'timestamp': '2025-10-02T12:23:17.038740', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9314a6ea-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.671811597, 'message_signature': 'b03c03feaf93a3ee01c9fca46d68df572dddb06343c3308700dd96596cf319e7'}]}, 'timestamp': '2025-10-02 12:23:17.039413', '_unique_id': '0b167089041a42c4aee12cca1a2ac24e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.040 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.041 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.041 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.041 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aec3fd20-9148-4fc4-86e8-3027effd0ede', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-vda', 'timestamp': '2025-10-02T12:23:17.041448', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9315043c-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.671811597, 'message_signature': 'f70b31c23fed8a063cf93959dc9fe8c378aa46f81e7eebc8639f8849e50018d2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 
'ae56113d-001e-4f10-9236-c07fe5146d9c-sda', 'timestamp': '2025-10-02T12:23:17.041448', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93151350-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.671811597, 'message_signature': 'd4e5effa6231269ec65ce3545cff94689e264f6c5f4cc7baafdd03da3a33c6ce'}]}, 'timestamp': '2025-10-02 12:23:17.042192', '_unique_id': '8bf1985929a24977b674399080e3a0bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.042 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.044 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.044 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.044 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-161503604>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-161503604>]
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.044 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.044 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1449101-98e4-4b1e-963e-f48e1626bccb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'instance-0000006f-ae56113d-001e-4f10-9236-c07fe5146d9c-tapd1031883-21', 'timestamp': '2025-10-02T12:23:17.044935', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'tapd1031883-21', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:b9:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1031883-21'}, 'message_id': '93158c04-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.606589897, 'message_signature': '389f8580bc0cc5a3f0632892def866bd33f963555f9d4b07e7b8b95c0b6a3ac5'}]}, 'timestamp': '2025-10-02 12:23:17.045297', '_unique_id': '5a438f7309654351be7d11ea1249fd70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.045 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.047 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.064 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/cpu volume: 4420000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '016c15cc-35fb-4325-879f-77200b7faddf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4420000000, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'timestamp': '2025-10-02T12:23:17.047293', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '93187eb4-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.740383582, 'message_signature': 'bf2c81ebed785090524dd4bb998cad91b4c0cf7dfd36d1e6d9123ecc0ed5629e'}]}, 'timestamp': '2025-10-02 12:23:17.064725', '_unique_id': '8592c6958981467499b188bf8ce17e56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.065 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.067 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.067 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.067 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance ae56113d-001e-4f10-9236-c07fe5146d9c: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.067 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.067 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.usage volume: 30277632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.068 12 DEBUG ceilometer.compute.pollsters [-] ae56113d-001e-4f10-9236-c07fe5146d9c/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf7d7625-d97a-4c55-acb5-6b857365aeea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30277632, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-vda', 'timestamp': '2025-10-02T12:23:17.067949', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93191176-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.62807041, 'message_signature': '104b7e886b9388356d0ae4d8d6fa52c35d8361579b10c3a3135569dabdb61cf9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'd54b1826121b47caba89932a78c06ccd', 'user_name': None, 'project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'project_name': None, 'resource_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c-sda', 'timestamp': '2025-10-02T12:23:17.067949', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-161503604', 'name': 'instance-0000006f', 'instance_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'instance_type': 'm1.nano', 'host': '04a56181993eee2f25ea12fbbb1859e63388bcc3c56020c1dfd83ca3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9319229c-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5790.62807041, 'message_signature': 'a3b3afaed9fc3b6487f11ea107585358474342e5d7d64fd890c3b95112739bf9'}]}, 'timestamp': '2025-10-02 12:23:17.068875', '_unique_id': '36fc8cc68dce4543911b8230ecf5a918'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:23:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:23:17.069 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:23:18 np0005466012 ovn_controller[94284]: 2025-10-02T12:23:18Z|00430|binding|INFO|Releasing lport 38f1ac16-18c6-4b4a-b769-ebc7dd5181d4 from this chassis (sb_readonly=0)
Oct  2 08:23:18 np0005466012 nova_compute[192063]: 2025-10-02 12:23:18.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:21 np0005466012 nova_compute[192063]: 2025-10-02 12:23:21.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:21 np0005466012 nova_compute[192063]: 2025-10-02 12:23:21.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:25 np0005466012 podman[237174]: 2025-10-02 12:23:25.181968287 +0000 UTC m=+0.089754720 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:23:25 np0005466012 podman[237175]: 2025-10-02 12:23:25.184453738 +0000 UTC m=+0.092539510 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:23:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:23:25Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0a:b9:ae 10.100.0.12
Oct  2 08:23:25 np0005466012 nova_compute[192063]: 2025-10-02 12:23:25.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:26 np0005466012 nova_compute[192063]: 2025-10-02 12:23:26.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:26 np0005466012 nova_compute[192063]: 2025-10-02 12:23:26.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:27 np0005466012 nova_compute[192063]: 2025-10-02 12:23:27.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:27 np0005466012 nova_compute[192063]: 2025-10-02 12:23:27.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:27 np0005466012 nova_compute[192063]: 2025-10-02 12:23:27.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:23:28Z|00431|binding|INFO|Releasing lport 38f1ac16-18c6-4b4a-b769-ebc7dd5181d4 from this chassis (sb_readonly=0)
Oct  2 08:23:28 np0005466012 nova_compute[192063]: 2025-10-02 12:23:28.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:29 np0005466012 podman[237223]: 2025-10-02 12:23:29.148783018 +0000 UTC m=+0.059209719 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:23:29 np0005466012 nova_compute[192063]: 2025-10-02 12:23:29.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:29 np0005466012 nova_compute[192063]: 2025-10-02 12:23:29.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:29 np0005466012 nova_compute[192063]: 2025-10-02 12:23:29.821 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:23:31 np0005466012 podman[237242]: 2025-10-02 12:23:31.162207114 +0000 UTC m=+0.080556039 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:23:31 np0005466012 nova_compute[192063]: 2025-10-02 12:23:31.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:31 np0005466012 nova_compute[192063]: 2025-10-02 12:23:31.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:31 np0005466012 nova_compute[192063]: 2025-10-02 12:23:31.849 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:31 np0005466012 nova_compute[192063]: 2025-10-02 12:23:31.850 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:31 np0005466012 nova_compute[192063]: 2025-10-02 12:23:31.850 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:31 np0005466012 nova_compute[192063]: 2025-10-02 12:23:31.851 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:23:31 np0005466012 nova_compute[192063]: 2025-10-02 12:23:31.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:31 np0005466012 nova_compute[192063]: 2025-10-02 12:23:31.976 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.035 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.036 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.101 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.245 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.246 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5541MB free_disk=73.35768508911133GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.246 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.247 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.437 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance ae56113d-001e-4f10-9236-c07fe5146d9c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.437 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.437 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.522 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.541 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.575 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:23:32 np0005466012 nova_compute[192063]: 2025-10-02 12:23:32.575 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:23:34 np0005466012 nova_compute[192063]: 2025-10-02 12:23:34.576 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:36 np0005466012 nova_compute[192063]: 2025-10-02 12:23:36.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:36 np0005466012 nova_compute[192063]: 2025-10-02 12:23:36.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:37 np0005466012 podman[237270]: 2025-10-02 12:23:37.146537795 +0000 UTC m=+0.062078541 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:23:37 np0005466012 podman[237271]: 2025-10-02 12:23:37.146553376 +0000 UTC m=+0.058947772 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, architecture=x86_64, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350)
Oct  2 08:23:39 np0005466012 nova_compute[192063]: 2025-10-02 12:23:39.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:23:39 np0005466012 nova_compute[192063]: 2025-10-02 12:23:39.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:23:39 np0005466012 nova_compute[192063]: 2025-10-02 12:23:39.848 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:23:41 np0005466012 podman[237311]: 2025-10-02 12:23:41.168936421 +0000 UTC m=+0.082911476 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:23:41 np0005466012 podman[237310]: 2025-10-02 12:23:41.169589779 +0000 UTC m=+0.078084748 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:23:41 np0005466012 nova_compute[192063]: 2025-10-02 12:23:41.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:41 np0005466012 nova_compute[192063]: 2025-10-02 12:23:41.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:46 np0005466012 nova_compute[192063]: 2025-10-02 12:23:46.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:46 np0005466012 nova_compute[192063]: 2025-10-02 12:23:46.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:49 np0005466012 nova_compute[192063]: 2025-10-02 12:23:49.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:51 np0005466012 nova_compute[192063]: 2025-10-02 12:23:51.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:51 np0005466012 nova_compute[192063]: 2025-10-02 12:23:51.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:54 np0005466012 nova_compute[192063]: 2025-10-02 12:23:54.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:56 np0005466012 podman[237352]: 2025-10-02 12:23:56.135438452 +0000 UTC m=+0.056659236 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:23:56 np0005466012 podman[237353]: 2025-10-02 12:23:56.178666925 +0000 UTC m=+0.096903504 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:23:56 np0005466012 nova_compute[192063]: 2025-10-02 12:23:56.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:23:56 np0005466012 nova_compute[192063]: 2025-10-02 12:23:56.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:00 np0005466012 podman[237402]: 2025-10-02 12:24:00.135907561 +0000 UTC m=+0.055118202 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:24:01 np0005466012 nova_compute[192063]: 2025-10-02 12:24:01.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:01 np0005466012 nova_compute[192063]: 2025-10-02 12:24:01.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:02 np0005466012 nova_compute[192063]: 2025-10-02 12:24:02.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:02.135 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:02.135 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:02.136 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:02 np0005466012 podman[237422]: 2025-10-02 12:24:02.150237602 +0000 UTC m=+0.065886529 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:24:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:02.647 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:02 np0005466012 nova_compute[192063]: 2025-10-02 12:24:02.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:02.649 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:24:02 np0005466012 nova_compute[192063]: 2025-10-02 12:24:02.701 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "ae6bf863-8cca-48ab-a98f-065f8382fa99" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:02 np0005466012 nova_compute[192063]: 2025-10-02 12:24:02.701 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:02 np0005466012 nova_compute[192063]: 2025-10-02 12:24:02.702 2 INFO nova.compute.manager [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Unshelving#033[00m
Oct  2 08:24:02 np0005466012 nova_compute[192063]: 2025-10-02 12:24:02.839 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:02 np0005466012 nova_compute[192063]: 2025-10-02 12:24:02.840 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:02 np0005466012 nova_compute[192063]: 2025-10-02 12:24:02.843 2 DEBUG nova.objects.instance [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'pci_requests' on Instance uuid ae6bf863-8cca-48ab-a98f-065f8382fa99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:02 np0005466012 nova_compute[192063]: 2025-10-02 12:24:02.855 2 DEBUG nova.objects.instance [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'numa_topology' on Instance uuid ae6bf863-8cca-48ab-a98f-065f8382fa99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:02 np0005466012 nova_compute[192063]: 2025-10-02 12:24:02.866 2 DEBUG nova.virt.hardware [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:24:02 np0005466012 nova_compute[192063]: 2025-10-02 12:24:02.867 2 INFO nova.compute.claims [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:24:02 np0005466012 nova_compute[192063]: 2025-10-02 12:24:02.984 2 DEBUG nova.compute.provider_tree [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:24:02 np0005466012 nova_compute[192063]: 2025-10-02 12:24:02.998 2 DEBUG nova.scheduler.client.report [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:24:03 np0005466012 nova_compute[192063]: 2025-10-02 12:24:03.023 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:03 np0005466012 nova_compute[192063]: 2025-10-02 12:24:03.392 2 INFO nova.network.neutron [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Updating port f1306fa9-9429-43db-a3f4-48a2399611d7 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Oct  2 08:24:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:04.653 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:05 np0005466012 nova_compute[192063]: 2025-10-02 12:24:05.522 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "refresh_cache-ae6bf863-8cca-48ab-a98f-065f8382fa99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:05 np0005466012 nova_compute[192063]: 2025-10-02 12:24:05.522 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquired lock "refresh_cache-ae6bf863-8cca-48ab-a98f-065f8382fa99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:05 np0005466012 nova_compute[192063]: 2025-10-02 12:24:05.523 2 DEBUG nova.network.neutron [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:24:06 np0005466012 nova_compute[192063]: 2025-10-02 12:24:06.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:06 np0005466012 nova_compute[192063]: 2025-10-02 12:24:06.748 2 DEBUG nova.compute.manager [req-6f2fc0a3-994f-43b9-978e-159571420790 req-3c78d29b-8049-4ac3-b6d9-66cdbc20c54f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Received event network-changed-f1306fa9-9429-43db-a3f4-48a2399611d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:06 np0005466012 nova_compute[192063]: 2025-10-02 12:24:06.748 2 DEBUG nova.compute.manager [req-6f2fc0a3-994f-43b9-978e-159571420790 req-3c78d29b-8049-4ac3-b6d9-66cdbc20c54f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Refreshing instance network info cache due to event network-changed-f1306fa9-9429-43db-a3f4-48a2399611d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:24:06 np0005466012 nova_compute[192063]: 2025-10-02 12:24:06.749 2 DEBUG oslo_concurrency.lockutils [req-6f2fc0a3-994f-43b9-978e-159571420790 req-3c78d29b-8049-4ac3-b6d9-66cdbc20c54f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-ae6bf863-8cca-48ab-a98f-065f8382fa99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:06 np0005466012 nova_compute[192063]: 2025-10-02 12:24:06.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:08 np0005466012 podman[237442]: 2025-10-02 12:24:08.139024572 +0000 UTC m=+0.055827763 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Oct  2 08:24:08 np0005466012 podman[237443]: 2025-10-02 12:24:08.170903371 +0000 UTC m=+0.079234361 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal)
Oct  2 08:24:08 np0005466012 nova_compute[192063]: 2025-10-02 12:24:08.729 2 DEBUG nova.network.neutron [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Updating instance_info_cache with network_info: [{"id": "f1306fa9-9429-43db-a3f4-48a2399611d7", "address": "fa:16:3e:41:ec:88", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1306fa9-94", "ovs_interfaceid": "f1306fa9-9429-43db-a3f4-48a2399611d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:08 np0005466012 nova_compute[192063]: 2025-10-02 12:24:08.753 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Releasing lock "refresh_cache-ae6bf863-8cca-48ab-a98f-065f8382fa99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:08 np0005466012 nova_compute[192063]: 2025-10-02 12:24:08.755 2 DEBUG nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:24:08 np0005466012 nova_compute[192063]: 2025-10-02 12:24:08.755 2 INFO nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Creating image(s)#033[00m
Oct  2 08:24:08 np0005466012 nova_compute[192063]: 2025-10-02 12:24:08.756 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "/var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:08 np0005466012 nova_compute[192063]: 2025-10-02 12:24:08.756 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "/var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:08 np0005466012 nova_compute[192063]: 2025-10-02 12:24:08.756 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "/var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:08 np0005466012 nova_compute[192063]: 2025-10-02 12:24:08.757 2 DEBUG nova.objects.instance [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'trusted_certs' on Instance uuid ae6bf863-8cca-48ab-a98f-065f8382fa99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:08 np0005466012 nova_compute[192063]: 2025-10-02 12:24:08.757 2 DEBUG oslo_concurrency.lockutils [req-6f2fc0a3-994f-43b9-978e-159571420790 req-3c78d29b-8049-4ac3-b6d9-66cdbc20c54f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-ae6bf863-8cca-48ab-a98f-065f8382fa99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:08 np0005466012 nova_compute[192063]: 2025-10-02 12:24:08.758 2 DEBUG nova.network.neutron [req-6f2fc0a3-994f-43b9-978e-159571420790 req-3c78d29b-8049-4ac3-b6d9-66cdbc20c54f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Refreshing network info cache for port f1306fa9-9429-43db-a3f4-48a2399611d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:24:08 np0005466012 nova_compute[192063]: 2025-10-02 12:24:08.778 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:08 np0005466012 nova_compute[192063]: 2025-10-02 12:24:08.778 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:10 np0005466012 nova_compute[192063]: 2025-10-02 12:24:10.489 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:10 np0005466012 nova_compute[192063]: 2025-10-02 12:24:10.551 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d.part --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:10 np0005466012 nova_compute[192063]: 2025-10-02 12:24:10.552 2 DEBUG nova.virt.images [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] a84a17ed-71a4-4591-9f9d-d22a239469b6 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 08:24:10 np0005466012 nova_compute[192063]: 2025-10-02 12:24:10.554 2 DEBUG nova.privsep.utils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:24:10 np0005466012 nova_compute[192063]: 2025-10-02 12:24:10.555 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d.part /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:10 np0005466012 nova_compute[192063]: 2025-10-02 12:24:10.844 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d.part /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d.converted" returned: 0 in 0.289s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:10 np0005466012 nova_compute[192063]: 2025-10-02 12:24:10.855 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:10 np0005466012 nova_compute[192063]: 2025-10-02 12:24:10.914 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d.converted --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:10 np0005466012 nova_compute[192063]: 2025-10-02 12:24:10.916 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:10 np0005466012 nova_compute[192063]: 2025-10-02 12:24:10.932 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:10 np0005466012 nova_compute[192063]: 2025-10-02 12:24:10.989 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:10 np0005466012 nova_compute[192063]: 2025-10-02 12:24:10.990 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:10 np0005466012 nova_compute[192063]: 2025-10-02 12:24:10.991 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.006 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.074 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.076 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d,backing_fmt=raw /var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.112 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d,backing_fmt=raw /var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.113 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.113 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.169 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.170 2 DEBUG nova.objects.instance [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'migration_context' on Instance uuid ae6bf863-8cca-48ab-a98f-065f8382fa99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.205 2 INFO nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Rebasing disk image.#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.206 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.264 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.265 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 -F raw /var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:11 np0005466012 nova_compute[192063]: 2025-10-02 12:24:11.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.125 2 DEBUG nova.network.neutron [req-6f2fc0a3-994f-43b9-978e-159571420790 req-3c78d29b-8049-4ac3-b6d9-66cdbc20c54f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Updated VIF entry in instance network info cache for port f1306fa9-9429-43db-a3f4-48a2399611d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.126 2 DEBUG nova.network.neutron [req-6f2fc0a3-994f-43b9-978e-159571420790 req-3c78d29b-8049-4ac3-b6d9-66cdbc20c54f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Updating instance_info_cache with network_info: [{"id": "f1306fa9-9429-43db-a3f4-48a2399611d7", "address": "fa:16:3e:41:ec:88", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1306fa9-94", "ovs_interfaceid": "f1306fa9-9429-43db-a3f4-48a2399611d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:12 np0005466012 podman[237514]: 2025-10-02 12:24:12.13206082 +0000 UTC m=+0.048810293 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:24:12 np0005466012 podman[237513]: 2025-10-02 12:24:12.147623664 +0000 UTC m=+0.068614308 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.161 2 DEBUG oslo_concurrency.lockutils [req-6f2fc0a3-994f-43b9-978e-159571420790 req-3c78d29b-8049-4ac3-b6d9-66cdbc20c54f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-ae6bf863-8cca-48ab-a98f-065f8382fa99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.631 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 -F raw /var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/disk" returned: 0 in 1.367s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.632 2 DEBUG nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.633 2 DEBUG nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Ensure instance console log exists: /var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.634 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.634 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.634 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.637 2 DEBUG nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Start _get_guest_xml network_info=[{"id": "f1306fa9-9429-43db-a3f4-48a2399611d7", "address": "fa:16:3e:41:ec:88", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1306fa9-94", "ovs_interfaceid": "f1306fa9-9429-43db-a3f4-48a2399611d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='a093d04a7dad0344ae1e8363603c7876',container_format='bare',created_at=2025-10-02T12:23:45Z,direct_url=<?>,disk_format='qcow2',id=a84a17ed-71a4-4591-9f9d-d22a239469b6,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-1769053978-shelved',owner='ffce7d629aa24a7f970d93b2a79045f1',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-10-02T12:23:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.642 2 WARNING nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.649 2 DEBUG nova.virt.libvirt.host [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.650 2 DEBUG nova.virt.libvirt.host [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.653 2 DEBUG nova.virt.libvirt.host [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.654 2 DEBUG nova.virt.libvirt.host [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.656 2 DEBUG nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.656 2 DEBUG nova.virt.hardware [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='a093d04a7dad0344ae1e8363603c7876',container_format='bare',created_at=2025-10-02T12:23:45Z,direct_url=<?>,disk_format='qcow2',id=a84a17ed-71a4-4591-9f9d-d22a239469b6,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-1769053978-shelved',owner='ffce7d629aa24a7f970d93b2a79045f1',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-10-02T12:23:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.656 2 DEBUG nova.virt.hardware [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.657 2 DEBUG nova.virt.hardware [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.657 2 DEBUG nova.virt.hardware [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.657 2 DEBUG nova.virt.hardware [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.658 2 DEBUG nova.virt.hardware [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.658 2 DEBUG nova.virt.hardware [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.658 2 DEBUG nova.virt.hardware [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.659 2 DEBUG nova.virt.hardware [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.659 2 DEBUG nova.virt.hardware [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.659 2 DEBUG nova.virt.hardware [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.659 2 DEBUG nova.objects.instance [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'vcpu_model' on Instance uuid ae6bf863-8cca-48ab-a98f-065f8382fa99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.681 2 DEBUG nova.virt.libvirt.vif [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:22:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1769053978',display_name='tempest-ServerActionsTestOtherB-server-1769053978',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1769053978',id=114,image_ref='a84a17ed-71a4-4591-9f9d-d22a239469b6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1900171990',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:22:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='ffce7d629aa24a7f970d93b2a79045f1',ramdisk_id='',reservation_id='r-flcxdim8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',ima
ge_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-263921372',owner_user_name='tempest-ServerActionsTestOtherB-263921372-project-member',shelved_at='2025-10-02T12:23:53.063627',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='a84a17ed-71a4-4591-9f9d-d22a239469b6'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:24:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0ea122e2fff94f2ba7c78bf30b04029c',uuid=ae6bf863-8cca-48ab-a98f-065f8382fa99,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "f1306fa9-9429-43db-a3f4-48a2399611d7", "address": "fa:16:3e:41:ec:88", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1306fa9-94", "ovs_interfaceid": "f1306fa9-9429-43db-a3f4-48a2399611d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.682 2 DEBUG nova.network.os_vif_util [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converting VIF {"id": "f1306fa9-9429-43db-a3f4-48a2399611d7", "address": "fa:16:3e:41:ec:88", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1306fa9-94", "ovs_interfaceid": "f1306fa9-9429-43db-a3f4-48a2399611d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.683 2 DEBUG nova.network.os_vif_util [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:ec:88,bridge_name='br-int',has_traffic_filtering=True,id=f1306fa9-9429-43db-a3f4-48a2399611d7,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1306fa9-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.684 2 DEBUG nova.objects.instance [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'pci_devices' on Instance uuid ae6bf863-8cca-48ab-a98f-065f8382fa99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.696 2 DEBUG nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  <uuid>ae6bf863-8cca-48ab-a98f-065f8382fa99</uuid>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  <name>instance-00000072</name>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerActionsTestOtherB-server-1769053978</nova:name>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:24:12</nova:creationTime>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:24:12 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:        <nova:user uuid="0ea122e2fff94f2ba7c78bf30b04029c">tempest-ServerActionsTestOtherB-263921372-project-member</nova:user>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:        <nova:project uuid="ffce7d629aa24a7f970d93b2a79045f1">tempest-ServerActionsTestOtherB-263921372</nova:project>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="a84a17ed-71a4-4591-9f9d-d22a239469b6"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:        <nova:port uuid="f1306fa9-9429-43db-a3f4-48a2399611d7">
Oct  2 08:24:12 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <entry name="serial">ae6bf863-8cca-48ab-a98f-065f8382fa99</entry>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <entry name="uuid">ae6bf863-8cca-48ab-a98f-065f8382fa99</entry>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/disk"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/disk.config"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:41:ec:88"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <target dev="tapf1306fa9-94"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/console.log" append="off"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <input type="keyboard" bus="usb"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:24:12 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:24:12 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:24:12 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:24:12 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.697 2 DEBUG nova.compute.manager [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Preparing to wait for external event network-vif-plugged-f1306fa9-9429-43db-a3f4-48a2399611d7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.697 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.697 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.698 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.698 2 DEBUG nova.virt.libvirt.vif [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:22:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1769053978',display_name='tempest-ServerActionsTestOtherB-server-1769053978',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1769053978',id=114,image_ref='a84a17ed-71a4-4591-9f9d-d22a239469b6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1900171990',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:22:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='ffce7d629aa24a7f970d93b2a79045f1',ramdisk_id='',reservation_id='r-flcxdim8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='v
irtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-263921372',owner_user_name='tempest-ServerActionsTestOtherB-263921372-project-member',shelved_at='2025-10-02T12:23:53.063627',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='a84a17ed-71a4-4591-9f9d-d22a239469b6'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:24:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0ea122e2fff94f2ba7c78bf30b04029c',uuid=ae6bf863-8cca-48ab-a98f-065f8382fa99,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "f1306fa9-9429-43db-a3f4-48a2399611d7", "address": "fa:16:3e:41:ec:88", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1306fa9-94", "ovs_interfaceid": "f1306fa9-9429-43db-a3f4-48a2399611d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.699 2 DEBUG nova.network.os_vif_util [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converting VIF {"id": "f1306fa9-9429-43db-a3f4-48a2399611d7", "address": "fa:16:3e:41:ec:88", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1306fa9-94", "ovs_interfaceid": "f1306fa9-9429-43db-a3f4-48a2399611d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.700 2 DEBUG nova.network.os_vif_util [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:ec:88,bridge_name='br-int',has_traffic_filtering=True,id=f1306fa9-9429-43db-a3f4-48a2399611d7,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1306fa9-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.700 2 DEBUG os_vif [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:ec:88,bridge_name='br-int',has_traffic_filtering=True,id=f1306fa9-9429-43db-a3f4-48a2399611d7,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1306fa9-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.701 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.702 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.704 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf1306fa9-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.705 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf1306fa9-94, col_values=(('external_ids', {'iface-id': 'f1306fa9-9429-43db-a3f4-48a2399611d7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:41:ec:88', 'vm-uuid': 'ae6bf863-8cca-48ab-a98f-065f8382fa99'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:12 np0005466012 NetworkManager[51207]: <info>  [1759407852.7440] manager: (tapf1306fa9-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.753 2 INFO os_vif [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:ec:88,bridge_name='br-int',has_traffic_filtering=True,id=f1306fa9-9429-43db-a3f4-48a2399611d7,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1306fa9-94')#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.938 2 DEBUG nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.938 2 DEBUG nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.939 2 DEBUG nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] No VIF found with MAC fa:16:3e:41:ec:88, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.940 2 INFO nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Using config drive#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.955 2 DEBUG nova.objects.instance [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'ec2_ids' on Instance uuid ae6bf863-8cca-48ab-a98f-065f8382fa99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:12 np0005466012 nova_compute[192063]: 2025-10-02 12:24:12.991 2 DEBUG nova.objects.instance [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'keypairs' on Instance uuid ae6bf863-8cca-48ab-a98f-065f8382fa99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:13 np0005466012 nova_compute[192063]: 2025-10-02 12:24:13.675 2 INFO nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Creating config drive at /var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/disk.config#033[00m
Oct  2 08:24:13 np0005466012 nova_compute[192063]: 2025-10-02 12:24:13.681 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5bhhjksb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:13 np0005466012 nova_compute[192063]: 2025-10-02 12:24:13.810 2 DEBUG oslo_concurrency.processutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5bhhjksb" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:13 np0005466012 kernel: tapf1306fa9-94: entered promiscuous mode
Oct  2 08:24:13 np0005466012 nova_compute[192063]: 2025-10-02 12:24:13.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:13 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:13Z|00432|binding|INFO|Claiming lport f1306fa9-9429-43db-a3f4-48a2399611d7 for this chassis.
Oct  2 08:24:13 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:13Z|00433|binding|INFO|f1306fa9-9429-43db-a3f4-48a2399611d7: Claiming fa:16:3e:41:ec:88 10.100.0.8
Oct  2 08:24:13 np0005466012 NetworkManager[51207]: <info>  [1759407853.8927] manager: (tapf1306fa9-94): new Tun device (/org/freedesktop/NetworkManager/Devices/196)
Oct  2 08:24:13 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:13Z|00434|binding|INFO|Setting lport f1306fa9-9429-43db-a3f4-48a2399611d7 ovn-installed in OVS
Oct  2 08:24:13 np0005466012 nova_compute[192063]: 2025-10-02 12:24:13.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:13 np0005466012 nova_compute[192063]: 2025-10-02 12:24:13.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:13.911 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:ec:88 10.100.0.8'], port_security=['fa:16:3e:41:ec:88 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'ae6bf863-8cca-48ab-a98f-065f8382fa99', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20eb29be-ee23-463b-85af-bfc2388e9f77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'neutron:revision_number': '7', 'neutron:security_group_ids': '12e9168a-be86-462f-a658-971f38e3430f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.185'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e183e2c6-21dc-48e3-ae47-279bc8b32eeb, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=f1306fa9-9429-43db-a3f4-48a2399611d7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:13.912 103246 INFO neutron.agent.ovn.metadata.agent [-] Port f1306fa9-9429-43db-a3f4-48a2399611d7 in datapath 20eb29be-ee23-463b-85af-bfc2388e9f77 bound to our chassis#033[00m
Oct  2 08:24:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:13.914 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20eb29be-ee23-463b-85af-bfc2388e9f77#033[00m
Oct  2 08:24:13 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:13Z|00435|binding|INFO|Setting lport f1306fa9-9429-43db-a3f4-48a2399611d7 up in Southbound
Oct  2 08:24:13 np0005466012 systemd-udevd[237572]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:24:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:13.930 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[af4386df-9687-4eba-ad33-096a74a65482]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:13.931 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap20eb29be-e1 in ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:24:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:13.933 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap20eb29be-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:24:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:13.933 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[53585c2c-3254-47d3-9d23-2f1929208580]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:13.934 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1bc1520a-f971-432e-a3ed-95c8f552b98e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:13 np0005466012 NetworkManager[51207]: <info>  [1759407853.9357] device (tapf1306fa9-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:24:13 np0005466012 NetworkManager[51207]: <info>  [1759407853.9363] device (tapf1306fa9-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:24:13 np0005466012 systemd-machined[152114]: New machine qemu-54-instance-00000072.
Oct  2 08:24:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:13.947 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[9376643f-3fd4-461b-969d-a7a612892a10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:13 np0005466012 systemd[1]: Started Virtual Machine qemu-54-instance-00000072.
Oct  2 08:24:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:13.976 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[180a5654-fad5-4843-8658-8771c706257c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.003 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[0362b790-7615-4306-8405-c2a5c9d9f176]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:14 np0005466012 NetworkManager[51207]: <info>  [1759407854.0092] manager: (tap20eb29be-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/197)
Oct  2 08:24:14 np0005466012 systemd-udevd[237579]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.011 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[42d47dee-2132-4954-9c5e-70e0df2d25c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.047 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[53f2a746-af64-4471-b069-08fe03e098aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.050 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[47c9ae13-ac9b-42e1-971a-c4d71cf5af23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:14 np0005466012 NetworkManager[51207]: <info>  [1759407854.0850] device (tap20eb29be-e0): carrier: link connected
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.094 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[93c206e1-b68d-41d8-9c82-6a619cb571dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.115 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[941a7e1b-107f-4e26-89aa-b68a5251baf8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20eb29be-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:55:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 584769, 'reachable_time': 38299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237608, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.139 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[42fa3a31-761b-4839-bb17-26616fd67161]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe77:5596'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 584769, 'tstamp': 584769}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237609, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.159 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6b649a57-e6f8-4eee-b863-a824ce579001]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20eb29be-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:55:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 584769, 'reachable_time': 38299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237610, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.213 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d80a07dc-33af-4a2a-8e40-ca8ed2f20085]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.283 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8e17ec1c-0948-48c0-8977-acc5fa6a44ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.285 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20eb29be-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.286 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.286 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20eb29be-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:14 np0005466012 NetworkManager[51207]: <info>  [1759407854.2892] manager: (tap20eb29be-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Oct  2 08:24:14 np0005466012 kernel: tap20eb29be-e0: entered promiscuous mode
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.291 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20eb29be-e0, col_values=(('external_ids', {'iface-id': 'e533861f-45cb-4843-b071-0b628ca25128'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:14 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:14Z|00436|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.294 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/20eb29be-ee23-463b-85af-bfc2388e9f77.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/20eb29be-ee23-463b-85af-bfc2388e9f77.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.295 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4a1c0a1c-9bac-4a25-8ed4-093fef33bd7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.296 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-20eb29be-ee23-463b-85af-bfc2388e9f77
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/20eb29be-ee23-463b-85af-bfc2388e9f77.pid.haproxy
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 20eb29be-ee23-463b-85af-bfc2388e9f77
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:24:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:14.297 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'env', 'PROCESS_TAG=haproxy-20eb29be-ee23-463b-85af-bfc2388e9f77', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/20eb29be-ee23-463b-85af-bfc2388e9f77.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.638 2 DEBUG nova.compute.manager [req-1a9dc7a2-5570-431b-bd9f-5e166ac9da7d req-f6a9cc67-393c-4417-b460-2446927d3267 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Received event network-vif-plugged-f1306fa9-9429-43db-a3f4-48a2399611d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.638 2 DEBUG oslo_concurrency.lockutils [req-1a9dc7a2-5570-431b-bd9f-5e166ac9da7d req-f6a9cc67-393c-4417-b460-2446927d3267 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.639 2 DEBUG oslo_concurrency.lockutils [req-1a9dc7a2-5570-431b-bd9f-5e166ac9da7d req-f6a9cc67-393c-4417-b460-2446927d3267 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.639 2 DEBUG oslo_concurrency.lockutils [req-1a9dc7a2-5570-431b-bd9f-5e166ac9da7d req-f6a9cc67-393c-4417-b460-2446927d3267 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.639 2 DEBUG nova.compute.manager [req-1a9dc7a2-5570-431b-bd9f-5e166ac9da7d req-f6a9cc67-393c-4417-b460-2446927d3267 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Processing event network-vif-plugged-f1306fa9-9429-43db-a3f4-48a2399611d7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:24:14 np0005466012 podman[237649]: 2025-10-02 12:24:14.735382968 +0000 UTC m=+0.072145978 container create a9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.768 2 DEBUG nova.compute.manager [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.769 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407854.7679513, ae6bf863-8cca-48ab-a98f-065f8382fa99 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.769 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] VM Started (Lifecycle Event)#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.772 2 DEBUG nova.virt.libvirt.driver [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.776 2 INFO nova.virt.libvirt.driver [-] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Instance spawned successfully.#033[00m
Oct  2 08:24:14 np0005466012 podman[237649]: 2025-10-02 12:24:14.689301994 +0000 UTC m=+0.026065024 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:24:14 np0005466012 systemd[1]: Started libpod-conmon-a9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682.scope.
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.791 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.795 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.811 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.812 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407854.7690668, ae6bf863-8cca-48ab-a98f-065f8382fa99 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.812 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:24:14 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:24:14 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08e10f6517d28fe491832b65dd97ac75ccc46dedf98199e912f524bc9e4f776c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.835 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:14 np0005466012 podman[237649]: 2025-10-02 12:24:14.841293929 +0000 UTC m=+0.178056959 container init a9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.842 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407854.7722108, ae6bf863-8cca-48ab-a98f-065f8382fa99 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.842 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:24:14 np0005466012 podman[237649]: 2025-10-02 12:24:14.847790274 +0000 UTC m=+0.184553284 container start a9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.859 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:14 np0005466012 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[237665]: [NOTICE]   (237669) : New worker (237671) forked
Oct  2 08:24:14 np0005466012 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[237665]: [NOTICE]   (237669) : Loading success.
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.873 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:14 np0005466012 nova_compute[192063]: 2025-10-02 12:24:14.899 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:24:15 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:15Z|00437|binding|INFO|Releasing lport 38f1ac16-18c6-4b4a-b769-ebc7dd5181d4 from this chassis (sb_readonly=0)
Oct  2 08:24:15 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:15Z|00438|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:24:15 np0005466012 nova_compute[192063]: 2025-10-02 12:24:15.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:15 np0005466012 nova_compute[192063]: 2025-10-02 12:24:15.730 2 DEBUG nova.compute.manager [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:16 np0005466012 nova_compute[192063]: 2025-10-02 12:24:16.716 2 DEBUG oslo_concurrency.lockutils [None req-b2d2b8ca-f7df-4cc5-b910-e4b9be82d110 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 14.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:16 np0005466012 nova_compute[192063]: 2025-10-02 12:24:16.751 2 DEBUG nova.compute.manager [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Received event network-vif-plugged-f1306fa9-9429-43db-a3f4-48a2399611d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:16 np0005466012 nova_compute[192063]: 2025-10-02 12:24:16.751 2 DEBUG oslo_concurrency.lockutils [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:16 np0005466012 nova_compute[192063]: 2025-10-02 12:24:16.752 2 DEBUG oslo_concurrency.lockutils [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:16 np0005466012 nova_compute[192063]: 2025-10-02 12:24:16.752 2 DEBUG oslo_concurrency.lockutils [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:16 np0005466012 nova_compute[192063]: 2025-10-02 12:24:16.752 2 DEBUG nova.compute.manager [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] No waiting events found dispatching network-vif-plugged-f1306fa9-9429-43db-a3f4-48a2399611d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:16 np0005466012 nova_compute[192063]: 2025-10-02 12:24:16.752 2 WARNING nova.compute.manager [req-fa28a159-c799-4d23-a7b3-1b109e13166a req-13cd90e2-b93c-4ff4-a0b5-204e2025cc01 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Received unexpected event network-vif-plugged-f1306fa9-9429-43db-a3f4-48a2399611d7 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:24:16 np0005466012 nova_compute[192063]: 2025-10-02 12:24:16.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:17 np0005466012 nova_compute[192063]: 2025-10-02 12:24:17.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:20Z|00439|binding|INFO|Releasing lport 38f1ac16-18c6-4b4a-b769-ebc7dd5181d4 from this chassis (sb_readonly=0)
Oct  2 08:24:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:20Z|00440|binding|INFO|Releasing lport e533861f-45cb-4843-b071-0b628ca25128 from this chassis (sb_readonly=0)
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.312 2 DEBUG oslo_concurrency.lockutils [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "ae6bf863-8cca-48ab-a98f-065f8382fa99" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.313 2 DEBUG oslo_concurrency.lockutils [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.313 2 DEBUG oslo_concurrency.lockutils [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.313 2 DEBUG oslo_concurrency.lockutils [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.314 2 DEBUG oslo_concurrency.lockutils [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.323 2 INFO nova.compute.manager [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Terminating instance#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.337 2 DEBUG nova.compute.manager [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:24:20 np0005466012 kernel: tapf1306fa9-94 (unregistering): left promiscuous mode
Oct  2 08:24:20 np0005466012 NetworkManager[51207]: <info>  [1759407860.3621] device (tapf1306fa9-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:20Z|00441|binding|INFO|Releasing lport f1306fa9-9429-43db-a3f4-48a2399611d7 from this chassis (sb_readonly=0)
Oct  2 08:24:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:20Z|00442|binding|INFO|Setting lport f1306fa9-9429-43db-a3f4-48a2399611d7 down in Southbound
Oct  2 08:24:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:20Z|00443|binding|INFO|Removing iface tapf1306fa9-94 ovn-installed in OVS
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.379 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:ec:88 10.100.0.8'], port_security=['fa:16:3e:41:ec:88 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'ae6bf863-8cca-48ab-a98f-065f8382fa99', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20eb29be-ee23-463b-85af-bfc2388e9f77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffce7d629aa24a7f970d93b2a79045f1', 'neutron:revision_number': '9', 'neutron:security_group_ids': '12e9168a-be86-462f-a658-971f38e3430f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.185', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e183e2c6-21dc-48e3-ae47-279bc8b32eeb, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=f1306fa9-9429-43db-a3f4-48a2399611d7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.380 103246 INFO neutron.agent.ovn.metadata.agent [-] Port f1306fa9-9429-43db-a3f4-48a2399611d7 in datapath 20eb29be-ee23-463b-85af-bfc2388e9f77 unbound from our chassis#033[00m
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.382 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 20eb29be-ee23-463b-85af-bfc2388e9f77, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.383 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ee47e639-3b20-4528-9d3d-ec809efc4c00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.383 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77 namespace which is not needed anymore#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:20 np0005466012 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000072.scope: Deactivated successfully.
Oct  2 08:24:20 np0005466012 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000072.scope: Consumed 6.505s CPU time.
Oct  2 08:24:20 np0005466012 systemd-machined[152114]: Machine qemu-54-instance-00000072 terminated.
Oct  2 08:24:20 np0005466012 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[237665]: [NOTICE]   (237669) : haproxy version is 2.8.14-c23fe91
Oct  2 08:24:20 np0005466012 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[237665]: [NOTICE]   (237669) : path to executable is /usr/sbin/haproxy
Oct  2 08:24:20 np0005466012 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[237665]: [ALERT]    (237669) : Current worker (237671) exited with code 143 (Terminated)
Oct  2 08:24:20 np0005466012 neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77[237665]: [WARNING]  (237669) : All workers exited. Exiting... (0)
Oct  2 08:24:20 np0005466012 systemd[1]: libpod-a9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682.scope: Deactivated successfully.
Oct  2 08:24:20 np0005466012 podman[237704]: 2025-10-02 12:24:20.521316995 +0000 UTC m=+0.043222483 container died a9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:24:20 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682-userdata-shm.mount: Deactivated successfully.
Oct  2 08:24:20 np0005466012 systemd[1]: var-lib-containers-storage-overlay-08e10f6517d28fe491832b65dd97ac75ccc46dedf98199e912f524bc9e4f776c-merged.mount: Deactivated successfully.
Oct  2 08:24:20 np0005466012 podman[237704]: 2025-10-02 12:24:20.568861901 +0000 UTC m=+0.090767389 container cleanup a9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:24:20 np0005466012 systemd[1]: libpod-conmon-a9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682.scope: Deactivated successfully.
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.600 2 INFO nova.virt.libvirt.driver [-] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Instance destroyed successfully.#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.600 2 DEBUG nova.objects.instance [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lazy-loading 'resources' on Instance uuid ae6bf863-8cca-48ab-a98f-065f8382fa99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.620 2 DEBUG nova.virt.libvirt.vif [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:22:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1769053978',display_name='tempest-ServerActionsTestOtherB-server-1769053978',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1769053978',id=114,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+aqSe4de2VLtRAXN5xeLQn4S/3X8QrNMy2M5WdQ5hviVyEOgqK+m+uWmzPaUSUgE38sEdkytfwUHD32CBZajBt4q3OEf9i3yPJUQGuqp42pAUD+A3EoBIyeptNeSxGdA==',key_name='tempest-keypair-1900171990',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffce7d629aa24a7f970d93b2a79045f1',ramdisk_id='',reservation_id='r-flcxdim8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-263921372',owner_user_name='tempest-ServerActionsTestOtherB-263921372-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:24:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0ea122e2fff94f2ba7c78bf30b04029c',uuid=ae6bf863-8cca-48ab-a98f-065f8382fa99,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f1306fa9-9429-43db-a3f4-48a2399611d7", "address": "fa:16:3e:41:ec:88", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1306fa9-94", "ovs_interfaceid": "f1306fa9-9429-43db-a3f4-48a2399611d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.620 2 DEBUG nova.network.os_vif_util [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converting VIF {"id": "f1306fa9-9429-43db-a3f4-48a2399611d7", "address": "fa:16:3e:41:ec:88", "network": {"id": "20eb29be-ee23-463b-85af-bfc2388e9f77", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-370285634-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffce7d629aa24a7f970d93b2a79045f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1306fa9-94", "ovs_interfaceid": "f1306fa9-9429-43db-a3f4-48a2399611d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.621 2 DEBUG nova.network.os_vif_util [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:ec:88,bridge_name='br-int',has_traffic_filtering=True,id=f1306fa9-9429-43db-a3f4-48a2399611d7,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1306fa9-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.621 2 DEBUG os_vif [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:ec:88,bridge_name='br-int',has_traffic_filtering=True,id=f1306fa9-9429-43db-a3f4-48a2399611d7,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1306fa9-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.623 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1306fa9-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:20 np0005466012 podman[237745]: 2025-10-02 12:24:20.628613145 +0000 UTC m=+0.038166079 container remove a9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.628 2 INFO os_vif [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:ec:88,bridge_name='br-int',has_traffic_filtering=True,id=f1306fa9-9429-43db-a3f4-48a2399611d7,network=Network(20eb29be-ee23-463b-85af-bfc2388e9f77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1306fa9-94')#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.629 2 INFO nova.virt.libvirt.driver [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Deleting instance files /var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99_del#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.633 2 INFO nova.virt.libvirt.driver [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Deletion of /var/lib/nova/instances/ae6bf863-8cca-48ab-a98f-065f8382fa99_del complete#033[00m
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.634 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb4dd40-de56-4e54-a22a-ca1b13654e8d]: (4, ('Thu Oct  2 12:24:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77 (a9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682)\na9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682\nThu Oct  2 12:24:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77 (a9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682)\na9d263979a9b3c56d4f28954bc3f589cb486a2585dae97b1cef1b28dc4974682\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.635 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[57357fcd-1b34-4458-aefb-ed7df4c680e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.636 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20eb29be-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:20 np0005466012 kernel: tap20eb29be-e0: left promiscuous mode
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.644 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6a905923-4a67-4e2c-8fd5-66453fd78f5c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.679 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7cf12e9b-c31d-4a7b-b29b-33b2a1dc3b85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.680 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fa3db0d1-1d84-44a2-bf3e-90b4e631b3f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.706 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[44e2a31c-8923-43be-a552-13ca42519c1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 584761, 'reachable_time': 16424, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237764, 'error': None, 'target': 'ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:20 np0005466012 systemd[1]: run-netns-ovnmeta\x2d20eb29be\x2dee23\x2d463b\x2d85af\x2dbfc2388e9f77.mount: Deactivated successfully.
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.712 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-20eb29be-ee23-463b-85af-bfc2388e9f77 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:24:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:20.713 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[1e6a902c-86e3-4d1c-ab5d-f78048c0069c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.723 2 INFO nova.compute.manager [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.724 2 DEBUG oslo.service.loopingcall [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.724 2 DEBUG nova.compute.manager [-] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:24:20 np0005466012 nova_compute[192063]: 2025-10-02 12:24:20.725 2 DEBUG nova.network.neutron [-] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.279 2 DEBUG nova.compute.manager [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Received event network-vif-unplugged-f1306fa9-9429-43db-a3f4-48a2399611d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.280 2 DEBUG oslo_concurrency.lockutils [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.281 2 DEBUG oslo_concurrency.lockutils [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.282 2 DEBUG oslo_concurrency.lockutils [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.282 2 DEBUG nova.compute.manager [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] No waiting events found dispatching network-vif-unplugged-f1306fa9-9429-43db-a3f4-48a2399611d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.283 2 DEBUG nova.compute.manager [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Received event network-vif-unplugged-f1306fa9-9429-43db-a3f4-48a2399611d7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.283 2 DEBUG nova.compute.manager [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Received event network-vif-plugged-f1306fa9-9429-43db-a3f4-48a2399611d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.283 2 DEBUG oslo_concurrency.lockutils [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.284 2 DEBUG oslo_concurrency.lockutils [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.284 2 DEBUG oslo_concurrency.lockutils [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.284 2 DEBUG nova.compute.manager [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] No waiting events found dispatching network-vif-plugged-f1306fa9-9429-43db-a3f4-48a2399611d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.285 2 WARNING nova.compute.manager [req-b578d05d-b906-4644-8848-96c72096fa70 req-75599a94-b302-4091-b1ad-19db97ecba72 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Received unexpected event network-vif-plugged-f1306fa9-9429-43db-a3f4-48a2399611d7 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.768 2 DEBUG nova.network.neutron [-] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.792 2 INFO nova.compute.manager [-] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Took 1.07 seconds to deallocate network for instance.#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.883 2 DEBUG oslo_concurrency.lockutils [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.884 2 DEBUG oslo_concurrency.lockutils [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.965 2 DEBUG nova.compute.provider_tree [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:24:21 np0005466012 nova_compute[192063]: 2025-10-02 12:24:21.986 2 DEBUG nova.scheduler.client.report [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:24:22 np0005466012 nova_compute[192063]: 2025-10-02 12:24:22.005 2 DEBUG oslo_concurrency.lockutils [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:22 np0005466012 nova_compute[192063]: 2025-10-02 12:24:22.025 2 INFO nova.scheduler.client.report [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Deleted allocations for instance ae6bf863-8cca-48ab-a98f-065f8382fa99#033[00m
Oct  2 08:24:22 np0005466012 nova_compute[192063]: 2025-10-02 12:24:22.037 2 DEBUG nova.compute.manager [req-a2934afd-3560-4112-9f3f-3b375f22cf4c req-8d0d0b2f-8579-4a9c-a69b-5d4b8713cdd2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Received event network-vif-deleted-f1306fa9-9429-43db-a3f4-48a2399611d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:22 np0005466012 nova_compute[192063]: 2025-10-02 12:24:22.101 2 DEBUG oslo_concurrency.lockutils [None req-1b89b2b3-4772-4385-8b24-62abada96336 0ea122e2fff94f2ba7c78bf30b04029c ffce7d629aa24a7f970d93b2a79045f1 - - default default] Lock "ae6bf863-8cca-48ab-a98f-065f8382fa99" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.788s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:25 np0005466012 nova_compute[192063]: 2025-10-02 12:24:25.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:25 np0005466012 nova_compute[192063]: 2025-10-02 12:24:25.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:26 np0005466012 nova_compute[192063]: 2025-10-02 12:24:26.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:27 np0005466012 podman[237777]: 2025-10-02 12:24:27.133802592 +0000 UTC m=+0.051014817 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:24:27 np0005466012 podman[237778]: 2025-10-02 12:24:27.17546474 +0000 UTC m=+0.086426696 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:24:27 np0005466012 nova_compute[192063]: 2025-10-02 12:24:27.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:28 np0005466012 nova_compute[192063]: 2025-10-02 12:24:28.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:29 np0005466012 nova_compute[192063]: 2025-10-02 12:24:29.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:29 np0005466012 nova_compute[192063]: 2025-10-02 12:24:29.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:29 np0005466012 nova_compute[192063]: 2025-10-02 12:24:29.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:24:29 np0005466012 nova_compute[192063]: 2025-10-02 12:24:29.824 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:29 np0005466012 nova_compute[192063]: 2025-10-02 12:24:29.824 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:24:30 np0005466012 nova_compute[192063]: 2025-10-02 12:24:30.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:31 np0005466012 nova_compute[192063]: 2025-10-02 12:24:31.007 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:31 np0005466012 podman[237828]: 2025-10-02 12:24:31.168805786 +0000 UTC m=+0.073283861 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:24:31 np0005466012 nova_compute[192063]: 2025-10-02 12:24:31.391 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:31 np0005466012 nova_compute[192063]: 2025-10-02 12:24:31.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:32 np0005466012 nova_compute[192063]: 2025-10-02 12:24:32.856 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:32 np0005466012 nova_compute[192063]: 2025-10-02 12:24:32.881 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:32 np0005466012 nova_compute[192063]: 2025-10-02 12:24:32.882 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:32 np0005466012 nova_compute[192063]: 2025-10-02 12:24:32.882 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:32 np0005466012 nova_compute[192063]: 2025-10-02 12:24:32.882 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:24:32 np0005466012 nova_compute[192063]: 2025-10-02 12:24:32.971 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:33 np0005466012 podman[237847]: 2025-10-02 12:24:33.007501289 +0000 UTC m=+0.078371976 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, tcib_managed=true)
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.046 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.048 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.125 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.303 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.305 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5484MB free_disk=73.29029083251953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.305 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.305 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.473 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance ae56113d-001e-4f10-9236-c07fe5146d9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.473 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.474 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.587 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:24:33 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:33Z|00444|binding|INFO|Releasing lport 38f1ac16-18c6-4b4a-b769-ebc7dd5181d4 from this chassis (sb_readonly=0)
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.643 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.678 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.679 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:33 np0005466012 nova_compute[192063]: 2025-10-02 12:24:33.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:34 np0005466012 nova_compute[192063]: 2025-10-02 12:24:34.641 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:34 np0005466012 nova_compute[192063]: 2025-10-02 12:24:34.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:35 np0005466012 nova_compute[192063]: 2025-10-02 12:24:35.598 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407860.5983422, ae6bf863-8cca-48ab-a98f-065f8382fa99 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:35 np0005466012 nova_compute[192063]: 2025-10-02 12:24:35.599 2 INFO nova.compute.manager [-] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:24:35 np0005466012 nova_compute[192063]: 2025-10-02 12:24:35.630 2 DEBUG nova.compute.manager [None req-f0acfaea-7e37-4447-9a9f-d815e245bcf4 - - - - - -] [instance: ae6bf863-8cca-48ab-a98f-065f8382fa99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:35 np0005466012 nova_compute[192063]: 2025-10-02 12:24:35.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:36 np0005466012 nova_compute[192063]: 2025-10-02 12:24:36.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:37 np0005466012 nova_compute[192063]: 2025-10-02 12:24:37.642 2 DEBUG oslo_concurrency.lockutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:37 np0005466012 nova_compute[192063]: 2025-10-02 12:24:37.642 2 DEBUG oslo_concurrency.lockutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquired lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:37 np0005466012 nova_compute[192063]: 2025-10-02 12:24:37.642 2 DEBUG nova.network.neutron [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:24:39 np0005466012 podman[237874]: 2025-10-02 12:24:39.174771519 +0000 UTC m=+0.079857179 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd)
Oct  2 08:24:39 np0005466012 podman[237875]: 2025-10-02 12:24:39.175746886 +0000 UTC m=+0.083343657 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 08:24:40 np0005466012 nova_compute[192063]: 2025-10-02 12:24:40.160 2 DEBUG nova.network.neutron [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updating instance_info_cache with network_info: [{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:40 np0005466012 nova_compute[192063]: 2025-10-02 12:24:40.191 2 DEBUG oslo_concurrency.lockutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Releasing lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:40 np0005466012 nova_compute[192063]: 2025-10-02 12:24:40.621 2 DEBUG nova.virt.libvirt.driver [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  2 08:24:40 np0005466012 nova_compute[192063]: 2025-10-02 12:24:40.622 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Creating file /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/18589f24c5ed452aa7487b4e36df1c7d.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Oct  2 08:24:40 np0005466012 nova_compute[192063]: 2025-10-02 12:24:40.622 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/18589f24c5ed452aa7487b4e36df1c7d.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:40 np0005466012 nova_compute[192063]: 2025-10-02 12:24:40.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:41 np0005466012 nova_compute[192063]: 2025-10-02 12:24:41.008 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/18589f24c5ed452aa7487b4e36df1c7d.tmp" returned: 1 in 0.386s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:41 np0005466012 nova_compute[192063]: 2025-10-02 12:24:41.009 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/18589f24c5ed452aa7487b4e36df1c7d.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  2 08:24:41 np0005466012 nova_compute[192063]: 2025-10-02 12:24:41.010 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Creating directory /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  2 08:24:41 np0005466012 nova_compute[192063]: 2025-10-02 12:24:41.010 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:41 np0005466012 nova_compute[192063]: 2025-10-02 12:24:41.214 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c" returned: 0 in 0.204s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:41 np0005466012 nova_compute[192063]: 2025-10-02 12:24:41.222 2 DEBUG nova.virt.libvirt.driver [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:24:41 np0005466012 nova_compute[192063]: 2025-10-02 12:24:41.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:41 np0005466012 nova_compute[192063]: 2025-10-02 12:24:41.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:24:41 np0005466012 nova_compute[192063]: 2025-10-02 12:24:41.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:24:41 np0005466012 nova_compute[192063]: 2025-10-02 12:24:41.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:42 np0005466012 nova_compute[192063]: 2025-10-02 12:24:42.600 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:42 np0005466012 nova_compute[192063]: 2025-10-02 12:24:42.601 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:42 np0005466012 nova_compute[192063]: 2025-10-02 12:24:42.601 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:24:42 np0005466012 nova_compute[192063]: 2025-10-02 12:24:42.602 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:42 np0005466012 nova_compute[192063]: 2025-10-02 12:24:42.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:43 np0005466012 podman[237917]: 2025-10-02 12:24:43.136802142 +0000 UTC m=+0.056928284 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2)
Oct  2 08:24:43 np0005466012 podman[237918]: 2025-10-02 12:24:43.149426382 +0000 UTC m=+0.068533386 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:24:43 np0005466012 kernel: tapd1031883-21 (unregistering): left promiscuous mode
Oct  2 08:24:43 np0005466012 NetworkManager[51207]: <info>  [1759407883.3991] device (tapd1031883-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:24:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:43Z|00445|binding|INFO|Releasing lport d1031883-2135-4183-8a9d-0609c32ad14b from this chassis (sb_readonly=0)
Oct  2 08:24:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:43Z|00446|binding|INFO|Setting lport d1031883-2135-4183-8a9d-0609c32ad14b down in Southbound
Oct  2 08:24:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:24:43Z|00447|binding|INFO|Removing iface tapd1031883-21 ovn-installed in OVS
Oct  2 08:24:43 np0005466012 nova_compute[192063]: 2025-10-02 12:24:43.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:43 np0005466012 nova_compute[192063]: 2025-10-02 12:24:43.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:43 np0005466012 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Oct  2 08:24:43 np0005466012 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d0000006f.scope: Consumed 16.371s CPU time.
Oct  2 08:24:43 np0005466012 systemd-machined[152114]: Machine qemu-53-instance-0000006f terminated.
Oct  2 08:24:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:43.595 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:b9:ae 10.100.0.12'], port_security=['fa:16:3e:0a:b9:ae 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ae56113d-001e-4f10-9236-c07fe5146d9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a04f937a-375f-4fb0-90fe-5f514a88668f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e564a4cad5d443dba81ec04d2a05ced9', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'c0383701-0ec7-4f3b-8585-5effc4f5ca5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.248', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c0aa38-5fd8-41c7-b4bf-85b59722c5c3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d1031883-2135-4183-8a9d-0609c32ad14b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:43.597 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d1031883-2135-4183-8a9d-0609c32ad14b in datapath a04f937a-375f-4fb0-90fe-5f514a88668f unbound from our chassis#033[00m
Oct  2 08:24:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:43.599 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a04f937a-375f-4fb0-90fe-5f514a88668f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:24:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:43.600 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[79743073-fcdf-4653-8277-8249d851689d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:43.601 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f namespace which is not needed anymore#033[00m
Oct  2 08:24:43 np0005466012 nova_compute[192063]: 2025-10-02 12:24:43.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:43 np0005466012 nova_compute[192063]: 2025-10-02 12:24:43.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:43 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[237153]: [NOTICE]   (237157) : haproxy version is 2.8.14-c23fe91
Oct  2 08:24:43 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[237153]: [NOTICE]   (237157) : path to executable is /usr/sbin/haproxy
Oct  2 08:24:43 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[237153]: [WARNING]  (237157) : Exiting Master process...
Oct  2 08:24:43 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[237153]: [ALERT]    (237157) : Current worker (237159) exited with code 143 (Terminated)
Oct  2 08:24:43 np0005466012 neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f[237153]: [WARNING]  (237157) : All workers exited. Exiting... (0)
Oct  2 08:24:43 np0005466012 systemd[1]: libpod-78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda.scope: Deactivated successfully.
Oct  2 08:24:43 np0005466012 podman[237999]: 2025-10-02 12:24:43.770289877 +0000 UTC m=+0.047557238 container died 78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:24:43 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda-userdata-shm.mount: Deactivated successfully.
Oct  2 08:24:43 np0005466012 systemd[1]: var-lib-containers-storage-overlay-00af8bac3af959994a3c6efe95e21cd88f1b9d5d2b5200cad26210961658384b-merged.mount: Deactivated successfully.
Oct  2 08:24:43 np0005466012 podman[237999]: 2025-10-02 12:24:43.806950032 +0000 UTC m=+0.084217353 container cleanup 78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  2 08:24:43 np0005466012 systemd[1]: libpod-conmon-78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda.scope: Deactivated successfully.
Oct  2 08:24:43 np0005466012 podman[238026]: 2025-10-02 12:24:43.877838814 +0000 UTC m=+0.046404455 container remove 78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:24:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:43.886 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[977d5ab8-90be-42f5-9579-d12374c9b606]: (4, ('Thu Oct  2 12:24:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f (78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda)\n78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda\nThu Oct  2 12:24:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f (78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda)\n78603dc3e2aa746663d6cb4c11f21c3188e42aafa85b2bd98665a6a064779bda\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:43.889 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d80bd073-2200-4bf3-a596-51a43ad073de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:43.890 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa04f937a-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:43 np0005466012 nova_compute[192063]: 2025-10-02 12:24:43.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:43 np0005466012 kernel: tapa04f937a-30: left promiscuous mode
Oct  2 08:24:43 np0005466012 nova_compute[192063]: 2025-10-02 12:24:43.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:43.951 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e1f23d9e-cc39-4052-9f9c-ffde0964ee73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:43.980 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[54f5a2c3-09bc-42c2-ab11-d8fac0fb5ec9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:43.981 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1de60c00-a4c2-4d49-8b48-a3d881a1dea8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:44.000 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[41526b40-caaa-421d-a610-3ed08b847bc9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578522, 'reachable_time': 40618, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238048, 'error': None, 'target': 'ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:44.002 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a04f937a-375f-4fb0-90fe-5f514a88668f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:24:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:44.003 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[02e0933d-6667-4255-824a-948189c3fb40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:44 np0005466012 systemd[1]: run-netns-ovnmeta\x2da04f937a\x2d375f\x2d4fb0\x2d90fe\x2d5f514a88668f.mount: Deactivated successfully.
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.243 2 INFO nova.virt.libvirt.driver [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.252 2 INFO nova.virt.libvirt.driver [-] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Instance destroyed successfully.#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.254 2 DEBUG nova.virt.libvirt.vif [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-161503604',display_name='tempest-ServerActionsTestJSON-server-161503604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-161503604',id=111,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJLom+UJzZg9dduKQv+725QaYDZoMXvP/xlpKnb/K05SGc4dkyLwCDweJ3QifTmxLWqK9Sz5A12yMJbzpa36v5C4bUqj8uiWk/vbR1BAjBdKM9d/Ug8M2nT8LwDBGP/9A==',key_name='tempest-keypair-1006285918',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:21:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e564a4cad5d443dba81ec04d2a05ced9',ramdisk_id='',reservation_id='r-ntvf7r4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1646745100',owner_user_name='tempest-ServerActionsTestJSON-1646745100-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:24:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d54b1826121b47caba89932a78c06ccd',uuid=ae56113d-001e-4f10-9236-c07fe5146d9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1926715354-network", "vif_mac": "fa:16:3e:0a:b9:ae"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.254 2 DEBUG nova.network.os_vif_util [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converting VIF {"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1926715354-network", "vif_mac": "fa:16:3e:0a:b9:ae"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.256 2 DEBUG nova.network.os_vif_util [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.257 2 DEBUG os_vif [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.260 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1031883-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.268 2 INFO os_vif [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21')#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.274 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.364 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.365 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.432 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.435 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Copying file /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c_resize/disk to 192.168.122.100:/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:24:44 np0005466012 nova_compute[192063]: 2025-10-02 12:24:44.435 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c_resize/disk 192.168.122.100:/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.020 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "scp -r /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c_resize/disk 192.168.122.100:/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk" returned: 0 in 0.585s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.021 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Copying file /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.021 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c_resize/disk.config 192.168.122.100:/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.047 2 DEBUG nova.compute.manager [req-642253c1-2f32-49b9-b62a-3602531c3ded req-387c8624-66fc-4f5d-87f7-ea0daeffafda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.048 2 DEBUG oslo_concurrency.lockutils [req-642253c1-2f32-49b9-b62a-3602531c3ded req-387c8624-66fc-4f5d-87f7-ea0daeffafda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.049 2 DEBUG oslo_concurrency.lockutils [req-642253c1-2f32-49b9-b62a-3602531c3ded req-387c8624-66fc-4f5d-87f7-ea0daeffafda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.049 2 DEBUG oslo_concurrency.lockutils [req-642253c1-2f32-49b9-b62a-3602531c3ded req-387c8624-66fc-4f5d-87f7-ea0daeffafda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.050 2 DEBUG nova.compute.manager [req-642253c1-2f32-49b9-b62a-3602531c3ded req-387c8624-66fc-4f5d-87f7-ea0daeffafda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.050 2 WARNING nova.compute.manager [req-642253c1-2f32-49b9-b62a-3602531c3ded req-387c8624-66fc-4f5d-87f7-ea0daeffafda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-unplugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.051 2 DEBUG nova.compute.manager [req-642253c1-2f32-49b9-b62a-3602531c3ded req-387c8624-66fc-4f5d-87f7-ea0daeffafda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.051 2 DEBUG oslo_concurrency.lockutils [req-642253c1-2f32-49b9-b62a-3602531c3ded req-387c8624-66fc-4f5d-87f7-ea0daeffafda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.052 2 DEBUG oslo_concurrency.lockutils [req-642253c1-2f32-49b9-b62a-3602531c3ded req-387c8624-66fc-4f5d-87f7-ea0daeffafda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.052 2 DEBUG oslo_concurrency.lockutils [req-642253c1-2f32-49b9-b62a-3602531c3ded req-387c8624-66fc-4f5d-87f7-ea0daeffafda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.053 2 DEBUG nova.compute.manager [req-642253c1-2f32-49b9-b62a-3602531c3ded req-387c8624-66fc-4f5d-87f7-ea0daeffafda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.053 2 WARNING nova.compute.manager [req-642253c1-2f32-49b9-b62a-3602531c3ded req-387c8624-66fc-4f5d-87f7-ea0daeffafda 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.244 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "scp -C -r /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c_resize/disk.config 192.168.122.100:/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.config" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.245 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Copying file /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.245 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c_resize/disk.info 192.168.122.100:/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.479 2 DEBUG oslo_concurrency.processutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] CMD "scp -C -r /var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c_resize/disk.info 192.168.122.100:/var/lib/nova/instances/ae56113d-001e-4f10-9236-c07fe5146d9c/disk.info" returned: 0 in 0.234s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.678 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updating instance_info_cache with network_info: [{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.747 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.748 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.750 2 DEBUG neutronclient.v2_0.client [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port d1031883-2135-4183-8a9d-0609c32ad14b for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:24:45 np0005466012 nova_compute[192063]: 2025-10-02 12:24:45.893 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:24:46 np0005466012 nova_compute[192063]: 2025-10-02 12:24:46.555 2 DEBUG oslo_concurrency.lockutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:46 np0005466012 nova_compute[192063]: 2025-10-02 12:24:46.556 2 DEBUG oslo_concurrency.lockutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:46 np0005466012 nova_compute[192063]: 2025-10-02 12:24:46.556 2 DEBUG oslo_concurrency.lockutils [None req-819407d4-1366-4493-9427-0896e8ec66b3 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:46 np0005466012 nova_compute[192063]: 2025-10-02 12:24:46.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:47 np0005466012 nova_compute[192063]: 2025-10-02 12:24:47.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:47 np0005466012 nova_compute[192063]: 2025-10-02 12:24:47.907 2 DEBUG nova.compute.manager [req-e995343e-6992-4e13-ae68-af1b7d3ef6c7 req-e4706ac4-3eb0-41f3-bad7-8001f407cb19 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-changed-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:47 np0005466012 nova_compute[192063]: 2025-10-02 12:24:47.907 2 DEBUG nova.compute.manager [req-e995343e-6992-4e13-ae68-af1b7d3ef6c7 req-e4706ac4-3eb0-41f3-bad7-8001f407cb19 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Refreshing instance network info cache due to event network-changed-d1031883-2135-4183-8a9d-0609c32ad14b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:24:47 np0005466012 nova_compute[192063]: 2025-10-02 12:24:47.907 2 DEBUG oslo_concurrency.lockutils [req-e995343e-6992-4e13-ae68-af1b7d3ef6c7 req-e4706ac4-3eb0-41f3-bad7-8001f407cb19 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:47 np0005466012 nova_compute[192063]: 2025-10-02 12:24:47.907 2 DEBUG oslo_concurrency.lockutils [req-e995343e-6992-4e13-ae68-af1b7d3ef6c7 req-e4706ac4-3eb0-41f3-bad7-8001f407cb19 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:47 np0005466012 nova_compute[192063]: 2025-10-02 12:24:47.908 2 DEBUG nova.network.neutron [req-e995343e-6992-4e13-ae68-af1b7d3ef6c7 req-e4706ac4-3eb0-41f3-bad7-8001f407cb19 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Refreshing network info cache for port d1031883-2135-4183-8a9d-0609c32ad14b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:24:48 np0005466012 nova_compute[192063]: 2025-10-02 12:24:48.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:49 np0005466012 nova_compute[192063]: 2025-10-02 12:24:49.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:50 np0005466012 nova_compute[192063]: 2025-10-02 12:24:50.440 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:50 np0005466012 nova_compute[192063]: 2025-10-02 12:24:50.482 2 WARNING nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] While synchronizing instance power states, found 0 instances in the database and 1 instances on the hypervisor.#033[00m
Oct  2 08:24:51 np0005466012 nova_compute[192063]: 2025-10-02 12:24:51.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:52 np0005466012 nova_compute[192063]: 2025-10-02 12:24:52.147 2 DEBUG nova.network.neutron [req-e995343e-6992-4e13-ae68-af1b7d3ef6c7 req-e4706ac4-3eb0-41f3-bad7-8001f407cb19 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updated VIF entry in instance network info cache for port d1031883-2135-4183-8a9d-0609c32ad14b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:24:52 np0005466012 nova_compute[192063]: 2025-10-02 12:24:52.148 2 DEBUG nova.network.neutron [req-e995343e-6992-4e13-ae68-af1b7d3ef6c7 req-e4706ac4-3eb0-41f3-bad7-8001f407cb19 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updating instance_info_cache with network_info: [{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:24:52 np0005466012 nova_compute[192063]: 2025-10-02 12:24:52.514 2 DEBUG oslo_concurrency.lockutils [req-e995343e-6992-4e13-ae68-af1b7d3ef6c7 req-e4706ac4-3eb0-41f3-bad7-8001f407cb19 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:24:54 np0005466012 nova_compute[192063]: 2025-10-02 12:24:54.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:54 np0005466012 nova_compute[192063]: 2025-10-02 12:24:54.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:54 np0005466012 nova_compute[192063]: 2025-10-02 12:24:54.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:54.468 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:74:6f 10.100.0.2 2001:db8::f816:3eff:feea:746f'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feea:746f/64', 'neutron:device_id': 'ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-26df2dcf-f57c-4dae-8522-0277df741ed3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e2784fb0-50ac-4c91-ba90-3b5c38b8adf4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=adc60e93-14bb-4eb4-8a79-15dda196dc01) old=Port_Binding(mac=['fa:16:3e:ea:74:6f 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-26df2dcf-f57c-4dae-8522-0277df741ed3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:24:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:54.470 103246 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port adc60e93-14bb-4eb4-8a79-15dda196dc01 in datapath 26df2dcf-f57c-4dae-8522-0277df741ed3 updated#033[00m
Oct  2 08:24:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:54.471 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 26df2dcf-f57c-4dae-8522-0277df741ed3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:24:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:24:54.472 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b3a1cd-953a-46a4-a645-9a53c8e88880]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:24:54 np0005466012 nova_compute[192063]: 2025-10-02 12:24:54.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:24:55 np0005466012 nova_compute[192063]: 2025-10-02 12:24:55.861 2 DEBUG nova.compute.manager [req-629f90f1-58cf-4486-b8de-badbd4846e54 req-7d60d8ae-f3e5-48ed-a45f-252d3208c3d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:55 np0005466012 nova_compute[192063]: 2025-10-02 12:24:55.861 2 DEBUG oslo_concurrency.lockutils [req-629f90f1-58cf-4486-b8de-badbd4846e54 req-7d60d8ae-f3e5-48ed-a45f-252d3208c3d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:55 np0005466012 nova_compute[192063]: 2025-10-02 12:24:55.862 2 DEBUG oslo_concurrency.lockutils [req-629f90f1-58cf-4486-b8de-badbd4846e54 req-7d60d8ae-f3e5-48ed-a45f-252d3208c3d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:55 np0005466012 nova_compute[192063]: 2025-10-02 12:24:55.862 2 DEBUG oslo_concurrency.lockutils [req-629f90f1-58cf-4486-b8de-badbd4846e54 req-7d60d8ae-f3e5-48ed-a45f-252d3208c3d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:55 np0005466012 nova_compute[192063]: 2025-10-02 12:24:55.862 2 DEBUG nova.compute.manager [req-629f90f1-58cf-4486-b8de-badbd4846e54 req-7d60d8ae-f3e5-48ed-a45f-252d3208c3d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:55 np0005466012 nova_compute[192063]: 2025-10-02 12:24:55.863 2 WARNING nova.compute.manager [req-629f90f1-58cf-4486-b8de-badbd4846e54 req-7d60d8ae-f3e5-48ed-a45f-252d3208c3d5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state resized and task_state None.#033[00m
Oct  2 08:24:56 np0005466012 nova_compute[192063]: 2025-10-02 12:24:56.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:24:57 np0005466012 nova_compute[192063]: 2025-10-02 12:24:57.494 2 DEBUG oslo_concurrency.lockutils [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:57 np0005466012 nova_compute[192063]: 2025-10-02 12:24:57.495 2 DEBUG oslo_concurrency.lockutils [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:57 np0005466012 nova_compute[192063]: 2025-10-02 12:24:57.495 2 DEBUG nova.compute.manager [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Going to confirm migration 17 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Oct  2 08:24:57 np0005466012 nova_compute[192063]: 2025-10-02 12:24:57.559 2 DEBUG nova.objects.instance [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'info_cache' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:24:58 np0005466012 podman[238063]: 2025-10-02 12:24:58.146571339 +0000 UTC m=+0.057809560 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:24:58 np0005466012 podman[238064]: 2025-10-02 12:24:58.174594458 +0000 UTC m=+0.084849990 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.232 2 DEBUG neutronclient.v2_0.client [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port d1031883-2135-4183-8a9d-0609c32ad14b for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.233 2 DEBUG oslo_concurrency.lockutils [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.234 2 DEBUG oslo_concurrency.lockutils [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquired lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.234 2 DEBUG nova.network.neutron [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.440 2 DEBUG nova.compute.manager [req-1971ad18-0f58-4fed-9e86-9060aaa46f99 req-4da7bdda-201a-48b0-90de-c8a8b96ec1c7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.441 2 DEBUG oslo_concurrency.lockutils [req-1971ad18-0f58-4fed-9e86-9060aaa46f99 req-4da7bdda-201a-48b0-90de-c8a8b96ec1c7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.441 2 DEBUG oslo_concurrency.lockutils [req-1971ad18-0f58-4fed-9e86-9060aaa46f99 req-4da7bdda-201a-48b0-90de-c8a8b96ec1c7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.441 2 DEBUG oslo_concurrency.lockutils [req-1971ad18-0f58-4fed-9e86-9060aaa46f99 req-4da7bdda-201a-48b0-90de-c8a8b96ec1c7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.441 2 DEBUG nova.compute.manager [req-1971ad18-0f58-4fed-9e86-9060aaa46f99 req-4da7bdda-201a-48b0-90de-c8a8b96ec1c7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] No waiting events found dispatching network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.442 2 WARNING nova.compute.manager [req-1971ad18-0f58-4fed-9e86-9060aaa46f99 req-4da7bdda-201a-48b0-90de-c8a8b96ec1c7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Received unexpected event network-vif-plugged-d1031883-2135-4183-8a9d-0609c32ad14b for instance with vm_state resized and task_state None.#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.701 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407883.7002048, ae56113d-001e-4f10-9236-c07fe5146d9c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.702 2 INFO nova.compute.manager [-] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.810 2 DEBUG nova.compute.manager [None req-90353172-9dc2-43a2-b517-0f49fd4d69ce - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.815 2 DEBUG nova.compute.manager [None req-90353172-9dc2-43a2-b517-0f49fd4d69ce - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:24:58 np0005466012 nova_compute[192063]: 2025-10-02 12:24:58.911 2 INFO nova.compute.manager [None req-90353172-9dc2-43a2-b517-0f49fd4d69ce - - - - - -] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Oct  2 08:24:59 np0005466012 nova_compute[192063]: 2025-10-02 12:24:59.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:00 np0005466012 nova_compute[192063]: 2025-10-02 12:25:00.535 2 DEBUG nova.network.neutron [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] [instance: ae56113d-001e-4f10-9236-c07fe5146d9c] Updating instance_info_cache with network_info: [{"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:00 np0005466012 nova_compute[192063]: 2025-10-02 12:25:00.814 2 DEBUG oslo_concurrency.lockutils [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Releasing lock "refresh_cache-ae56113d-001e-4f10-9236-c07fe5146d9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:00 np0005466012 nova_compute[192063]: 2025-10-02 12:25:00.814 2 DEBUG nova.objects.instance [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lazy-loading 'migration_context' on Instance uuid ae56113d-001e-4f10-9236-c07fe5146d9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:01 np0005466012 nova_compute[192063]: 2025-10-02 12:25:01.001 2 DEBUG nova.virt.libvirt.vif [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-161503604',display_name='tempest-ServerActionsTestJSON-server-161503604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-161503604',id=111,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJLom+UJzZg9dduKQv+725QaYDZoMXvP/xlpKnb/K05SGc4dkyLwCDweJ3QifTmxLWqK9Sz5A12yMJbzpa36v5C4bUqj8uiWk/vbR1BAjBdKM9d/Ug8M2nT8LwDBGP/9A==',key_name='tempest-keypair-1006285918',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:24:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e564a4cad5d443dba81ec04d2a05ced9',ramdisk_id='',reservation_id='r-ntvf7r4i',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1646745100',owner_user_name='tempest-ServerActionsTestJSON-1646745100-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:24:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d54b1826121b47caba89932a78c06ccd',uuid=ae56113d-001e-4f10-9236-c07fe5146d9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:25:01 np0005466012 nova_compute[192063]: 2025-10-02 12:25:01.002 2 DEBUG nova.network.os_vif_util [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converting VIF {"id": "d1031883-2135-4183-8a9d-0609c32ad14b", "address": "fa:16:3e:0a:b9:ae", "network": {"id": "a04f937a-375f-4fb0-90fe-5f514a88668f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1926715354-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e564a4cad5d443dba81ec04d2a05ced9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1031883-21", "ovs_interfaceid": "d1031883-2135-4183-8a9d-0609c32ad14b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:01 np0005466012 nova_compute[192063]: 2025-10-02 12:25:01.003 2 DEBUG nova.network.os_vif_util [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:01 np0005466012 nova_compute[192063]: 2025-10-02 12:25:01.003 2 DEBUG os_vif [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:25:01 np0005466012 nova_compute[192063]: 2025-10-02 12:25:01.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:01 np0005466012 nova_compute[192063]: 2025-10-02 12:25:01.005 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1031883-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:01 np0005466012 nova_compute[192063]: 2025-10-02 12:25:01.006 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:01 np0005466012 nova_compute[192063]: 2025-10-02 12:25:01.008 2 INFO os_vif [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:b9:ae,bridge_name='br-int',has_traffic_filtering=True,id=d1031883-2135-4183-8a9d-0609c32ad14b,network=Network(a04f937a-375f-4fb0-90fe-5f514a88668f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1031883-21')#033[00m
Oct  2 08:25:01 np0005466012 nova_compute[192063]: 2025-10-02 12:25:01.009 2 DEBUG oslo_concurrency.lockutils [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:01 np0005466012 nova_compute[192063]: 2025-10-02 12:25:01.009 2 DEBUG oslo_concurrency.lockutils [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:01 np0005466012 nova_compute[192063]: 2025-10-02 12:25:01.417 2 DEBUG nova.compute.provider_tree [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:25:01 np0005466012 nova_compute[192063]: 2025-10-02 12:25:01.582 2 DEBUG nova.scheduler.client.report [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:25:01 np0005466012 nova_compute[192063]: 2025-10-02 12:25:01.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:02 np0005466012 nova_compute[192063]: 2025-10-02 12:25:02.119 2 DEBUG oslo_concurrency.lockutils [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 1.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:02.135 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:02.135 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:02.135 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:02 np0005466012 podman[238110]: 2025-10-02 12:25:02.161594943 +0000 UTC m=+0.079953110 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct  2 08:25:02 np0005466012 nova_compute[192063]: 2025-10-02 12:25:02.883 2 INFO nova.scheduler.client.report [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Deleted allocation for migration ab9c53f2-4424-4021-b1fd-891b3ab4902d#033[00m
Oct  2 08:25:03 np0005466012 podman[238128]: 2025-10-02 12:25:03.170722931 +0000 UTC m=+0.083882503 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm)
Oct  2 08:25:03 np0005466012 nova_compute[192063]: 2025-10-02 12:25:03.438 2 DEBUG oslo_concurrency.lockutils [None req-5102790d-62ed-4b14-a4d2-228bf6b84504 d54b1826121b47caba89932a78c06ccd e564a4cad5d443dba81ec04d2a05ced9 - - default default] Lock "ae56113d-001e-4f10-9236-c07fe5146d9c" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 5.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:04 np0005466012 nova_compute[192063]: 2025-10-02 12:25:04.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:05 np0005466012 nova_compute[192063]: 2025-10-02 12:25:05.512 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:05 np0005466012 nova_compute[192063]: 2025-10-02 12:25:05.512 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:05 np0005466012 nova_compute[192063]: 2025-10-02 12:25:05.822 2 DEBUG nova.compute.manager [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:25:06 np0005466012 nova_compute[192063]: 2025-10-02 12:25:06.386 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:06 np0005466012 nova_compute[192063]: 2025-10-02 12:25:06.386 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:06 np0005466012 nova_compute[192063]: 2025-10-02 12:25:06.393 2 DEBUG nova.virt.hardware [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:25:06 np0005466012 nova_compute[192063]: 2025-10-02 12:25:06.393 2 INFO nova.compute.claims [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:25:06 np0005466012 nova_compute[192063]: 2025-10-02 12:25:06.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:06 np0005466012 nova_compute[192063]: 2025-10-02 12:25:06.970 2 DEBUG nova.compute.provider_tree [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:25:06 np0005466012 nova_compute[192063]: 2025-10-02 12:25:06.988 2 DEBUG nova.scheduler.client.report [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.029 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.029 2 DEBUG nova.compute.manager [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.110 2 DEBUG nova.compute.manager [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.110 2 DEBUG nova.network.neutron [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.215 2 INFO nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.251 2 DEBUG nova.compute.manager [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.425 2 DEBUG nova.policy [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.442 2 DEBUG nova.compute.manager [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.444 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.444 2 INFO nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Creating image(s)#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.445 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "/var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.445 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.446 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.469 2 DEBUG oslo_concurrency.processutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.563 2 DEBUG oslo_concurrency.processutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.564 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.565 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.581 2 DEBUG oslo_concurrency.processutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.647 2 DEBUG oslo_concurrency.processutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.648 2 DEBUG oslo_concurrency.processutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.687 2 DEBUG oslo_concurrency.processutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.688 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.689 2 DEBUG oslo_concurrency.processutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.757 2 DEBUG oslo_concurrency.processutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.760 2 DEBUG nova.virt.disk.api [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.761 2 DEBUG oslo_concurrency.processutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.827 2 DEBUG oslo_concurrency.processutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.828 2 DEBUG nova.virt.disk.api [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Cannot resize image /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.829 2 DEBUG nova.objects.instance [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.847 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.847 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Ensure instance console log exists: /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.847 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.848 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:07 np0005466012 nova_compute[192063]: 2025-10-02 12:25:07.848 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:08.178 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:08.180 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:25:08 np0005466012 nova_compute[192063]: 2025-10-02 12:25:08.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:08 np0005466012 nova_compute[192063]: 2025-10-02 12:25:08.429 2 DEBUG nova.network.neutron [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Successfully created port: 622b7e17-6a86-4876-8e8a-6b40367f483e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:25:09 np0005466012 nova_compute[192063]: 2025-10-02 12:25:09.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:09 np0005466012 nova_compute[192063]: 2025-10-02 12:25:09.339 2 DEBUG nova.network.neutron [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Successfully updated port: 622b7e17-6a86-4876-8e8a-6b40367f483e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:25:09 np0005466012 nova_compute[192063]: 2025-10-02 12:25:09.361 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:09 np0005466012 nova_compute[192063]: 2025-10-02 12:25:09.361 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquired lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:09 np0005466012 nova_compute[192063]: 2025-10-02 12:25:09.361 2 DEBUG nova.network.neutron [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:25:09 np0005466012 nova_compute[192063]: 2025-10-02 12:25:09.427 2 DEBUG nova.compute.manager [req-bd8a4795-4fef-4993-bb29-b7bfb298b02d req-067cb2ec-054b-4df2-9919-46df61bbf8b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Received event network-changed-622b7e17-6a86-4876-8e8a-6b40367f483e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:09 np0005466012 nova_compute[192063]: 2025-10-02 12:25:09.427 2 DEBUG nova.compute.manager [req-bd8a4795-4fef-4993-bb29-b7bfb298b02d req-067cb2ec-054b-4df2-9919-46df61bbf8b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Refreshing instance network info cache due to event network-changed-622b7e17-6a86-4876-8e8a-6b40367f483e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:25:09 np0005466012 nova_compute[192063]: 2025-10-02 12:25:09.428 2 DEBUG oslo_concurrency.lockutils [req-bd8a4795-4fef-4993-bb29-b7bfb298b02d req-067cb2ec-054b-4df2-9919-46df61bbf8b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:09 np0005466012 nova_compute[192063]: 2025-10-02 12:25:09.544 2 DEBUG nova.network.neutron [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:25:10 np0005466012 podman[238163]: 2025-10-02 12:25:10.177035676 +0000 UTC m=+0.087384544 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:25:10 np0005466012 podman[238164]: 2025-10-02 12:25:10.182934834 +0000 UTC m=+0.090435450 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6)
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.534 2 DEBUG nova.network.neutron [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Updating instance_info_cache with network_info: [{"id": "622b7e17-6a86-4876-8e8a-6b40367f483e", "address": "fa:16:3e:bc:63:1b", "network": {"id": "26df2dcf-f57c-4dae-8522-0277df741ed3", "bridge": "br-int", "label": "tempest-network-smoke--1584637508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febc:631b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap622b7e17-6a", "ovs_interfaceid": "622b7e17-6a86-4876-8e8a-6b40367f483e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.574 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Releasing lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.575 2 DEBUG nova.compute.manager [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Instance network_info: |[{"id": "622b7e17-6a86-4876-8e8a-6b40367f483e", "address": "fa:16:3e:bc:63:1b", "network": {"id": "26df2dcf-f57c-4dae-8522-0277df741ed3", "bridge": "br-int", "label": "tempest-network-smoke--1584637508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febc:631b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap622b7e17-6a", "ovs_interfaceid": "622b7e17-6a86-4876-8e8a-6b40367f483e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.575 2 DEBUG oslo_concurrency.lockutils [req-bd8a4795-4fef-4993-bb29-b7bfb298b02d req-067cb2ec-054b-4df2-9919-46df61bbf8b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.576 2 DEBUG nova.network.neutron [req-bd8a4795-4fef-4993-bb29-b7bfb298b02d req-067cb2ec-054b-4df2-9919-46df61bbf8b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Refreshing network info cache for port 622b7e17-6a86-4876-8e8a-6b40367f483e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.580 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Start _get_guest_xml network_info=[{"id": "622b7e17-6a86-4876-8e8a-6b40367f483e", "address": "fa:16:3e:bc:63:1b", "network": {"id": "26df2dcf-f57c-4dae-8522-0277df741ed3", "bridge": "br-int", "label": "tempest-network-smoke--1584637508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febc:631b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap622b7e17-6a", "ovs_interfaceid": "622b7e17-6a86-4876-8e8a-6b40367f483e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.585 2 WARNING nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.590 2 DEBUG nova.virt.libvirt.host [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.591 2 DEBUG nova.virt.libvirt.host [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.597 2 DEBUG nova.virt.libvirt.host [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.598 2 DEBUG nova.virt.libvirt.host [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.600 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.601 2 DEBUG nova.virt.hardware [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.602 2 DEBUG nova.virt.hardware [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.602 2 DEBUG nova.virt.hardware [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.603 2 DEBUG nova.virt.hardware [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.603 2 DEBUG nova.virt.hardware [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.603 2 DEBUG nova.virt.hardware [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.604 2 DEBUG nova.virt.hardware [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.604 2 DEBUG nova.virt.hardware [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.605 2 DEBUG nova.virt.hardware [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.605 2 DEBUG nova.virt.hardware [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.606 2 DEBUG nova.virt.hardware [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.612 2 DEBUG nova.virt.libvirt.vif [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:25:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1098429742',display_name='tempest-TestGettingAddress-server-1098429742',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1098429742',id=120,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN27qqZO7DS6SotTIkgadWOrlyFzalcMBya6l3P3FHA92Trdk8QzNk/bIfeVZHQyyH9bzXdJACR3sdrkH4czxiQm1W3dnbgCG/vLQtAxveP29c1TkzsAJfjG23nfB+bI6Q==',key_name='tempest-TestGettingAddress-794970227',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-2b2xc028',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:25:07Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=97dd79e2-9bf5-47c4-8c3f-fa70335c3d37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "622b7e17-6a86-4876-8e8a-6b40367f483e", "address": "fa:16:3e:bc:63:1b", "network": {"id": "26df2dcf-f57c-4dae-8522-0277df741ed3", "bridge": "br-int", "label": "tempest-network-smoke--1584637508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febc:631b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap622b7e17-6a", "ovs_interfaceid": "622b7e17-6a86-4876-8e8a-6b40367f483e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.613 2 DEBUG nova.network.os_vif_util [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "622b7e17-6a86-4876-8e8a-6b40367f483e", "address": "fa:16:3e:bc:63:1b", "network": {"id": "26df2dcf-f57c-4dae-8522-0277df741ed3", "bridge": "br-int", "label": "tempest-network-smoke--1584637508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febc:631b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap622b7e17-6a", "ovs_interfaceid": "622b7e17-6a86-4876-8e8a-6b40367f483e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.615 2 DEBUG nova.network.os_vif_util [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:63:1b,bridge_name='br-int',has_traffic_filtering=True,id=622b7e17-6a86-4876-8e8a-6b40367f483e,network=Network(26df2dcf-f57c-4dae-8522-0277df741ed3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap622b7e17-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.617 2 DEBUG nova.objects.instance [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.642 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  <uuid>97dd79e2-9bf5-47c4-8c3f-fa70335c3d37</uuid>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  <name>instance-00000078</name>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestGettingAddress-server-1098429742</nova:name>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:25:11</nova:creationTime>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:25:11 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:        <nova:user uuid="97ce9f1898484e0e9a1f7c84a9f0dfe3">tempest-TestGettingAddress-1355720650-project-member</nova:user>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:        <nova:project uuid="fd801958556f4c8aab047ecdef6b5ee8">tempest-TestGettingAddress-1355720650</nova:project>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:        <nova:port uuid="622b7e17-6a86-4876-8e8a-6b40367f483e">
Oct  2 08:25:11 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:febc:631b" ipVersion="6"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <entry name="serial">97dd79e2-9bf5-47c4-8c3f-fa70335c3d37</entry>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <entry name="uuid">97dd79e2-9bf5-47c4-8c3f-fa70335c3d37</entry>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.config"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:bc:63:1b"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <target dev="tap622b7e17-6a"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/console.log" append="off"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:25:11 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:25:11 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:25:11 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:25:11 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.644 2 DEBUG nova.compute.manager [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Preparing to wait for external event network-vif-plugged-622b7e17-6a86-4876-8e8a-6b40367f483e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.644 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.645 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.645 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.647 2 DEBUG nova.virt.libvirt.vif [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:25:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1098429742',display_name='tempest-TestGettingAddress-server-1098429742',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1098429742',id=120,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN27qqZO7DS6SotTIkgadWOrlyFzalcMBya6l3P3FHA92Trdk8QzNk/bIfeVZHQyyH9bzXdJACR3sdrkH4czxiQm1W3dnbgCG/vLQtAxveP29c1TkzsAJfjG23nfB+bI6Q==',key_name='tempest-TestGettingAddress-794970227',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-2b2xc028',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:25:07Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=97dd79e2-9bf5-47c4-8c3f-fa70335c3d37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "622b7e17-6a86-4876-8e8a-6b40367f483e", "address": "fa:16:3e:bc:63:1b", "network": {"id": "26df2dcf-f57c-4dae-8522-0277df741ed3", "bridge": "br-int", "label": "tempest-network-smoke--1584637508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febc:631b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap622b7e17-6a", "ovs_interfaceid": "622b7e17-6a86-4876-8e8a-6b40367f483e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.648 2 DEBUG nova.network.os_vif_util [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "622b7e17-6a86-4876-8e8a-6b40367f483e", "address": "fa:16:3e:bc:63:1b", "network": {"id": "26df2dcf-f57c-4dae-8522-0277df741ed3", "bridge": "br-int", "label": "tempest-network-smoke--1584637508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febc:631b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap622b7e17-6a", "ovs_interfaceid": "622b7e17-6a86-4876-8e8a-6b40367f483e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.649 2 DEBUG nova.network.os_vif_util [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:63:1b,bridge_name='br-int',has_traffic_filtering=True,id=622b7e17-6a86-4876-8e8a-6b40367f483e,network=Network(26df2dcf-f57c-4dae-8522-0277df741ed3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap622b7e17-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.650 2 DEBUG os_vif [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:63:1b,bridge_name='br-int',has_traffic_filtering=True,id=622b7e17-6a86-4876-8e8a-6b40367f483e,network=Network(26df2dcf-f57c-4dae-8522-0277df741ed3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap622b7e17-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.652 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.652 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.657 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap622b7e17-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.657 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap622b7e17-6a, col_values=(('external_ids', {'iface-id': '622b7e17-6a86-4876-8e8a-6b40367f483e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bc:63:1b', 'vm-uuid': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:25:11 np0005466012 NetworkManager[51207]: <info>  [1759407911.6626] manager: (tap622b7e17-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/199)
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.669 2 INFO os_vif [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:63:1b,bridge_name='br-int',has_traffic_filtering=True,id=622b7e17-6a86-4876-8e8a-6b40367f483e,network=Network(26df2dcf-f57c-4dae-8522-0277df741ed3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap622b7e17-6a')#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.771 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.771 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.772 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:bc:63:1b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.773 2 INFO nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Using config drive#033[00m
Oct  2 08:25:11 np0005466012 nova_compute[192063]: 2025-10-02 12:25:11.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:12 np0005466012 nova_compute[192063]: 2025-10-02 12:25:12.636 2 INFO nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Creating config drive at /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.config#033[00m
Oct  2 08:25:12 np0005466012 nova_compute[192063]: 2025-10-02 12:25:12.641 2 DEBUG oslo_concurrency.processutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsjfd6pes execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:12 np0005466012 nova_compute[192063]: 2025-10-02 12:25:12.764 2 DEBUG oslo_concurrency.processutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsjfd6pes" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:12 np0005466012 kernel: tap622b7e17-6a: entered promiscuous mode
Oct  2 08:25:12 np0005466012 NetworkManager[51207]: <info>  [1759407912.8442] manager: (tap622b7e17-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/200)
Oct  2 08:25:12 np0005466012 nova_compute[192063]: 2025-10-02 12:25:12.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:12 np0005466012 nova_compute[192063]: 2025-10-02 12:25:12.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:12 np0005466012 ovn_controller[94284]: 2025-10-02T12:25:12Z|00448|binding|INFO|Claiming lport 622b7e17-6a86-4876-8e8a-6b40367f483e for this chassis.
Oct  2 08:25:12 np0005466012 ovn_controller[94284]: 2025-10-02T12:25:12Z|00449|binding|INFO|622b7e17-6a86-4876-8e8a-6b40367f483e: Claiming fa:16:3e:bc:63:1b 10.100.0.5 2001:db8::f816:3eff:febc:631b
Oct  2 08:25:12 np0005466012 systemd-machined[152114]: New machine qemu-55-instance-00000078.
Oct  2 08:25:12 np0005466012 systemd-udevd[238225]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:25:12 np0005466012 NetworkManager[51207]: <info>  [1759407912.9303] device (tap622b7e17-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:25:12 np0005466012 NetworkManager[51207]: <info>  [1759407912.9312] device (tap622b7e17-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:25:12 np0005466012 nova_compute[192063]: 2025-10-02 12:25:12.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:12 np0005466012 ovn_controller[94284]: 2025-10-02T12:25:12Z|00450|binding|INFO|Setting lport 622b7e17-6a86-4876-8e8a-6b40367f483e ovn-installed in OVS
Oct  2 08:25:12 np0005466012 ovn_controller[94284]: 2025-10-02T12:25:12Z|00451|binding|INFO|Setting lport 622b7e17-6a86-4876-8e8a-6b40367f483e up in Southbound
Oct  2 08:25:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:12.941 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:63:1b 10.100.0.5 2001:db8::f816:3eff:febc:631b'], port_security=['fa:16:3e:bc:63:1b 10.100.0.5 2001:db8::f816:3eff:febc:631b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8::f816:3eff:febc:631b/64', 'neutron:device_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-26df2dcf-f57c-4dae-8522-0277df741ed3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2c57d713-64e3-4621-a624-32092d283319', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e2784fb0-50ac-4c91-ba90-3b5c38b8adf4, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=622b7e17-6a86-4876-8e8a-6b40367f483e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:25:12 np0005466012 systemd[1]: Started Virtual Machine qemu-55-instance-00000078.
Oct  2 08:25:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:12.942 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 622b7e17-6a86-4876-8e8a-6b40367f483e in datapath 26df2dcf-f57c-4dae-8522-0277df741ed3 bound to our chassis#033[00m
Oct  2 08:25:12 np0005466012 nova_compute[192063]: 2025-10-02 12:25:12.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:12.945 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 26df2dcf-f57c-4dae-8522-0277df741ed3#033[00m
Oct  2 08:25:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:12.959 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[78bfa2e2-87fb-437c-876e-af3a81caae16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:12.960 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap26df2dcf-f1 in ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:25:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:12.961 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap26df2dcf-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:25:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:12.962 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0161cefe-f52b-47ac-a38c-d10ef2f7565f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:12.962 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[90aabac2-aa5a-49ee-82e6-7e33bbbfabd7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:12.975 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[025aff0e-e3d9-4285-999d-adfe5784f560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.002 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3c2c3bfc-7df7-4d75-9103-3a5c7edcc947]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.031 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[85126fb4-4b42-4287-88ac-d43334c11a2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:13 np0005466012 NetworkManager[51207]: <info>  [1759407913.0366] manager: (tap26df2dcf-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/201)
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.036 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3fd96961-e8be-45ee-bcb0-0fdc210f9e33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:13 np0005466012 systemd-udevd[238227]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.066 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[f62616f1-7922-4b62-88fd-a1cf79b53e52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.070 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[66dd43ca-4e73-42d0-85b1-38db4a7c0b56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:13 np0005466012 NetworkManager[51207]: <info>  [1759407913.0915] device (tap26df2dcf-f0): carrier: link connected
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.098 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[49d40ec0-098f-4832-9168-f3017c02aad9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.115 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d83eb7ad-8747-4713-9f72-7748f84a3c82]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap26df2dcf-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:74:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590670, 'reachable_time': 20863, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238258, 'error': None, 'target': 'ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.132 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ba4d94ee-2864-462a-bcc0-fbc336b47428]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:746f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590670, 'tstamp': 590670}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238261, 'error': None, 'target': 'ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.150 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f0806979-5e89-4e3a-9d48-72f0e0fdaab5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap26df2dcf-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:74:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590670, 'reachable_time': 20863, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238266, 'error': None, 'target': 'ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.179 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[40b7aa7f-4022-4bad-a869-3c9f4b45441a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.236 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d32149cd-624b-4924-8fd6-07ddec226d23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.237 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap26df2dcf-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.237 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.238 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap26df2dcf-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:13 np0005466012 NetworkManager[51207]: <info>  [1759407913.2408] manager: (tap26df2dcf-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Oct  2 08:25:13 np0005466012 kernel: tap26df2dcf-f0: entered promiscuous mode
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.247 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap26df2dcf-f0, col_values=(('external_ids', {'iface-id': 'adc60e93-14bb-4eb4-8a79-15dda196dc01'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:13 np0005466012 ovn_controller[94284]: 2025-10-02T12:25:13Z|00452|binding|INFO|Releasing lport adc60e93-14bb-4eb4-8a79-15dda196dc01 from this chassis (sb_readonly=0)
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.251 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/26df2dcf-f57c-4dae-8522-0277df741ed3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/26df2dcf-f57c-4dae-8522-0277df741ed3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.252 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d55a56dc-85a2-4a1d-a835-d5be7b10436b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.253 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-26df2dcf-f57c-4dae-8522-0277df741ed3
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/26df2dcf-f57c-4dae-8522-0277df741ed3.pid.haproxy
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 26df2dcf-f57c-4dae-8522-0277df741ed3
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:25:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:13.254 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3', 'env', 'PROCESS_TAG=haproxy-26df2dcf-f57c-4dae-8522-0277df741ed3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/26df2dcf-f57c-4dae-8522-0277df741ed3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.261 2 DEBUG nova.compute.manager [req-bc057151-057d-4984-9987-8d6471b48418 req-dcdbcfbf-6bf2-4d41-9836-3e25056c0a5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Received event network-vif-plugged-622b7e17-6a86-4876-8e8a-6b40367f483e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.262 2 DEBUG oslo_concurrency.lockutils [req-bc057151-057d-4984-9987-8d6471b48418 req-dcdbcfbf-6bf2-4d41-9836-3e25056c0a5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.263 2 DEBUG oslo_concurrency.lockutils [req-bc057151-057d-4984-9987-8d6471b48418 req-dcdbcfbf-6bf2-4d41-9836-3e25056c0a5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.264 2 DEBUG oslo_concurrency.lockutils [req-bc057151-057d-4984-9987-8d6471b48418 req-dcdbcfbf-6bf2-4d41-9836-3e25056c0a5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.264 2 DEBUG nova.compute.manager [req-bc057151-057d-4984-9987-8d6471b48418 req-dcdbcfbf-6bf2-4d41-9836-3e25056c0a5c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Processing event network-vif-plugged-622b7e17-6a86-4876-8e8a-6b40367f483e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:13 np0005466012 podman[238298]: 2025-10-02 12:25:13.685132074 +0000 UTC m=+0.065474158 container create 307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.706 2 DEBUG nova.compute.manager [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:25:13 np0005466012 systemd[1]: Started libpod-conmon-307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9.scope.
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.711 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407913.7113247, 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.712 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] VM Started (Lifecycle Event)#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.714 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.722 2 INFO nova.virt.libvirt.driver [-] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Instance spawned successfully.#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.723 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.736 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:13 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:25:13 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5bb00d145ae25c06a3fcc867eafb665e3abc7da065d3e2415c3841c7e929eaee/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.748 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:13 np0005466012 podman[238298]: 2025-10-02 12:25:13.659477803 +0000 UTC m=+0.039819897 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:25:13 np0005466012 podman[238298]: 2025-10-02 12:25:13.758151287 +0000 UTC m=+0.138493381 container init 307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.762 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.763 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.764 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.764 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:13 np0005466012 podman[238298]: 2025-10-02 12:25:13.764847357 +0000 UTC m=+0.145189451 container start 307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.764 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.765 2 DEBUG nova.virt.libvirt.driver [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.770 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.770 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407913.7208543, 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.770 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:25:13 np0005466012 podman[238311]: 2025-10-02 12:25:13.776948403 +0000 UTC m=+0.068025121 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:25:13 np0005466012 neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3[238325]: [NOTICE]   (238353) : New worker (238359) forked
Oct  2 08:25:13 np0005466012 neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3[238325]: [NOTICE]   (238353) : Loading success.
Oct  2 08:25:13 np0005466012 podman[238312]: 2025-10-02 12:25:13.798104016 +0000 UTC m=+0.086396515 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.820 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.824 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407913.7209804, 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.824 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.867 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.870 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.910 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.912 2 INFO nova.compute.manager [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Took 6.47 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.913 2 DEBUG nova.compute.manager [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:13 np0005466012 nova_compute[192063]: 2025-10-02 12:25:13.997 2 INFO nova.compute.manager [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Took 7.78 seconds to build instance.#033[00m
Oct  2 08:25:14 np0005466012 nova_compute[192063]: 2025-10-02 12:25:14.158 2 DEBUG oslo_concurrency.lockutils [None req-351318b7-b414-4d9f-a1ef-f000e1ca5f3a 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:14 np0005466012 nova_compute[192063]: 2025-10-02 12:25:14.385 2 DEBUG nova.network.neutron [req-bd8a4795-4fef-4993-bb29-b7bfb298b02d req-067cb2ec-054b-4df2-9919-46df61bbf8b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Updated VIF entry in instance network info cache for port 622b7e17-6a86-4876-8e8a-6b40367f483e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:25:14 np0005466012 nova_compute[192063]: 2025-10-02 12:25:14.385 2 DEBUG nova.network.neutron [req-bd8a4795-4fef-4993-bb29-b7bfb298b02d req-067cb2ec-054b-4df2-9919-46df61bbf8b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Updating instance_info_cache with network_info: [{"id": "622b7e17-6a86-4876-8e8a-6b40367f483e", "address": "fa:16:3e:bc:63:1b", "network": {"id": "26df2dcf-f57c-4dae-8522-0277df741ed3", "bridge": "br-int", "label": "tempest-network-smoke--1584637508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febc:631b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap622b7e17-6a", "ovs_interfaceid": "622b7e17-6a86-4876-8e8a-6b40367f483e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:14 np0005466012 nova_compute[192063]: 2025-10-02 12:25:14.404 2 DEBUG oslo_concurrency.lockutils [req-bd8a4795-4fef-4993-bb29-b7bfb298b02d req-067cb2ec-054b-4df2-9919-46df61bbf8b7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:15 np0005466012 nova_compute[192063]: 2025-10-02 12:25:15.381 2 DEBUG nova.compute.manager [req-7df30455-bc45-419a-9064-ff4796b055a1 req-6d951af1-4497-4df5-848e-6dbedf21cf35 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Received event network-vif-plugged-622b7e17-6a86-4876-8e8a-6b40367f483e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:25:15 np0005466012 nova_compute[192063]: 2025-10-02 12:25:15.382 2 DEBUG oslo_concurrency.lockutils [req-7df30455-bc45-419a-9064-ff4796b055a1 req-6d951af1-4497-4df5-848e-6dbedf21cf35 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:25:15 np0005466012 nova_compute[192063]: 2025-10-02 12:25:15.382 2 DEBUG oslo_concurrency.lockutils [req-7df30455-bc45-419a-9064-ff4796b055a1 req-6d951af1-4497-4df5-848e-6dbedf21cf35 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:25:15 np0005466012 nova_compute[192063]: 2025-10-02 12:25:15.382 2 DEBUG oslo_concurrency.lockutils [req-7df30455-bc45-419a-9064-ff4796b055a1 req-6d951af1-4497-4df5-848e-6dbedf21cf35 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:25:15 np0005466012 nova_compute[192063]: 2025-10-02 12:25:15.382 2 DEBUG nova.compute.manager [req-7df30455-bc45-419a-9064-ff4796b055a1 req-6d951af1-4497-4df5-848e-6dbedf21cf35 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] No waiting events found dispatching network-vif-plugged-622b7e17-6a86-4876-8e8a-6b40367f483e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:25:15 np0005466012 nova_compute[192063]: 2025-10-02 12:25:15.383 2 WARNING nova.compute.manager [req-7df30455-bc45-419a-9064-ff4796b055a1 req-6d951af1-4497-4df5-848e-6dbedf21cf35 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Received unexpected event network-vif-plugged-622b7e17-6a86-4876-8e8a-6b40367f483e for instance with vm_state active and task_state None.
Oct  2 08:25:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:25:16.183 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:25:16 np0005466012 nova_compute[192063]: 2025-10-02 12:25:16.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.925 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'name': 'tempest-TestGettingAddress-server-1098429742', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000078', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'hostId': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.928 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37 / tap622b7e17-6a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.928 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '264dfc3b-9ee5-47e0-a89c-41c199e4f78a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000078-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-tap622b7e17-6a', 'timestamp': '2025-10-02T12:25:16.926202', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'tap622b7e17-6a', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:63:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap622b7e17-6a'}, 'message_id': 'da8a6654-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.60255573, 'message_signature': '2b1392beed40962e30ca3ffb71ca98039e94a725eb77f87b69e6a15f6e9a1c79'}]}, 'timestamp': '2025-10-02 12:25:16.929361', '_unique_id': '0c00efc0049f470c86e9dc300a2a96de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.930 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.931 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.931 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1098429742>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1098429742>]
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:25:16 np0005466012 nova_compute[192063]: 2025-10-02 12:25:16.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.951 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.952 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09576025-c503-4e14-b178-6e1903ca7cfd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-vda', 'timestamp': '2025-10-02T12:25:16.931959', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da8dedd8-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.608331124, 'message_signature': 'b30de830f522bc092868c7cda5434a41674c77ef636c482dd8bef6fe3faaa44f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-sda', 'timestamp': '2025-10-02T12:25:16.931959', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da8dfabc-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.608331124, 'message_signature': '3142f17bb3cac6246c63957b912a397e3256de3d58e56966695b17c14e7668a5'}]}, 'timestamp': '2025-10-02 12:25:16.952779', '_unique_id': '1a726dce6a84429699b0699d11c8a07a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.953 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.955 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.955 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.read.latency volume: 957395990 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.956 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.read.latency volume: 789752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4b80d12-5248-4977-96e3-6c6ec68421be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 957395990, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-vda', 'timestamp': '2025-10-02T12:25:16.955557', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da8e84c8-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.608331124, 'message_signature': 'f709fb587434f0ca97f750c01247dc8245a9ebd2129e5aba47cbfb9fea5d6d0c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 789752, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-sda', 'timestamp': '2025-10-02T12:25:16.955557', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da8e9ba2-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.608331124, 'message_signature': 'e2cba7e6e00b020c95710f7ecfbbcfea2eb3b5b9407725e109406872c6f04620'}]}, 'timestamp': '2025-10-02 12:25:16.957103', '_unique_id': '15c7280d41bf4ac783a1a5778a42f64e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.958 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.960 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.960 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a485c404-b68d-4259-911e-26f1561a447e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000078-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-tap622b7e17-6a', 'timestamp': '2025-10-02T12:25:16.960344', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'tap622b7e17-6a', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:63:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap622b7e17-6a'}, 'message_id': 'da8f2e64-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.60255573, 'message_signature': 'b75da3d7d42a10fed4a6e7606ecd7bd6a1eec12e91b2c48b395ad531bd2d7fbf'}]}, 'timestamp': '2025-10-02 12:25:16.960639', '_unique_id': '01be69975799447982e9fdf1b9847848'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.961 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.962 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ce9467c-1f09-4f6f-b95b-ad388d1f53e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000078-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-tap622b7e17-6a', 'timestamp': '2025-10-02T12:25:16.962280', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'tap622b7e17-6a', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:63:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap622b7e17-6a'}, 'message_id': 'da8f79a0-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.60255573, 'message_signature': '7b110e1d7df6b476c144b86a0fca8482e2f7e741a07c7f0ed01f806afaede3e7'}]}, 'timestamp': '2025-10-02 12:25:16.962577', '_unique_id': '07c13d1b8fa7425cad1b5e79e4bbfb3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.963 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.965 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.978 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.979 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d9e2bb7-0974-478c-8a1d-b7e546e70cc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-vda', 'timestamp': '2025-10-02T12:25:16.965671', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9206e8-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.642147578, 'message_signature': '726d6f3768e76b561d1897b870e528f9eafc4c0f34bbbc371192eb8dea73a9cd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-sda', 'timestamp': '2025-10-02T12:25:16.965671', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da922510-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.642147578, 'message_signature': 'e676fabb41fc6ea4b18d2f485dbc7b8a6714e888593ca6afc778ff9ccf5cccb0'}]}, 'timestamp': '2025-10-02 12:25:16.980232', '_unique_id': '3e9c39a7a0ab4d45a7756a779e9f4c72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.981 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.982 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.983 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.983 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43ceac59-a379-4c9d-9cc4-d4af3c835d1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-vda', 'timestamp': '2025-10-02T12:25:16.983231', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da92b6ba-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.642147578, 'message_signature': '77160125a18afc2105069cc482679813b25724c69724772c5dc22379465095d7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-sda', 'timestamp': '2025-10-02T12:25:16.983231', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da92cd58-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.642147578, 'message_signature': '656a6a3efc97f47fdd02ec5106e93615d91e611b93cad06824f8e5e575dddb30'}]}, 'timestamp': '2025-10-02 12:25:16.984573', '_unique_id': '46c5b09d979042869e5db767034087dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.986 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.987 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3514b5cf-4fa7-4a72-958d-1f27cff2a89a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000078-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-tap622b7e17-6a', 'timestamp': '2025-10-02T12:25:16.987055', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'tap622b7e17-6a', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:63:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap622b7e17-6a'}, 'message_id': 'da934116-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.60255573, 'message_signature': 'd7eae59e567f4c1cf8cd838fea3390322108a9978ae31497e5f73be737ae34f5'}]}, 'timestamp': '2025-10-02 12:25:16.987324', '_unique_id': 'caa57ff7a6254e779f27292369586c88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.988 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.990 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.990 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fdeee65-d010-452f-864c-753fc27ab582', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-vda', 'timestamp': '2025-10-02T12:25:16.990478', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da93cf78-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.608331124, 'message_signature': 'b282070f33925c4c997f548385f1bed0a068caf3ecf15250415b3328e39758f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-sda', 'timestamp': '2025-10-02T12:25:16.990478', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da93dc66-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.608331124, 'message_signature': 'c9d577d4957a17427a375ae1aef3fe5c72306c095df1246f0f5218c394a2e31e'}]}, 'timestamp': '2025-10-02 12:25:16.991288', '_unique_id': '5e42c64a2b7d4d8ab088999f043438fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.993 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.993 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7445cddf-91cb-44c6-8061-80fa8919763a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-vda', 'timestamp': '2025-10-02T12:25:16.992986', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da943526-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.608331124, 'message_signature': '5a976ac5be851d584e703de4c1f24b3303368be042cdd5aa4a854f7f10ccdc6e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-sda', 'timestamp': '2025-10-02T12:25:16.992986', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da945632-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.608331124, 'message_signature': 'c1d4c8b5b0f7df97d21fc0d3aed24b795a9ff6402951a9d6f3110f4f29acb395'}]}, 'timestamp': '2025-10-02 12:25:16.994627', '_unique_id': 'b02f45f940c54f0eb419a8b823389bcd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e9ec177-2794-4656-9b30-3db391004fb3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000078-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-tap622b7e17-6a', 'timestamp': '2025-10-02T12:25:16.997084', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'tap622b7e17-6a', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:63:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap622b7e17-6a'}, 'message_id': 'da94c900-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.60255573, 'message_signature': '500d537d0f51c535d27167fd51d5ede4076be0a9559d19c5450532f00062468a'}]}, 'timestamp': '2025-10-02 12:25:16.997386', '_unique_id': '5a260f4277e047f29058237c3b4514b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:16.997 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.000 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.000 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.000 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1098429742>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1098429742>]
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.001 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.023 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/cpu volume: 3170000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b7f80c6-dd3c-4126-b8ab-0e6f5443ccd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3170000000, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'timestamp': '2025-10-02T12:25:17.001787', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da98cd66-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.699404591, 'message_signature': '56b2aff755fb45914a667ac06afabc1fc5035add972c2793ea12f32ad6d68aa6'}]}, 'timestamp': '2025-10-02 12:25:17.023770', '_unique_id': '7a44f0d02eb34bf6a1862df08f5f75f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.024 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.025 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.025 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.025 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c78b495-a88c-4c35-947a-ac7cfd1ed75d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-vda', 'timestamp': '2025-10-02T12:25:17.025592', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9923ba-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.608331124, 'message_signature': '2fc821b1515fe94374d7caf9feb0149b96dbe0975190669f7bac97b98c14fc16'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-sda', 'timestamp': '2025-10-02T12:25:17.025592', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da99351c-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.608331124, 'message_signature': '7f6eba2760e1ecb46563a95a8c47f397850893632c55c2096405067862e17877'}]}, 'timestamp': '2025-10-02 12:25:17.026805', '_unique_id': 'aa799fbd505545fdae2afbfb8a93aeb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.028 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.029 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.029 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.029 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.029 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.030 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.030 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1098429742>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1098429742>]
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.030 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.030 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd670bc6f-e726-44eb-b5d2-d473067827be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000078-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-tap622b7e17-6a', 'timestamp': '2025-10-02T12:25:17.030468', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'tap622b7e17-6a', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:63:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap622b7e17-6a'}, 'message_id': 'da99e0ac-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.60255573, 'message_signature': 'eb31e6bbd805b4c839632dec302b9711c82a6f1ba5957fc25755df49ecc061f3'}]}, 'timestamp': '2025-10-02 12:25:17.030768', '_unique_id': 'efe3be896999402cbb3f520846166a44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.031 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.032 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.032 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '926cdc4e-48fd-4516-b73b-bded84624465', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000078-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-tap622b7e17-6a', 'timestamp': '2025-10-02T12:25:17.032409', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'tap622b7e17-6a', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:63:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap622b7e17-6a'}, 'message_id': 'da9a2d6e-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.60255573, 'message_signature': '265865d18a01811b2001a41aca9c4a376c342d13c8976ef949b483483e39cf12'}]}, 'timestamp': '2025-10-02 12:25:17.032733', '_unique_id': 'd958face273d4545a6140dccf7c680d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.033 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.034 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.034 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.034 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1098429742>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1098429742>]
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.034 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.034 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.035 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '008a0873-0b20-4ce2-989e-62c707bf52eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-vda', 'timestamp': '2025-10-02T12:25:17.034816', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9a8bc4-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.608331124, 'message_signature': 'dbfdc779927b30c271fe7a8e016e0e5ca4ba37f53a3e6e77b16c172bd9b70958'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-sda', 'timestamp': '2025-10-02T12:25:17.034816', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9a96a0-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.608331124, 'message_signature': 'e6d8ec82f5bf26bd2e0646ef6f3b8cc774bcc65a05a40a65e8cc4b8b744abdf2'}]}, 'timestamp': '2025-10-02 12:25:17.035379', '_unique_id': '2113d572b785427bb1fe95f5b6bb597d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.038 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bdeeedd2-0ea0-4c47-ab14-48ccae88909a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000078-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-tap622b7e17-6a', 'timestamp': '2025-10-02T12:25:17.039100', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'tap622b7e17-6a', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:63:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap622b7e17-6a'}, 'message_id': 'da9b31f0-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.60255573, 'message_signature': 'cf8a3408f66042161545bb667213a63d85e79d2e7db7091df3dc7cdc3b73b265'}]}, 'timestamp': '2025-10-02 12:25:17.039364', '_unique_id': '971c95c2d5f74653a3851da62c073d0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.039 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.040 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.040 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.041 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '263c35df-4864-4c32-b4f0-0d9309d1762e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-vda', 'timestamp': '2025-10-02T12:25:17.040953', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9b7a34-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.642147578, 'message_signature': 'cea8047f7ddd3d28165f19834660d904999db8547793257c2dec8d91355ff628'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-sda', 'timestamp': '2025-10-02T12:25:17.040953', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'instance-00000078', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9b85ba-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.642147578, 'message_signature': '09cb28d59cb98d1da75f6c47c73df185450036bad80a370a8cffe459cbfb303e'}]}, 'timestamp': '2025-10-02 12:25:17.041532', '_unique_id': '2a31fe31705947729e4171e1373a9bd3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.042 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c6ee0de-e8c1-4687-8e08-84a28c0e5743', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000078-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-tap622b7e17-6a', 'timestamp': '2025-10-02T12:25:17.043143', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'tap622b7e17-6a', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:63:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap622b7e17-6a'}, 'message_id': 'da9bd0b0-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.60255573, 'message_signature': '55ab8a99868955a233fe6ff8b5a701bef73f2a6b362bc7a041a36e6508e77649'}]}, 'timestamp': '2025-10-02 12:25:17.043450', '_unique_id': 'b194e888fcf54a8491991b1215b2b947'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.043 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.044 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.044 12 DEBUG ceilometer.compute.pollsters [-] 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a461dcfa-7ab1-40d0-a1ef-03e28fcd334e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000078-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-tap622b7e17-6a', 'timestamp': '2025-10-02T12:25:17.044965', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1098429742', 'name': 'tap622b7e17-6a', 'instance_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:63:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap622b7e17-6a'}, 'message_id': 'da9c17dc-9f8a-11f0-b6ee-fa163e01ba27', 'monotonic_time': 5910.60255573, 'message_signature': '7bc5396b81638d230ab5ca867aa0df1434e99624b47e25e9465fc1e81b77579f'}]}, 'timestamp': '2025-10-02 12:25:17.045250', '_unique_id': '9a0eebed866c47c7b2b81d72e0ce40bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:25:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:25:17.045 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:25:19 np0005466012 NetworkManager[51207]: <info>  [1759407919.1706] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Oct  2 08:25:19 np0005466012 NetworkManager[51207]: <info>  [1759407919.1716] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/204)
Oct  2 08:25:19 np0005466012 nova_compute[192063]: 2025-10-02 12:25:19.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:19 np0005466012 nova_compute[192063]: 2025-10-02 12:25:19.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:25:19Z|00453|binding|INFO|Releasing lport adc60e93-14bb-4eb4-8a79-15dda196dc01 from this chassis (sb_readonly=0)
Oct  2 08:25:19 np0005466012 nova_compute[192063]: 2025-10-02 12:25:19.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:19 np0005466012 nova_compute[192063]: 2025-10-02 12:25:19.496 2 DEBUG nova.compute.manager [req-6d943927-540e-4f8d-9c3b-504c952a897e req-68ee9e26-a623-44c4-8c09-85d5626d6efe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Received event network-changed-622b7e17-6a86-4876-8e8a-6b40367f483e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:19 np0005466012 nova_compute[192063]: 2025-10-02 12:25:19.496 2 DEBUG nova.compute.manager [req-6d943927-540e-4f8d-9c3b-504c952a897e req-68ee9e26-a623-44c4-8c09-85d5626d6efe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Refreshing instance network info cache due to event network-changed-622b7e17-6a86-4876-8e8a-6b40367f483e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:25:19 np0005466012 nova_compute[192063]: 2025-10-02 12:25:19.496 2 DEBUG oslo_concurrency.lockutils [req-6d943927-540e-4f8d-9c3b-504c952a897e req-68ee9e26-a623-44c4-8c09-85d5626d6efe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:19 np0005466012 nova_compute[192063]: 2025-10-02 12:25:19.496 2 DEBUG oslo_concurrency.lockutils [req-6d943927-540e-4f8d-9c3b-504c952a897e req-68ee9e26-a623-44c4-8c09-85d5626d6efe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:19 np0005466012 nova_compute[192063]: 2025-10-02 12:25:19.496 2 DEBUG nova.network.neutron [req-6d943927-540e-4f8d-9c3b-504c952a897e req-68ee9e26-a623-44c4-8c09-85d5626d6efe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Refreshing network info cache for port 622b7e17-6a86-4876-8e8a-6b40367f483e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:25:21 np0005466012 nova_compute[192063]: 2025-10-02 12:25:21.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:21 np0005466012 nova_compute[192063]: 2025-10-02 12:25:21.887 2 DEBUG nova.network.neutron [req-6d943927-540e-4f8d-9c3b-504c952a897e req-68ee9e26-a623-44c4-8c09-85d5626d6efe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Updated VIF entry in instance network info cache for port 622b7e17-6a86-4876-8e8a-6b40367f483e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:25:21 np0005466012 nova_compute[192063]: 2025-10-02 12:25:21.887 2 DEBUG nova.network.neutron [req-6d943927-540e-4f8d-9c3b-504c952a897e req-68ee9e26-a623-44c4-8c09-85d5626d6efe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Updating instance_info_cache with network_info: [{"id": "622b7e17-6a86-4876-8e8a-6b40367f483e", "address": "fa:16:3e:bc:63:1b", "network": {"id": "26df2dcf-f57c-4dae-8522-0277df741ed3", "bridge": "br-int", "label": "tempest-network-smoke--1584637508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febc:631b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap622b7e17-6a", "ovs_interfaceid": "622b7e17-6a86-4876-8e8a-6b40367f483e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:21 np0005466012 nova_compute[192063]: 2025-10-02 12:25:21.927 2 DEBUG oslo_concurrency.lockutils [req-6d943927-540e-4f8d-9c3b-504c952a897e req-68ee9e26-a623-44c4-8c09-85d5626d6efe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:21 np0005466012 nova_compute[192063]: 2025-10-02 12:25:21.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:25 np0005466012 nova_compute[192063]: 2025-10-02 12:25:25.917 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:26 np0005466012 nova_compute[192063]: 2025-10-02 12:25:26.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:26 np0005466012 nova_compute[192063]: 2025-10-02 12:25:26.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:25:27Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bc:63:1b 10.100.0.5
Oct  2 08:25:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:25:27Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bc:63:1b 10.100.0.5
Oct  2 08:25:28 np0005466012 nova_compute[192063]: 2025-10-02 12:25:28.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:28 np0005466012 nova_compute[192063]: 2025-10-02 12:25:28.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:29 np0005466012 podman[238384]: 2025-10-02 12:25:29.162199076 +0000 UTC m=+0.067688582 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:25:29 np0005466012 podman[238385]: 2025-10-02 12:25:29.225614174 +0000 UTC m=+0.120632711 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:25:29 np0005466012 nova_compute[192063]: 2025-10-02 12:25:29.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:31 np0005466012 nova_compute[192063]: 2025-10-02 12:25:31.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:31 np0005466012 nova_compute[192063]: 2025-10-02 12:25:31.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:31 np0005466012 nova_compute[192063]: 2025-10-02 12:25:31.821 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:25:31 np0005466012 nova_compute[192063]: 2025-10-02 12:25:31.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:32 np0005466012 nova_compute[192063]: 2025-10-02 12:25:32.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:33 np0005466012 podman[238434]: 2025-10-02 12:25:33.144659362 +0000 UTC m=+0.057049108 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Oct  2 08:25:33 np0005466012 nova_compute[192063]: 2025-10-02 12:25:33.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:33 np0005466012 nova_compute[192063]: 2025-10-02 12:25:33.844 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:33 np0005466012 nova_compute[192063]: 2025-10-02 12:25:33.845 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:33 np0005466012 nova_compute[192063]: 2025-10-02 12:25:33.845 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:33 np0005466012 nova_compute[192063]: 2025-10-02 12:25:33.845 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:25:33 np0005466012 nova_compute[192063]: 2025-10-02 12:25:33.933 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:33 np0005466012 podman[238456]: 2025-10-02 12:25:33.987918949 +0000 UTC m=+0.091876721 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.041 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk --force-share --output=json" returned: 0 in 0.109s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.043 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.128 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.311 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.313 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5539MB free_disk=73.28311920166016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.313 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.314 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.545 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.545 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.545 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.684 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.881 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.913 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:25:34 np0005466012 nova_compute[192063]: 2025-10-02 12:25:34.913 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:36 np0005466012 nova_compute[192063]: 2025-10-02 12:25:36.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:36 np0005466012 nova_compute[192063]: 2025-10-02 12:25:36.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:37 np0005466012 nova_compute[192063]: 2025-10-02 12:25:37.914 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:41 np0005466012 podman[238485]: 2025-10-02 12:25:41.179464808 +0000 UTC m=+0.088826123 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': 
{'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  2 08:25:41 np0005466012 podman[238484]: 2025-10-02 12:25:41.179668734 +0000 UTC m=+0.085681794 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:25:41 np0005466012 nova_compute[192063]: 2025-10-02 12:25:41.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:41 np0005466012 nova_compute[192063]: 2025-10-02 12:25:41.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:42 np0005466012 nova_compute[192063]: 2025-10-02 12:25:42.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:25:42 np0005466012 nova_compute[192063]: 2025-10-02 12:25:42.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:25:42 np0005466012 nova_compute[192063]: 2025-10-02 12:25:42.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:25:43 np0005466012 nova_compute[192063]: 2025-10-02 12:25:43.450 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:43 np0005466012 nova_compute[192063]: 2025-10-02 12:25:43.450 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:43 np0005466012 nova_compute[192063]: 2025-10-02 12:25:43.450 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:25:43 np0005466012 nova_compute[192063]: 2025-10-02 12:25:43.451 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:44 np0005466012 podman[238528]: 2025-10-02 12:25:44.137055648 +0000 UTC m=+0.049240574 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:25:44 np0005466012 podman[238527]: 2025-10-02 12:25:44.145310534 +0000 UTC m=+0.058255293 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:25:46 np0005466012 nova_compute[192063]: 2025-10-02 12:25:46.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:47 np0005466012 nova_compute[192063]: 2025-10-02 12:25:47.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:47 np0005466012 nova_compute[192063]: 2025-10-02 12:25:47.522 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Updating instance_info_cache with network_info: [{"id": "622b7e17-6a86-4876-8e8a-6b40367f483e", "address": "fa:16:3e:bc:63:1b", "network": {"id": "26df2dcf-f57c-4dae-8522-0277df741ed3", "bridge": "br-int", "label": "tempest-network-smoke--1584637508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febc:631b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap622b7e17-6a", "ovs_interfaceid": "622b7e17-6a86-4876-8e8a-6b40367f483e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:47 np0005466012 nova_compute[192063]: 2025-10-02 12:25:47.557 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:47 np0005466012 nova_compute[192063]: 2025-10-02 12:25:47.557 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:25:49 np0005466012 nova_compute[192063]: 2025-10-02 12:25:49.891 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Creating tmpfile /var/lib/nova/instances/tmpdyrz5icw to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Oct  2 08:25:49 np0005466012 nova_compute[192063]: 2025-10-02 12:25:49.984 2 DEBUG nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdyrz5icw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Oct  2 08:25:51 np0005466012 nova_compute[192063]: 2025-10-02 12:25:51.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:51 np0005466012 nova_compute[192063]: 2025-10-02 12:25:51.888 2 DEBUG nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdyrz5icw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='595aea98-0c3e-45c9-81fe-4643f44fe8d3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Oct  2 08:25:51 np0005466012 nova_compute[192063]: 2025-10-02 12:25:51.918 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:51 np0005466012 nova_compute[192063]: 2025-10-02 12:25:51.919 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquired lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:51 np0005466012 nova_compute[192063]: 2025-10-02 12:25:51.919 2 DEBUG nova.network.neutron [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:25:52 np0005466012 nova_compute[192063]: 2025-10-02 12:25:52.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.806 2 DEBUG nova.network.neutron [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Updating instance_info_cache with network_info: [{"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.825 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.825 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.826 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Releasing lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.846 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdyrz5icw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='595aea98-0c3e-45c9-81fe-4643f44fe8d3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.847 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Creating instance directory: /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.847 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Creating disk.info with the contents: {'/var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk': 'qcow2', '/var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.847 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.848 2 DEBUG nova.objects.instance [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 595aea98-0c3e-45c9-81fe-4643f44fe8d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.849 2 DEBUG nova.compute.manager [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.883 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.935 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.936 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.943 2 DEBUG nova.virt.hardware [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.943 2 INFO nova.compute.claims [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.946 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.946 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.947 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:53 np0005466012 nova_compute[192063]: 2025-10-02 12:25:53.958 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.015 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.016 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.054 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.055 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.055 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.117 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.117 2 DEBUG nova.virt.disk.api [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Checking if we can resize image /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.118 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.148 2 DEBUG nova.compute.provider_tree [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.173 2 DEBUG nova.scheduler.client.report [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.209 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.210 2 DEBUG nova.compute.manager [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.215 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.215 2 DEBUG nova.virt.disk.api [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Cannot resize image /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.216 2 DEBUG nova.objects.instance [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lazy-loading 'migration_context' on Instance uuid 595aea98-0c3e-45c9-81fe-4643f44fe8d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.236 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.272 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk.config 485376" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.274 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Copying file compute-2.ctlplane.example.com:/var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk.config to /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.274 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Running cmd (subprocess): scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk.config /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.309 2 DEBUG nova.compute.manager [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.310 2 DEBUG nova.network.neutron [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.345 2 INFO nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.383 2 DEBUG nova.compute.manager [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.531 2 DEBUG nova.policy [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.656 2 DEBUG nova.compute.manager [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.657 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.657 2 INFO nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Creating image(s)#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.658 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.658 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.659 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.672 2 DEBUG oslo_concurrency.processutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.759 2 DEBUG oslo_concurrency.processutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.760 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.760 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.775 2 DEBUG oslo_concurrency.processutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.830 2 DEBUG oslo_concurrency.processutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.831 2 DEBUG oslo_concurrency.processutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.856 2 DEBUG oslo_concurrency.processutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] CMD "scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3/disk.config /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3" returned: 0 in 0.581s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.857 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.859 2 DEBUG nova.virt.libvirt.vif [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:25:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1910014548',display_name='tempest-TestNetworkAdvancedServerOps-server-1910014548',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1910014548',id=122,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGcjNWQLhorxyiCglISTL85sF/PebgHfK15yneJUghAfWhPSxNP3NydyYmhFfkO9o84fX5BcllBeB8dR7YwYFd3thDd5cmALiWGCn51055R0ZMgFMvAQxqZx7i5T53aIfQ==',key_name='tempest-TestNetworkAdvancedServerOps-954911067',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:25:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-axe4xfwn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:25:26Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=595aea98-0c3e-45c9-81fe-4643f44fe8d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.859 2 DEBUG nova.network.os_vif_util [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Converting VIF {"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.860 2 DEBUG nova.network.os_vif_util [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.860 2 DEBUG os_vif [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.862 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.862 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.865 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap375468b9-b2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.865 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap375468b9-b2, col_values=(('external_ids', {'iface-id': '375468b9-b213-41ae-87ca-ea569359bdb6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:5f:dd', 'vm-uuid': '595aea98-0c3e-45c9-81fe-4643f44fe8d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.866 2 DEBUG oslo_concurrency.processutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.867 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.867 2 DEBUG oslo_concurrency.processutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:54 np0005466012 NetworkManager[51207]: <info>  [1759407954.8680] manager: (tap375468b9-b2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.888 2 INFO os_vif [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2')#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.888 2 DEBUG nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.889 2 DEBUG nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdyrz5icw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='595aea98-0c3e-45c9-81fe-4643f44fe8d3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.923 2 DEBUG oslo_concurrency.processutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.924 2 DEBUG nova.virt.disk.api [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Checking if we can resize image /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.925 2 DEBUG oslo_concurrency.processutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.980 2 DEBUG oslo_concurrency.processutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.981 2 DEBUG nova.virt.disk.api [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Cannot resize image /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:25:54 np0005466012 nova_compute[192063]: 2025-10-02 12:25:54.982 2 DEBUG nova.objects.instance [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'migration_context' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:25:55 np0005466012 nova_compute[192063]: 2025-10-02 12:25:55.011 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:25:55 np0005466012 nova_compute[192063]: 2025-10-02 12:25:55.012 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Ensure instance console log exists: /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:25:55 np0005466012 nova_compute[192063]: 2025-10-02 12:25:55.013 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:25:55 np0005466012 nova_compute[192063]: 2025-10-02 12:25:55.013 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:25:55 np0005466012 nova_compute[192063]: 2025-10-02 12:25:55.014 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:25:55 np0005466012 nova_compute[192063]: 2025-10-02 12:25:55.678 2 DEBUG nova.network.neutron [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Successfully created port: 23e93cfb-aa99-4427-8af6-c199677a54ec _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:25:57 np0005466012 nova_compute[192063]: 2025-10-02 12:25:57.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:57 np0005466012 nova_compute[192063]: 2025-10-02 12:25:57.446 2 DEBUG nova.network.neutron [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Port 375468b9-b213-41ae-87ca-ea569359bdb6 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Oct  2 08:25:57 np0005466012 nova_compute[192063]: 2025-10-02 12:25:57.485 2 DEBUG nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdyrz5icw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='595aea98-0c3e-45c9-81fe-4643f44fe8d3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Oct  2 08:25:57 np0005466012 systemd[1]: Starting libvirt proxy daemon...
Oct  2 08:25:57 np0005466012 systemd[1]: Started libvirt proxy daemon.
Oct  2 08:25:57 np0005466012 nova_compute[192063]: 2025-10-02 12:25:57.790 2 DEBUG nova.network.neutron [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Successfully updated port: 23e93cfb-aa99-4427-8af6-c199677a54ec _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:25:57 np0005466012 nova_compute[192063]: 2025-10-02 12:25:57.940 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:57 np0005466012 nova_compute[192063]: 2025-10-02 12:25:57.941 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquired lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:57 np0005466012 nova_compute[192063]: 2025-10-02 12:25:57.942 2 DEBUG nova.network.neutron [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:25:57 np0005466012 kernel: tap375468b9-b2: entered promiscuous mode
Oct  2 08:25:57 np0005466012 NetworkManager[51207]: <info>  [1759407957.9614] manager: (tap375468b9-b2): new Tun device (/org/freedesktop/NetworkManager/Devices/206)
Oct  2 08:25:57 np0005466012 ovn_controller[94284]: 2025-10-02T12:25:57Z|00454|binding|INFO|Claiming lport 375468b9-b213-41ae-87ca-ea569359bdb6 for this additional chassis.
Oct  2 08:25:57 np0005466012 ovn_controller[94284]: 2025-10-02T12:25:57Z|00455|binding|INFO|375468b9-b213-41ae-87ca-ea569359bdb6: Claiming fa:16:3e:06:5f:dd 10.100.0.6
Oct  2 08:25:57 np0005466012 nova_compute[192063]: 2025-10-02 12:25:57.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:57 np0005466012 nova_compute[192063]: 2025-10-02 12:25:57.978 2 DEBUG nova.compute.manager [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-changed-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:25:57 np0005466012 nova_compute[192063]: 2025-10-02 12:25:57.979 2 DEBUG nova.compute.manager [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Refreshing instance network info cache due to event network-changed-23e93cfb-aa99-4427-8af6-c199677a54ec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:25:57 np0005466012 nova_compute[192063]: 2025-10-02 12:25:57.979 2 DEBUG oslo_concurrency.lockutils [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:25:57 np0005466012 ovn_controller[94284]: 2025-10-02T12:25:57Z|00456|binding|INFO|Setting lport 375468b9-b213-41ae-87ca-ea569359bdb6 ovn-installed in OVS
Oct  2 08:25:57 np0005466012 nova_compute[192063]: 2025-10-02 12:25:57.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:57 np0005466012 nova_compute[192063]: 2025-10-02 12:25:57.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:58 np0005466012 systemd-udevd[238647]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:25:58 np0005466012 systemd-machined[152114]: New machine qemu-56-instance-0000007a.
Oct  2 08:25:58 np0005466012 NetworkManager[51207]: <info>  [1759407958.0318] device (tap375468b9-b2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:25:58 np0005466012 NetworkManager[51207]: <info>  [1759407958.0330] device (tap375468b9-b2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:25:58 np0005466012 systemd[1]: Started Virtual Machine qemu-56-instance-0000007a.
Oct  2 08:25:58 np0005466012 nova_compute[192063]: 2025-10-02 12:25:58.428 2 DEBUG nova.network.neutron [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.272 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407959.2718844, 595aea98-0c3e-45c9-81fe-4643f44fe8d3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.272 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] VM Started (Lifecycle Event)#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.401 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.758 2 DEBUG nova.network.neutron [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Updating instance_info_cache with network_info: [{"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.913 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Releasing lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.913 2 DEBUG nova.compute.manager [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Instance network_info: |[{"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.914 2 DEBUG oslo_concurrency.lockutils [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.914 2 DEBUG nova.network.neutron [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Refreshing network info cache for port 23e93cfb-aa99-4427-8af6-c199677a54ec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.917 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Start _get_guest_xml network_info=[{"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.921 2 WARNING nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.926 2 DEBUG nova.virt.libvirt.host [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.928 2 DEBUG nova.virt.libvirt.host [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.931 2 DEBUG nova.virt.libvirt.host [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.932 2 DEBUG nova.virt.libvirt.host [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.933 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.933 2 DEBUG nova.virt.hardware [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.934 2 DEBUG nova.virt.hardware [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.934 2 DEBUG nova.virt.hardware [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.934 2 DEBUG nova.virt.hardware [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.934 2 DEBUG nova.virt.hardware [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.934 2 DEBUG nova.virt.hardware [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.935 2 DEBUG nova.virt.hardware [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.935 2 DEBUG nova.virt.hardware [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.936 2 DEBUG nova.virt.hardware [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.936 2 DEBUG nova.virt.hardware [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.936 2 DEBUG nova.virt.hardware [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.939 2 DEBUG nova.virt.libvirt.vif [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:25:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-934202400',display_name='tempest-ServerStableDeviceRescueTest-server-934202400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-934202400',id=124,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88e90c16adec46069b539d4f1431ab4d',ramdisk_id='',reservation_id='r-hf1wm990',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-232864240',owner_user_name='tempest-ServerStableDeviceRescueTest-232864240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:25:54Z,user_data=None,user_id='abb9f220716e48d79dbe2f97622937c4',uuid=de7f4178-00ba-409b-81ad-f6096e9ed144,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.940 2 DEBUG nova.network.os_vif_util [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converting VIF {"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.940 2 DEBUG nova.network.os_vif_util [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:a5:17,bridge_name='br-int',has_traffic_filtering=True,id=23e93cfb-aa99-4427-8af6-c199677a54ec,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23e93cfb-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:25:59 np0005466012 nova_compute[192063]: 2025-10-02 12:25:59.941 2 DEBUG nova.objects.instance [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'pci_devices' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.114 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  <uuid>de7f4178-00ba-409b-81ad-f6096e9ed144</uuid>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  <name>instance-0000007c</name>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-934202400</nova:name>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:25:59</nova:creationTime>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:26:00 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:        <nova:user uuid="abb9f220716e48d79dbe2f97622937c4">tempest-ServerStableDeviceRescueTest-232864240-project-member</nova:user>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:        <nova:project uuid="88e90c16adec46069b539d4f1431ab4d">tempest-ServerStableDeviceRescueTest-232864240</nova:project>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:        <nova:port uuid="23e93cfb-aa99-4427-8af6-c199677a54ec">
Oct  2 08:26:00 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <entry name="serial">de7f4178-00ba-409b-81ad-f6096e9ed144</entry>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <entry name="uuid">de7f4178-00ba-409b-81ad-f6096e9ed144</entry>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.config"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:ca:a5:17"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <target dev="tap23e93cfb-aa"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/console.log" append="off"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:26:00 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:26:00 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:26:00 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:26:00 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.115 2 DEBUG nova.compute.manager [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Preparing to wait for external event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.115 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.116 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.116 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.117 2 DEBUG nova.virt.libvirt.vif [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:25:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-934202400',display_name='tempest-ServerStableDeviceRescueTest-server-934202400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-934202400',id=124,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88e90c16adec46069b539d4f1431ab4d',ramdisk_id='',reservation_id='r-hf1wm990',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-232864240',owner_user_name='tempest-ServerStableDeviceRescueTest-232864240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:25:54Z,user_data=None,user_id='abb9f220716e48d79dbe2f97622937c4',uuid=de7f4178-00ba-409b-81ad-f6096e9ed144,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.117 2 DEBUG nova.network.os_vif_util [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converting VIF {"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.118 2 DEBUG nova.network.os_vif_util [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:a5:17,bridge_name='br-int',has_traffic_filtering=True,id=23e93cfb-aa99-4427-8af6-c199677a54ec,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23e93cfb-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.118 2 DEBUG os_vif [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:a5:17,bridge_name='br-int',has_traffic_filtering=True,id=23e93cfb-aa99-4427-8af6-c199677a54ec,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23e93cfb-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.119 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.120 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:26:00 np0005466012 NetworkManager[51207]: <info>  [1759407960.1264] manager: (tap23e93cfb-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.123 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23e93cfb-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.123 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap23e93cfb-aa, col_values=(('external_ids', {'iface-id': '23e93cfb-aa99-4427-8af6-c199677a54ec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ca:a5:17', 'vm-uuid': 'de7f4178-00ba-409b-81ad-f6096e9ed144'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.131 2 INFO os_vif [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:a5:17,bridge_name='br-int',has_traffic_filtering=True,id=23e93cfb-aa99-4427-8af6-c199677a54ec,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23e93cfb-aa')#033[00m
Oct  2 08:26:00 np0005466012 podman[238677]: 2025-10-02 12:26:00.152610475 +0000 UTC m=+0.061312789 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:26:00 np0005466012 podman[238678]: 2025-10-02 12:26:00.179500362 +0000 UTC m=+0.090970665 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.396 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407960.395799, 595aea98-0c3e-45c9-81fe-4643f44fe8d3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.396 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.426 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.427 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.427 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No VIF found with MAC fa:16:3e:ca:a5:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.427 2 INFO nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Using config drive#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.443 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.446 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:26:00 np0005466012 nova_compute[192063]: 2025-10-02 12:26:00.578 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Oct  2 08:26:01 np0005466012 nova_compute[192063]: 2025-10-02 12:26:01.081 2 INFO nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Creating config drive at /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.config#033[00m
Oct  2 08:26:01 np0005466012 nova_compute[192063]: 2025-10-02 12:26:01.091 2 DEBUG oslo_concurrency.processutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjqm3nl5f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:01 np0005466012 nova_compute[192063]: 2025-10-02 12:26:01.222 2 DEBUG oslo_concurrency.processutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjqm3nl5f" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:01 np0005466012 kernel: tap23e93cfb-aa: entered promiscuous mode
Oct  2 08:26:01 np0005466012 NetworkManager[51207]: <info>  [1759407961.2975] manager: (tap23e93cfb-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Oct  2 08:26:01 np0005466012 nova_compute[192063]: 2025-10-02 12:26:01.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:01 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:01Z|00457|binding|INFO|Claiming lport 23e93cfb-aa99-4427-8af6-c199677a54ec for this chassis.
Oct  2 08:26:01 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:01Z|00458|binding|INFO|23e93cfb-aa99-4427-8af6-c199677a54ec: Claiming fa:16:3e:ca:a5:17 10.100.0.13
Oct  2 08:26:01 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:01Z|00459|binding|INFO|Setting lport 23e93cfb-aa99-4427-8af6-c199677a54ec ovn-installed in OVS
Oct  2 08:26:01 np0005466012 nova_compute[192063]: 2025-10-02 12:26:01.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:01 np0005466012 nova_compute[192063]: 2025-10-02 12:26:01.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:01 np0005466012 systemd-machined[152114]: New machine qemu-57-instance-0000007c.
Oct  2 08:26:01 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:01Z|00460|binding|INFO|Setting lport 23e93cfb-aa99-4427-8af6-c199677a54ec up in Southbound
Oct  2 08:26:01 np0005466012 systemd[1]: Started Virtual Machine qemu-57-instance-0000007c.
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.347 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:a5:17 10.100.0.13'], port_security=['fa:16:3e:ca:a5:17 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=23e93cfb-aa99-4427-8af6-c199677a54ec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.348 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 23e93cfb-aa99-4427-8af6-c199677a54ec in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a bound to our chassis#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.350 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aa4ebb90-ef5e-4974-a53d-2aabd696731a#033[00m
Oct  2 08:26:01 np0005466012 systemd-udevd[238748]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.362 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[29d7647b-f2bd-49dd-9441-d8eeebec50d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.363 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaa4ebb90-e1 in ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:26:01 np0005466012 NetworkManager[51207]: <info>  [1759407961.3660] device (tap23e93cfb-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:26:01 np0005466012 NetworkManager[51207]: <info>  [1759407961.3668] device (tap23e93cfb-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.368 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaa4ebb90-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.368 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c991317e-da72-4718-98f7-6a7d07833935]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.370 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[28c15d29-4c67-4c05-842b-81cd5ca8457f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.385 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[b03840eb-22c6-4631-95b9-7b3c3ff856e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.409 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6b58a07b-d7c6-45bf-8fd9-5c3c1c2d42d2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.438 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8117d0-d563-42e1-9a96-b0810a4bac07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.445 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5a0c1bfb-93b0-4061-871d-aa416bec2148]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 NetworkManager[51207]: <info>  [1759407961.4461] manager: (tapaa4ebb90-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/209)
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.482 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[cfdc27f3-5f40-4895-bdc0-80ce2446cb4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.486 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[6103f200-be9a-4834-9289-e97582bd6d90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 NetworkManager[51207]: <info>  [1759407961.5170] device (tapaa4ebb90-e0): carrier: link connected
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.521 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[226b4ac4-174e-4e0f-b7fb-2d387ea9de96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.538 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[95d7afa7-886a-4d8c-8e6f-ced8c75cdd26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 138], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595512, 'reachable_time': 22095, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238781, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.554 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[76cd5538-957b-45d6-9ff0-1744f6ecdd02]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febc:898e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 595512, 'tstamp': 595512}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238782, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.571 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6214e141-0590-4896-8bf3-919410afce3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 138], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595512, 'reachable_time': 22095, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238783, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.603 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a7527e-7217-4456-b4d8-45111af53076]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.661 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2f133fe7-51e5-49a3-a1fd-f3a5cfeab164]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.662 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.662 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.662 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa4ebb90-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:01 np0005466012 nova_compute[192063]: 2025-10-02 12:26:01.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:01 np0005466012 NetworkManager[51207]: <info>  [1759407961.6649] manager: (tapaa4ebb90-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Oct  2 08:26:01 np0005466012 kernel: tapaa4ebb90-e0: entered promiscuous mode
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.668 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaa4ebb90-e0, col_values=(('external_ids', {'iface-id': 'c9a9afa9-78de-46a4-a649-8b8779924189'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:01 np0005466012 nova_compute[192063]: 2025-10-02 12:26:01.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:01 np0005466012 nova_compute[192063]: 2025-10-02 12:26:01.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:01 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:01Z|00461|binding|INFO|Releasing lport c9a9afa9-78de-46a4-a649-8b8779924189 from this chassis (sb_readonly=0)
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.670 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.680 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8a90aa90-472f-4982-956e-9e2f5a584afe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.681 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-aa4ebb90-ef5e-4974-a53d-2aabd696731a
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:26:01 np0005466012 nova_compute[192063]: 2025-10-02 12:26:01.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID aa4ebb90-ef5e-4974-a53d-2aabd696731a
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:26:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:01.683 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'env', 'PROCESS_TAG=haproxy-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aa4ebb90-ef5e-4974-a53d-2aabd696731a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:02 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:02Z|00462|binding|INFO|Claiming lport 375468b9-b213-41ae-87ca-ea569359bdb6 for this chassis.
Oct  2 08:26:02 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:02Z|00463|binding|INFO|375468b9-b213-41ae-87ca-ea569359bdb6: Claiming fa:16:3e:06:5f:dd 10.100.0.6
Oct  2 08:26:02 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:02Z|00464|binding|INFO|Setting lport 375468b9-b213-41ae-87ca-ea569359bdb6 up in Southbound
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.043 2 DEBUG nova.network.neutron [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Updated VIF entry in instance network info cache for port 23e93cfb-aa99-4427-8af6-c199677a54ec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.044 2 DEBUG nova.network.neutron [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Updating instance_info_cache with network_info: [{"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.084 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:5f:dd 10.100.0.6'], port_security=['fa:16:3e:06:5f:dd 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '595aea98-0c3e-45c9-81fe-4643f44fe8d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7af61307-f367-4334-ad00-5d542cb00bd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'db364350-0b47-4c18-8ab1-bf862406804b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e947ec6-847e-4b20-b912-5e8f3559dfc4, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=375468b9-b213-41ae-87ca-ea569359bdb6) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:26:02 np0005466012 podman[238821]: 2025-10-02 12:26:02.112066133 +0000 UTC m=+0.057021608 container create bbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.127 2 DEBUG oslo_concurrency.lockutils [req-8fc704b8-7206-4c04-b8dd-37ae77230bb3 req-71051413-239b-4f7f-9735-66b14d1239dc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.135 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.136 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.137 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:02 np0005466012 systemd[1]: Started libpod-conmon-bbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d.scope.
Oct  2 08:26:02 np0005466012 podman[238821]: 2025-10-02 12:26:02.084218488 +0000 UTC m=+0.029173983 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.184 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407962.183779, de7f4178-00ba-409b-81ad-f6096e9ed144 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.185 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] VM Started (Lifecycle Event)#033[00m
Oct  2 08:26:02 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:26:02 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4435793fb3cf8586cfcbcd08ff3d7dbb714a9c6d85e7af6d60dd0d1b32dbdf7d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:26:02 np0005466012 podman[238821]: 2025-10-02 12:26:02.20980098 +0000 UTC m=+0.154756485 container init bbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:26:02 np0005466012 podman[238821]: 2025-10-02 12:26:02.216786018 +0000 UTC m=+0.161741493 container start bbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:26:02 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238836]: [NOTICE]   (238840) : New worker (238842) forked
Oct  2 08:26:02 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238836]: [NOTICE]   (238840) : Loading success.
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.264 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 375468b9-b213-41ae-87ca-ea569359bdb6 in datapath 7af61307-f367-4334-ad00-5d542cb00bd9 unbound from our chassis#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.266 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7af61307-f367-4334-ad00-5d542cb00bd9#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.277 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a39225d6-a8d6-438d-b81c-a3c691e1d286]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.278 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7af61307-f1 in ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.280 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7af61307-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.280 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c0c21358-30d4-4148-817d-73b13612e2d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.281 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[517adaaa-b245-47ea-bb5b-1ecd53ebe914]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.293 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[55aedb2a-ae11-456c-ac0c-79f71db74466]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.325 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c09ce3-d9be-4e4a-8253-7dffc1a50560]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.331 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.335 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407962.1848786, de7f4178-00ba-409b-81ad-f6096e9ed144 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.335 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.360 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[e002c0f4-0f3a-495e-b24b-cca9cdd78f8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 systemd-udevd[238766]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:26:02 np0005466012 NetworkManager[51207]: <info>  [1759407962.3685] manager: (tap7af61307-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/211)
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.366 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb79e55-fe53-4263-9330-d13245c66d61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.413 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ce195d4b-7ebc-4161-b20d-109bef9ba400]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.415 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[aa0e6c28-5e14-47f8-96ba-c204412d32ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 NetworkManager[51207]: <info>  [1759407962.4384] device (tap7af61307-f0): carrier: link connected
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.444 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[5dd3bc57-7667-4c81-90dc-bbce4cee21a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.463 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[74a0644e-7d86-4985-a7c6-a8cc432695b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7af61307-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:b0:66'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595605, 'reachable_time': 39097, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238861, 'error': None, 'target': 'ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.476 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9dbbfd75-c496-4619-8aa8-df37273c55df]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed4:b066'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 595605, 'tstamp': 595605}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238862, 'error': None, 'target': 'ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.492 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5d206c49-a25f-44a0-8ec0-ede6e3b928a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7af61307-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:b0:66'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595605, 'reachable_time': 39097, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238863, 'error': None, 'target': 'ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.519 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e169d6-cd63-4f03-9f16-5a759cf6211e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.542 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.544 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.576 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[07ca1200-9bdb-49fa-866c-7672d2b472e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.577 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7af61307-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.578 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.578 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7af61307-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:02 np0005466012 NetworkManager[51207]: <info>  [1759407962.5805] manager: (tap7af61307-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/212)
Oct  2 08:26:02 np0005466012 kernel: tap7af61307-f0: entered promiscuous mode
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.584 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7af61307-f0, col_values=(('external_ids', {'iface-id': '2ec12fcd-269a-49bb-95a9-094f7c676163'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:02 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:02Z|00465|binding|INFO|Releasing lport 2ec12fcd-269a-49bb-95a9-094f7c676163 from this chassis (sb_readonly=0)
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.588 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7af61307-f367-4334-ad00-5d542cb00bd9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7af61307-f367-4334-ad00-5d542cb00bd9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.589 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2631901b-827d-4a65-b57a-f00d42a82fb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.589 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-7af61307-f367-4334-ad00-5d542cb00bd9
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/7af61307-f367-4334-ad00-5d542cb00bd9.pid.haproxy
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 7af61307-f367-4334-ad00-5d542cb00bd9
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:26:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:02.590 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9', 'env', 'PROCESS_TAG=haproxy-7af61307-f367-4334-ad00-5d542cb00bd9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7af61307-f367-4334-ad00-5d542cb00bd9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:02 np0005466012 nova_compute[192063]: 2025-10-02 12:26:02.749 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:26:02 np0005466012 podman[238892]: 2025-10-02 12:26:02.988544256 +0000 UTC m=+0.048687199 container create d204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.020 2 INFO nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Post operation of migration started#033[00m
Oct  2 08:26:03 np0005466012 systemd[1]: Started libpod-conmon-d204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba.scope.
Oct  2 08:26:03 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:26:03 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fea0a985f4eead6c54b242e5bc45823831787d93187f4b5542632127002b227/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:26:03 np0005466012 podman[238892]: 2025-10-02 12:26:03.060141858 +0000 UTC m=+0.120284821 container init d204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:26:03 np0005466012 podman[238892]: 2025-10-02 12:26:02.965222461 +0000 UTC m=+0.025365414 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:26:03 np0005466012 podman[238892]: 2025-10-02 12:26:03.065251544 +0000 UTC m=+0.125394487 container start d204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:26:03 np0005466012 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[238907]: [NOTICE]   (238911) : New worker (238913) forked
Oct  2 08:26:03 np0005466012 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[238907]: [NOTICE]   (238911) : Loading success.
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.264 2 DEBUG nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.264 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.265 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.265 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.265 2 DEBUG nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Processing event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.266 2 DEBUG nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.266 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.266 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.266 2 DEBUG oslo_concurrency.lockutils [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.267 2 DEBUG nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] No waiting events found dispatching network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.267 2 WARNING nova.compute.manager [req-98c519fa-fe55-4f82-b4a8-d56951c1feb1 req-41b4717e-f36f-4f7f-82ea-01c2a1b55faf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received unexpected event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.268 2 DEBUG nova.compute.manager [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.272 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759407963.271622, de7f4178-00ba-409b-81ad-f6096e9ed144 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.272 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.274 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.277 2 INFO nova.virt.libvirt.driver [-] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Instance spawned successfully.#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.277 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.361 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.367 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.368 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.368 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.368 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.369 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.369 2 DEBUG nova.virt.libvirt.driver [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.374 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.486 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.682 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.682 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquired lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.683 2 DEBUG nova.network.neutron [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.990 2 INFO nova.compute.manager [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Took 9.33 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:26:03 np0005466012 nova_compute[192063]: 2025-10-02 12:26:03.991 2 DEBUG nova.compute.manager [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:04 np0005466012 podman[238922]: 2025-10-02 12:26:04.151041256 +0000 UTC m=+0.058427236 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Oct  2 08:26:04 np0005466012 podman[238923]: 2025-10-02 12:26:04.155670238 +0000 UTC m=+0.059660952 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 08:26:04 np0005466012 nova_compute[192063]: 2025-10-02 12:26:04.672 2 INFO nova.compute.manager [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Took 10.77 seconds to build instance.#033[00m
Oct  2 08:26:04 np0005466012 nova_compute[192063]: 2025-10-02 12:26:04.787 2 DEBUG oslo_concurrency.lockutils [None req-b4aa01a9-88ad-485f-bdd4-30273af3b3f8 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.962s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:05 np0005466012 nova_compute[192063]: 2025-10-02 12:26:05.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:05 np0005466012 nova_compute[192063]: 2025-10-02 12:26:05.317 2 DEBUG nova.network.neutron [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Updating instance_info_cache with network_info: [{"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:26:05 np0005466012 nova_compute[192063]: 2025-10-02 12:26:05.413 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Releasing lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:26:05 np0005466012 nova_compute[192063]: 2025-10-02 12:26:05.983 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:05 np0005466012 nova_compute[192063]: 2025-10-02 12:26:05.984 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:05 np0005466012 nova_compute[192063]: 2025-10-02 12:26:05.984 2 DEBUG oslo_concurrency.lockutils [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:05 np0005466012 nova_compute[192063]: 2025-10-02 12:26:05.988 2 INFO nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Oct  2 08:26:05 np0005466012 virtqemud[191783]: Domain id=56 name='instance-0000007a' uuid=595aea98-0c3e-45c9-81fe-4643f44fe8d3 is tainted: custom-monitor
Oct  2 08:26:06 np0005466012 nova_compute[192063]: 2025-10-02 12:26:06.995 2 INFO nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Oct  2 08:26:07 np0005466012 nova_compute[192063]: 2025-10-02 12:26:07.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:08 np0005466012 nova_compute[192063]: 2025-10-02 12:26:08.001 2 INFO nova.virt.libvirt.driver [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Oct  2 08:26:08 np0005466012 nova_compute[192063]: 2025-10-02 12:26:08.005 2 DEBUG nova.compute.manager [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:08 np0005466012 nova_compute[192063]: 2025-10-02 12:26:08.081 2 DEBUG nova.objects.instance [None req-8e688ca8-988a-4380-a19b-cd8c72a2d5ea cb6cc43e566f47b68009374580e995a6 d31edba5aba7481a916ca3252d1375a4 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  2 08:26:10 np0005466012 nova_compute[192063]: 2025-10-02 12:26:10.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:12 np0005466012 nova_compute[192063]: 2025-10-02 12:26:12.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:12 np0005466012 podman[238961]: 2025-10-02 12:26:12.143039661 +0000 UTC m=+0.053815885 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-type=git, architecture=x86_64, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:26:12 np0005466012 podman[238960]: 2025-10-02 12:26:12.150117194 +0000 UTC m=+0.061907907 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  2 08:26:13 np0005466012 nova_compute[192063]: 2025-10-02 12:26:13.098 2 DEBUG nova.compute.manager [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:14 np0005466012 nova_compute[192063]: 2025-10-02 12:26:14.754 2 INFO nova.compute.manager [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] instance snapshotting#033[00m
Oct  2 08:26:15 np0005466012 nova_compute[192063]: 2025-10-02 12:26:15.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:15 np0005466012 podman[239012]: 2025-10-02 12:26:15.150806493 +0000 UTC m=+0.065332594 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:26:15 np0005466012 podman[239013]: 2025-10-02 12:26:15.160770467 +0000 UTC m=+0.072196840 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:26:16 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:16Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ca:a5:17 10.100.0.13
Oct  2 08:26:16 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:16Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ca:a5:17 10.100.0.13
Oct  2 08:26:17 np0005466012 nova_compute[192063]: 2025-10-02 12:26:17.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:18 np0005466012 nova_compute[192063]: 2025-10-02 12:26:18.224 2 INFO nova.compute.manager [None req-5587088a-a9da-492f-b0ae-15da806e76e8 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Get console output#033[00m
Oct  2 08:26:18 np0005466012 nova_compute[192063]: 2025-10-02 12:26:18.229 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:26:18 np0005466012 nova_compute[192063]: 2025-10-02 12:26:18.883 2 INFO nova.virt.libvirt.driver [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Beginning live snapshot process#033[00m
Oct  2 08:26:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:19.872 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:26:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:19.873 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:26:19 np0005466012 nova_compute[192063]: 2025-10-02 12:26:19.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:20 np0005466012 virtqemud[191783]: invalid argument: disk vda does not have an active block job
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.107 2 DEBUG oslo_concurrency.processutils [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.173 2 DEBUG oslo_concurrency.processutils [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json -f qcow2" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.175 2 DEBUG oslo_concurrency.processutils [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.233 2 DEBUG oslo_concurrency.processutils [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json -f qcow2" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.254 2 DEBUG oslo_concurrency.processutils [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.345 2 DEBUG oslo_concurrency.processutils [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.346 2 DEBUG oslo_concurrency.processutils [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpjvw8geyw/242180be0d4348c8ad300ecee5dae1fe.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.394 2 DEBUG nova.compute.manager [req-5d719d28-4fc5-4235-b9cf-6d821680def2 req-6bd4f11d-4dd1-4095-b7c8-b14b64cbca98 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-changed-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.395 2 DEBUG nova.compute.manager [req-5d719d28-4fc5-4235-b9cf-6d821680def2 req-6bd4f11d-4dd1-4095-b7c8-b14b64cbca98 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Refreshing instance network info cache due to event network-changed-375468b9-b213-41ae-87ca-ea569359bdb6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.396 2 DEBUG oslo_concurrency.lockutils [req-5d719d28-4fc5-4235-b9cf-6d821680def2 req-6bd4f11d-4dd1-4095-b7c8-b14b64cbca98 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.397 2 DEBUG oslo_concurrency.lockutils [req-5d719d28-4fc5-4235-b9cf-6d821680def2 req-6bd4f11d-4dd1-4095-b7c8-b14b64cbca98 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.397 2 DEBUG nova.network.neutron [req-5d719d28-4fc5-4235-b9cf-6d821680def2 req-6bd4f11d-4dd1-4095-b7c8-b14b64cbca98 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Refreshing network info cache for port 375468b9-b213-41ae-87ca-ea569359bdb6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.398 2 DEBUG oslo_concurrency.processutils [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpjvw8geyw/242180be0d4348c8ad300ecee5dae1fe.delta 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.400 2 INFO nova.virt.libvirt.driver [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.459 2 DEBUG nova.virt.libvirt.guest [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] COPY block job progress, current cursor: 0 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.962 2 DEBUG nova.virt.libvirt.guest [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] COPY block job progress, current cursor: 75235328 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:26:20 np0005466012 nova_compute[192063]: 2025-10-02 12:26:20.966 2 INFO nova.virt.libvirt.driver [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.005 2 DEBUG nova.privsep.utils [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.006 2 DEBUG oslo_concurrency.processutils [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpjvw8geyw/242180be0d4348c8ad300ecee5dae1fe.delta /var/lib/nova/instances/snapshots/tmpjvw8geyw/242180be0d4348c8ad300ecee5dae1fe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.090 2 DEBUG oslo_concurrency.lockutils [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.090 2 DEBUG oslo_concurrency.lockutils [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.091 2 DEBUG oslo_concurrency.lockutils [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.091 2 DEBUG oslo_concurrency.lockutils [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.091 2 DEBUG oslo_concurrency.lockutils [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.402 2 INFO nova.compute.manager [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Terminating instance#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.568 2 DEBUG nova.compute.manager [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:26:21 np0005466012 kernel: tap375468b9-b2 (unregistering): left promiscuous mode
Oct  2 08:26:21 np0005466012 NetworkManager[51207]: <info>  [1759407981.5945] device (tap375468b9-b2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.606 2 DEBUG oslo_concurrency.processutils [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpjvw8geyw/242180be0d4348c8ad300ecee5dae1fe.delta /var/lib/nova/instances/snapshots/tmpjvw8geyw/242180be0d4348c8ad300ecee5dae1fe" returned: 0 in 0.600s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:21Z|00466|binding|INFO|Releasing lport 375468b9-b213-41ae-87ca-ea569359bdb6 from this chassis (sb_readonly=0)
Oct  2 08:26:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:21Z|00467|binding|INFO|Setting lport 375468b9-b213-41ae-87ca-ea569359bdb6 down in Southbound
Oct  2 08:26:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:21Z|00468|binding|INFO|Removing iface tap375468b9-b2 ovn-installed in OVS
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.616 2 INFO nova.virt.libvirt.driver [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:21.653 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:5f:dd 10.100.0.6'], port_security=['fa:16:3e:06:5f:dd 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '595aea98-0c3e-45c9-81fe-4643f44fe8d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7af61307-f367-4334-ad00-5d542cb00bd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'db364350-0b47-4c18-8ab1-bf862406804b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e947ec6-847e-4b20-b912-5e8f3559dfc4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=375468b9-b213-41ae-87ca-ea569359bdb6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:26:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:21.654 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 375468b9-b213-41ae-87ca-ea569359bdb6 in datapath 7af61307-f367-4334-ad00-5d542cb00bd9 unbound from our chassis#033[00m
Oct  2 08:26:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:21.656 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7af61307-f367-4334-ad00-5d542cb00bd9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:26:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:21.657 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0462920d-9915-4aa5-bb72-7704b9a37bd5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:21.658 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9 namespace which is not needed anymore#033[00m
Oct  2 08:26:21 np0005466012 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000007a.scope: Deactivated successfully.
Oct  2 08:26:21 np0005466012 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000007a.scope: Consumed 2.520s CPU time.
Oct  2 08:26:21 np0005466012 systemd-machined[152114]: Machine qemu-56-instance-0000007a terminated.
Oct  2 08:26:21 np0005466012 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[238907]: [NOTICE]   (238911) : haproxy version is 2.8.14-c23fe91
Oct  2 08:26:21 np0005466012 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[238907]: [NOTICE]   (238911) : path to executable is /usr/sbin/haproxy
Oct  2 08:26:21 np0005466012 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[238907]: [WARNING]  (238911) : Exiting Master process...
Oct  2 08:26:21 np0005466012 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[238907]: [WARNING]  (238911) : Exiting Master process...
Oct  2 08:26:21 np0005466012 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[238907]: [ALERT]    (238911) : Current worker (238913) exited with code 143 (Terminated)
Oct  2 08:26:21 np0005466012 neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9[238907]: [WARNING]  (238911) : All workers exited. Exiting... (0)
Oct  2 08:26:21 np0005466012 systemd[1]: libpod-d204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba.scope: Deactivated successfully.
Oct  2 08:26:21 np0005466012 podman[239109]: 2025-10-02 12:26:21.792054878 +0000 UTC m=+0.045168279 container died d204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:26:21 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba-userdata-shm.mount: Deactivated successfully.
Oct  2 08:26:21 np0005466012 systemd[1]: var-lib-containers-storage-overlay-6fea0a985f4eead6c54b242e5bc45823831787d93187f4b5542632127002b227-merged.mount: Deactivated successfully.
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.834 2 INFO nova.virt.libvirt.driver [-] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Instance destroyed successfully.#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.835 2 DEBUG nova.objects.instance [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'resources' on Instance uuid 595aea98-0c3e-45c9-81fe-4643f44fe8d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.855 2 DEBUG nova.virt.libvirt.vif [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T12:25:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1910014548',display_name='tempest-TestNetworkAdvancedServerOps-server-1910014548',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1910014548',id=122,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGcjNWQLhorxyiCglISTL85sF/PebgHfK15yneJUghAfWhPSxNP3NydyYmhFfkO9o84fX5BcllBeB8dR7YwYFd3thDd5cmALiWGCn51055R0ZMgFMvAQxqZx7i5T53aIfQ==',key_name='tempest-TestNetworkAdvancedServerOps-954911067',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:25:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-axe4xfwn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:26:08Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=595aea98-0c3e-45c9-81fe-4643f44fe8d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.855 2 DEBUG nova.network.os_vif_util [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.856 2 DEBUG nova.network.os_vif_util [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.856 2 DEBUG os_vif [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.859 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap375468b9-b2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.867 2 INFO os_vif [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:5f:dd,bridge_name='br-int',has_traffic_filtering=True,id=375468b9-b213-41ae-87ca-ea569359bdb6,network=Network(7af61307-f367-4334-ad00-5d542cb00bd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap375468b9-b2')#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.868 2 INFO nova.virt.libvirt.driver [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Deleting instance files /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3_del#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.868 2 INFO nova.virt.libvirt.driver [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Deletion of /var/lib/nova/instances/595aea98-0c3e-45c9-81fe-4643f44fe8d3_del complete#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.959 2 INFO nova.compute.manager [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.966 2 DEBUG oslo.service.loopingcall [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.967 2 DEBUG nova.compute.manager [-] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:26:21 np0005466012 nova_compute[192063]: 2025-10-02 12:26:21.968 2 DEBUG nova.network.neutron [-] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:26:22 np0005466012 nova_compute[192063]: 2025-10-02 12:26:22.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:22 np0005466012 nova_compute[192063]: 2025-10-02 12:26:22.218 2 DEBUG nova.network.neutron [req-5d719d28-4fc5-4235-b9cf-6d821680def2 req-6bd4f11d-4dd1-4095-b7c8-b14b64cbca98 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Updated VIF entry in instance network info cache for port 375468b9-b213-41ae-87ca-ea569359bdb6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:26:22 np0005466012 podman[239109]: 2025-10-02 12:26:22.219514047 +0000 UTC m=+0.472627428 container cleanup d204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:26:22 np0005466012 nova_compute[192063]: 2025-10-02 12:26:22.219 2 DEBUG nova.network.neutron [req-5d719d28-4fc5-4235-b9cf-6d821680def2 req-6bd4f11d-4dd1-4095-b7c8-b14b64cbca98 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Updating instance_info_cache with network_info: [{"id": "375468b9-b213-41ae-87ca-ea569359bdb6", "address": "fa:16:3e:06:5f:dd", "network": {"id": "7af61307-f367-4334-ad00-5d542cb00bd9", "bridge": "br-int", "label": "tempest-network-smoke--1240497721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap375468b9-b2", "ovs_interfaceid": "375468b9-b213-41ae-87ca-ea569359bdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:26:22 np0005466012 systemd[1]: libpod-conmon-d204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba.scope: Deactivated successfully.
Oct  2 08:26:22 np0005466012 nova_compute[192063]: 2025-10-02 12:26:22.305 2 DEBUG oslo_concurrency.lockutils [req-5d719d28-4fc5-4235-b9cf-6d821680def2 req-6bd4f11d-4dd1-4095-b7c8-b14b64cbca98 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-595aea98-0c3e-45c9-81fe-4643f44fe8d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:26:22 np0005466012 podman[239161]: 2025-10-02 12:26:22.495623451 +0000 UTC m=+0.245614715 container remove d204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:26:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:22.503 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7e3314ca-106d-4c49-a5c7-d6a6a0e87d3f]: (4, ('Thu Oct  2 12:26:21 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9 (d204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba)\nd204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba\nThu Oct  2 12:26:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9 (d204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba)\nd204e66b9ce2914768f620f11f473011575bc513dd63cf75677d94a4653fd6ba\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:22.505 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6538d2f6-8d6a-4711-9e8f-33ab304974c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:22.506 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7af61307-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:22 np0005466012 kernel: tap7af61307-f0: left promiscuous mode
Oct  2 08:26:22 np0005466012 nova_compute[192063]: 2025-10-02 12:26:22.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:22 np0005466012 nova_compute[192063]: 2025-10-02 12:26:22.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:22.530 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3c0ed5c2-ba9c-4e9d-a08a-29a2a62861e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:22.560 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[650c54dc-eca7-4a7e-b308-c8c61a76eb7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:22.561 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d31b4759-6b27-4459-8af1-c9b684ed521d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:22.580 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[59696458-714c-49fd-b0e3-1fdb5fabc425]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595596, 'reachable_time': 21645, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239176, 'error': None, 'target': 'ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:22.584 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7af61307-f367-4334-ad00-5d542cb00bd9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:26:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:22.584 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[646d23b9-394d-4d5c-bab8-1c5e2c0d8ac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:22 np0005466012 systemd[1]: run-netns-ovnmeta\x2d7af61307\x2df367\x2d4334\x2dad00\x2d5d542cb00bd9.mount: Deactivated successfully.
Oct  2 08:26:23 np0005466012 nova_compute[192063]: 2025-10-02 12:26:23.011 2 DEBUG nova.network.neutron [-] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:26:23 np0005466012 nova_compute[192063]: 2025-10-02 12:26:23.033 2 INFO nova.compute.manager [-] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Took 1.07 seconds to deallocate network for instance.#033[00m
Oct  2 08:26:23 np0005466012 nova_compute[192063]: 2025-10-02 12:26:23.136 2 DEBUG oslo_concurrency.lockutils [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:23 np0005466012 nova_compute[192063]: 2025-10-02 12:26:23.137 2 DEBUG oslo_concurrency.lockutils [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:23 np0005466012 nova_compute[192063]: 2025-10-02 12:26:23.155 2 DEBUG oslo_concurrency.lockutils [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.017s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:23 np0005466012 nova_compute[192063]: 2025-10-02 12:26:23.562 2 INFO nova.scheduler.client.report [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Deleted allocations for instance 595aea98-0c3e-45c9-81fe-4643f44fe8d3#033[00m
Oct  2 08:26:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:23.875 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:25 np0005466012 nova_compute[192063]: 2025-10-02 12:26:25.459 2 DEBUG nova.compute.manager [req-c12d7363-2c48-43a8-b7cf-b657413ebd97 req-2c5d61ed-90f3-4bae-8fcb-f97a1bccf697 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-unplugged-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:25 np0005466012 nova_compute[192063]: 2025-10-02 12:26:25.460 2 DEBUG oslo_concurrency.lockutils [req-c12d7363-2c48-43a8-b7cf-b657413ebd97 req-2c5d61ed-90f3-4bae-8fcb-f97a1bccf697 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:25 np0005466012 nova_compute[192063]: 2025-10-02 12:26:25.460 2 DEBUG oslo_concurrency.lockutils [req-c12d7363-2c48-43a8-b7cf-b657413ebd97 req-2c5d61ed-90f3-4bae-8fcb-f97a1bccf697 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:25 np0005466012 nova_compute[192063]: 2025-10-02 12:26:25.460 2 DEBUG oslo_concurrency.lockutils [req-c12d7363-2c48-43a8-b7cf-b657413ebd97 req-2c5d61ed-90f3-4bae-8fcb-f97a1bccf697 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:25 np0005466012 nova_compute[192063]: 2025-10-02 12:26:25.461 2 DEBUG nova.compute.manager [req-c12d7363-2c48-43a8-b7cf-b657413ebd97 req-2c5d61ed-90f3-4bae-8fcb-f97a1bccf697 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] No waiting events found dispatching network-vif-unplugged-375468b9-b213-41ae-87ca-ea569359bdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:25 np0005466012 nova_compute[192063]: 2025-10-02 12:26:25.461 2 WARNING nova.compute.manager [req-c12d7363-2c48-43a8-b7cf-b657413ebd97 req-2c5d61ed-90f3-4bae-8fcb-f97a1bccf697 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received unexpected event network-vif-unplugged-375468b9-b213-41ae-87ca-ea569359bdb6 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:26:25 np0005466012 nova_compute[192063]: 2025-10-02 12:26:25.461 2 DEBUG nova.compute.manager [req-c12d7363-2c48-43a8-b7cf-b657413ebd97 req-2c5d61ed-90f3-4bae-8fcb-f97a1bccf697 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:25 np0005466012 nova_compute[192063]: 2025-10-02 12:26:25.462 2 DEBUG oslo_concurrency.lockutils [req-c12d7363-2c48-43a8-b7cf-b657413ebd97 req-2c5d61ed-90f3-4bae-8fcb-f97a1bccf697 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:25 np0005466012 nova_compute[192063]: 2025-10-02 12:26:25.462 2 DEBUG oslo_concurrency.lockutils [req-c12d7363-2c48-43a8-b7cf-b657413ebd97 req-2c5d61ed-90f3-4bae-8fcb-f97a1bccf697 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:25 np0005466012 nova_compute[192063]: 2025-10-02 12:26:25.462 2 DEBUG oslo_concurrency.lockutils [req-c12d7363-2c48-43a8-b7cf-b657413ebd97 req-2c5d61ed-90f3-4bae-8fcb-f97a1bccf697 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:25 np0005466012 nova_compute[192063]: 2025-10-02 12:26:25.462 2 DEBUG nova.compute.manager [req-c12d7363-2c48-43a8-b7cf-b657413ebd97 req-2c5d61ed-90f3-4bae-8fcb-f97a1bccf697 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] No waiting events found dispatching network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:25 np0005466012 nova_compute[192063]: 2025-10-02 12:26:25.463 2 WARNING nova.compute.manager [req-c12d7363-2c48-43a8-b7cf-b657413ebd97 req-2c5d61ed-90f3-4bae-8fcb-f97a1bccf697 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received unexpected event network-vif-plugged-375468b9-b213-41ae-87ca-ea569359bdb6 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:26:25 np0005466012 nova_compute[192063]: 2025-10-02 12:26:25.463 2 DEBUG nova.compute.manager [req-c12d7363-2c48-43a8-b7cf-b657413ebd97 req-2c5d61ed-90f3-4bae-8fcb-f97a1bccf697 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Received event network-vif-deleted-375468b9-b213-41ae-87ca-ea569359bdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.049 2 DEBUG oslo_concurrency.lockutils [None req-f0160f9c-7096-43ee-b136-1bcfa9d01461 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "595aea98-0c3e-45c9-81fe-4643f44fe8d3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.959s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.350 2 INFO nova.virt.libvirt.driver [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Snapshot image upload complete#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.351 2 INFO nova.compute.manager [None req-a199dc7c-75e7-4ce5-9db1-8a27dabc05e7 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Took 8.12 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.445 2 DEBUG oslo_concurrency.lockutils [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.445 2 DEBUG oslo_concurrency.lockutils [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.446 2 DEBUG oslo_concurrency.lockutils [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.446 2 DEBUG oslo_concurrency.lockutils [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.446 2 DEBUG oslo_concurrency.lockutils [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.467 2 INFO nova.compute.manager [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Terminating instance#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.490 2 DEBUG nova.compute.manager [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:26:26 np0005466012 kernel: tap622b7e17-6a (unregistering): left promiscuous mode
Oct  2 08:26:26 np0005466012 NetworkManager[51207]: <info>  [1759407986.5164] device (tap622b7e17-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:26 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:26Z|00469|binding|INFO|Releasing lport 622b7e17-6a86-4876-8e8a-6b40367f483e from this chassis (sb_readonly=0)
Oct  2 08:26:26 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:26Z|00470|binding|INFO|Setting lport 622b7e17-6a86-4876-8e8a-6b40367f483e down in Southbound
Oct  2 08:26:26 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:26Z|00471|binding|INFO|Removing iface tap622b7e17-6a ovn-installed in OVS
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.543 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:63:1b 10.100.0.5 2001:db8::f816:3eff:febc:631b'], port_security=['fa:16:3e:bc:63:1b 10.100.0.5 2001:db8::f816:3eff:febc:631b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8::f816:3eff:febc:631b/64', 'neutron:device_id': '97dd79e2-9bf5-47c4-8c3f-fa70335c3d37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-26df2dcf-f57c-4dae-8522-0277df741ed3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2c57d713-64e3-4621-a624-32092d283319', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e2784fb0-50ac-4c91-ba90-3b5c38b8adf4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=622b7e17-6a86-4876-8e8a-6b40367f483e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.546 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 622b7e17-6a86-4876-8e8a-6b40367f483e in datapath 26df2dcf-f57c-4dae-8522-0277df741ed3 unbound from our chassis#033[00m
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.550 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 26df2dcf-f57c-4dae-8522-0277df741ed3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.553 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8e72a6c0-338e-45a9-a399-80cb1c820767]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.554 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3 namespace which is not needed anymore#033[00m
Oct  2 08:26:26 np0005466012 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000078.scope: Deactivated successfully.
Oct  2 08:26:26 np0005466012 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000078.scope: Consumed 16.973s CPU time.
Oct  2 08:26:26 np0005466012 systemd-machined[152114]: Machine qemu-55-instance-00000078 terminated.
Oct  2 08:26:26 np0005466012 neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3[238325]: [NOTICE]   (238353) : haproxy version is 2.8.14-c23fe91
Oct  2 08:26:26 np0005466012 neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3[238325]: [NOTICE]   (238353) : path to executable is /usr/sbin/haproxy
Oct  2 08:26:26 np0005466012 neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3[238325]: [WARNING]  (238353) : Exiting Master process...
Oct  2 08:26:26 np0005466012 neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3[238325]: [WARNING]  (238353) : Exiting Master process...
Oct  2 08:26:26 np0005466012 neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3[238325]: [ALERT]    (238353) : Current worker (238359) exited with code 143 (Terminated)
Oct  2 08:26:26 np0005466012 neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3[238325]: [WARNING]  (238353) : All workers exited. Exiting... (0)
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:26 np0005466012 systemd[1]: libpod-307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9.scope: Deactivated successfully.
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:26 np0005466012 podman[239200]: 2025-10-02 12:26:26.726007847 +0000 UTC m=+0.071002196 container died 307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:26:26 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9-userdata-shm.mount: Deactivated successfully.
Oct  2 08:26:26 np0005466012 systemd[1]: var-lib-containers-storage-overlay-5bb00d145ae25c06a3fcc867eafb665e3abc7da065d3e2415c3841c7e929eaee-merged.mount: Deactivated successfully.
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.762 2 INFO nova.virt.libvirt.driver [-] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Instance destroyed successfully.#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.764 2 DEBUG nova.objects.instance [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'resources' on Instance uuid 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:26 np0005466012 podman[239200]: 2025-10-02 12:26:26.765116852 +0000 UTC m=+0.110111201 container cleanup 307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:26:26 np0005466012 systemd[1]: libpod-conmon-307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9.scope: Deactivated successfully.
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.791 2 DEBUG nova.virt.libvirt.vif [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:25:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1098429742',display_name='tempest-TestGettingAddress-server-1098429742',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1098429742',id=120,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN27qqZO7DS6SotTIkgadWOrlyFzalcMBya6l3P3FHA92Trdk8QzNk/bIfeVZHQyyH9bzXdJACR3sdrkH4czxiQm1W3dnbgCG/vLQtAxveP29c1TkzsAJfjG23nfB+bI6Q==',key_name='tempest-TestGettingAddress-794970227',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:25:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-2b2xc028',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:25:13Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=97dd79e2-9bf5-47c4-8c3f-fa70335c3d37,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "622b7e17-6a86-4876-8e8a-6b40367f483e", "address": "fa:16:3e:bc:63:1b", "network": {"id": "26df2dcf-f57c-4dae-8522-0277df741ed3", "bridge": "br-int", "label": "tempest-network-smoke--1584637508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febc:631b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap622b7e17-6a", "ovs_interfaceid": "622b7e17-6a86-4876-8e8a-6b40367f483e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.791 2 DEBUG nova.network.os_vif_util [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "622b7e17-6a86-4876-8e8a-6b40367f483e", "address": "fa:16:3e:bc:63:1b", "network": {"id": "26df2dcf-f57c-4dae-8522-0277df741ed3", "bridge": "br-int", "label": "tempest-network-smoke--1584637508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febc:631b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap622b7e17-6a", "ovs_interfaceid": "622b7e17-6a86-4876-8e8a-6b40367f483e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.792 2 DEBUG nova.network.os_vif_util [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bc:63:1b,bridge_name='br-int',has_traffic_filtering=True,id=622b7e17-6a86-4876-8e8a-6b40367f483e,network=Network(26df2dcf-f57c-4dae-8522-0277df741ed3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap622b7e17-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.793 2 DEBUG os_vif [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bc:63:1b,bridge_name='br-int',has_traffic_filtering=True,id=622b7e17-6a86-4876-8e8a-6b40367f483e,network=Network(26df2dcf-f57c-4dae-8522-0277df741ed3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap622b7e17-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.795 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap622b7e17-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.802 2 INFO os_vif [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bc:63:1b,bridge_name='br-int',has_traffic_filtering=True,id=622b7e17-6a86-4876-8e8a-6b40367f483e,network=Network(26df2dcf-f57c-4dae-8522-0277df741ed3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap622b7e17-6a')#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.802 2 INFO nova.virt.libvirt.driver [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Deleting instance files /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37_del#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.803 2 INFO nova.virt.libvirt.driver [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Deletion of /var/lib/nova/instances/97dd79e2-9bf5-47c4-8c3f-fa70335c3d37_del complete#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:26 np0005466012 podman[239244]: 2025-10-02 12:26:26.823320622 +0000 UTC m=+0.037992045 container remove 307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.828 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[06227b4e-fa6c-4631-9654-ae4a3e6adea6]: (4, ('Thu Oct  2 12:26:26 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3 (307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9)\n307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9\nThu Oct  2 12:26:26 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3 (307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9)\n307d1a180bd8dbf5a098fc1cb0e4d05b498583fd2b753c4196cfbbd31dc453d9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.830 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[750b5634-c774-45cc-974c-023e6c70a1f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.830 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap26df2dcf-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:26 np0005466012 kernel: tap26df2dcf-f0: left promiscuous mode
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:26 np0005466012 nova_compute[192063]: 2025-10-02 12:26:26.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.847 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b66eecef-1a68-49eb-9d23-dfc98dec713e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.875 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cddd8f62-6b33-477e-9972-00424e71fb4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.880 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0ab54ea5-7ad5-4468-b80e-e65d759c4e42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.901 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[66914d78-a125-4759-93ca-35ec4e759e73]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590663, 'reachable_time': 30132, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239259, 'error': None, 'target': 'ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.907 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-26df2dcf-f57c-4dae-8522-0277df741ed3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:26:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:26.907 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[29c43929-531b-47bb-9529-4427820f5fc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:26 np0005466012 systemd[1]: run-netns-ovnmeta\x2d26df2dcf\x2df57c\x2d4dae\x2d8522\x2d0277df741ed3.mount: Deactivated successfully.
Oct  2 08:26:27 np0005466012 nova_compute[192063]: 2025-10-02 12:26:27.008 2 INFO nova.compute.manager [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Took 0.52 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:26:27 np0005466012 nova_compute[192063]: 2025-10-02 12:26:27.010 2 DEBUG oslo.service.loopingcall [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:26:27 np0005466012 nova_compute[192063]: 2025-10-02 12:26:27.010 2 DEBUG nova.compute.manager [-] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:26:27 np0005466012 nova_compute[192063]: 2025-10-02 12:26:27.011 2 DEBUG nova.network.neutron [-] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:26:27 np0005466012 nova_compute[192063]: 2025-10-02 12:26:27.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:27 np0005466012 nova_compute[192063]: 2025-10-02 12:26:27.765 2 DEBUG nova.compute.manager [req-228b403e-308c-468f-9b29-039735811b25 req-c455a825-5757-417f-9eab-264042cce72c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Received event network-changed-622b7e17-6a86-4876-8e8a-6b40367f483e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:27 np0005466012 nova_compute[192063]: 2025-10-02 12:26:27.766 2 DEBUG nova.compute.manager [req-228b403e-308c-468f-9b29-039735811b25 req-c455a825-5757-417f-9eab-264042cce72c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Refreshing instance network info cache due to event network-changed-622b7e17-6a86-4876-8e8a-6b40367f483e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:26:27 np0005466012 nova_compute[192063]: 2025-10-02 12:26:27.767 2 DEBUG oslo_concurrency.lockutils [req-228b403e-308c-468f-9b29-039735811b25 req-c455a825-5757-417f-9eab-264042cce72c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:26:27 np0005466012 nova_compute[192063]: 2025-10-02 12:26:27.767 2 DEBUG oslo_concurrency.lockutils [req-228b403e-308c-468f-9b29-039735811b25 req-c455a825-5757-417f-9eab-264042cce72c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:26:27 np0005466012 nova_compute[192063]: 2025-10-02 12:26:27.768 2 DEBUG nova.network.neutron [req-228b403e-308c-468f-9b29-039735811b25 req-c455a825-5757-417f-9eab-264042cce72c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Refreshing network info cache for port 622b7e17-6a86-4876-8e8a-6b40367f483e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:26:28 np0005466012 nova_compute[192063]: 2025-10-02 12:26:28.034 2 DEBUG nova.compute.manager [req-d7528eca-704f-4e8c-9063-3dbd93e0213d req-78332421-e173-4605-824e-441dca5bf153 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Received event network-vif-unplugged-622b7e17-6a86-4876-8e8a-6b40367f483e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:28 np0005466012 nova_compute[192063]: 2025-10-02 12:26:28.034 2 DEBUG oslo_concurrency.lockutils [req-d7528eca-704f-4e8c-9063-3dbd93e0213d req-78332421-e173-4605-824e-441dca5bf153 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:28 np0005466012 nova_compute[192063]: 2025-10-02 12:26:28.035 2 DEBUG oslo_concurrency.lockutils [req-d7528eca-704f-4e8c-9063-3dbd93e0213d req-78332421-e173-4605-824e-441dca5bf153 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:28 np0005466012 nova_compute[192063]: 2025-10-02 12:26:28.035 2 DEBUG oslo_concurrency.lockutils [req-d7528eca-704f-4e8c-9063-3dbd93e0213d req-78332421-e173-4605-824e-441dca5bf153 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:28 np0005466012 nova_compute[192063]: 2025-10-02 12:26:28.035 2 DEBUG nova.compute.manager [req-d7528eca-704f-4e8c-9063-3dbd93e0213d req-78332421-e173-4605-824e-441dca5bf153 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] No waiting events found dispatching network-vif-unplugged-622b7e17-6a86-4876-8e8a-6b40367f483e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:28 np0005466012 nova_compute[192063]: 2025-10-02 12:26:28.035 2 DEBUG nova.compute.manager [req-d7528eca-704f-4e8c-9063-3dbd93e0213d req-78332421-e173-4605-824e-441dca5bf153 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Received event network-vif-unplugged-622b7e17-6a86-4876-8e8a-6b40367f483e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:26:29 np0005466012 nova_compute[192063]: 2025-10-02 12:26:29.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:30 np0005466012 nova_compute[192063]: 2025-10-02 12:26:30.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:30 np0005466012 nova_compute[192063]: 2025-10-02 12:26:30.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.061 2 INFO nova.compute.manager [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Rescuing#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.062 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.062 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquired lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.062 2 DEBUG nova.network.neutron [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.097 2 DEBUG nova.compute.manager [req-53df0e8e-dea2-4ff7-baac-137d0461edea req-6611dde2-8fc2-47fe-ad5d-418e4a2970c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Received event network-vif-plugged-622b7e17-6a86-4876-8e8a-6b40367f483e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.097 2 DEBUG oslo_concurrency.lockutils [req-53df0e8e-dea2-4ff7-baac-137d0461edea req-6611dde2-8fc2-47fe-ad5d-418e4a2970c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.098 2 DEBUG oslo_concurrency.lockutils [req-53df0e8e-dea2-4ff7-baac-137d0461edea req-6611dde2-8fc2-47fe-ad5d-418e4a2970c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.098 2 DEBUG oslo_concurrency.lockutils [req-53df0e8e-dea2-4ff7-baac-137d0461edea req-6611dde2-8fc2-47fe-ad5d-418e4a2970c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.098 2 DEBUG nova.compute.manager [req-53df0e8e-dea2-4ff7-baac-137d0461edea req-6611dde2-8fc2-47fe-ad5d-418e4a2970c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] No waiting events found dispatching network-vif-plugged-622b7e17-6a86-4876-8e8a-6b40367f483e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.098 2 WARNING nova.compute.manager [req-53df0e8e-dea2-4ff7-baac-137d0461edea req-6611dde2-8fc2-47fe-ad5d-418e4a2970c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Received unexpected event network-vif-plugged-622b7e17-6a86-4876-8e8a-6b40367f483e for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.142 2 DEBUG nova.network.neutron [-] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:26:31 np0005466012 podman[239260]: 2025-10-02 12:26:31.145510746 +0000 UTC m=+0.065968912 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:26:31 np0005466012 podman[239261]: 2025-10-02 12:26:31.189664665 +0000 UTC m=+0.105793957 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.191 2 INFO nova.compute.manager [-] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Took 4.18 seconds to deallocate network for instance.#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.460 2 DEBUG oslo_concurrency.lockutils [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.461 2 DEBUG oslo_concurrency.lockutils [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.499 2 DEBUG nova.scheduler.client.report [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.529 2 DEBUG nova.scheduler.client.report [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.530 2 DEBUG nova.compute.provider_tree [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.547 2 DEBUG nova.scheduler.client.report [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.596 2 DEBUG nova.scheduler.client.report [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.655 2 DEBUG nova.compute.provider_tree [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.694 2 DEBUG nova.scheduler.client.report [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.734 2 DEBUG oslo_concurrency.lockutils [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.773 2 INFO nova.scheduler.client.report [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Deleted allocations for instance 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.875 2 DEBUG oslo_concurrency.lockutils [None req-07902022-c95c-4568-9285-1e0c57990a16 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.879 2 DEBUG nova.network.neutron [req-228b403e-308c-468f-9b29-039735811b25 req-c455a825-5757-417f-9eab-264042cce72c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Updated VIF entry in instance network info cache for port 622b7e17-6a86-4876-8e8a-6b40367f483e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.879 2 DEBUG nova.network.neutron [req-228b403e-308c-468f-9b29-039735811b25 req-c455a825-5757-417f-9eab-264042cce72c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Updating instance_info_cache with network_info: [{"id": "622b7e17-6a86-4876-8e8a-6b40367f483e", "address": "fa:16:3e:bc:63:1b", "network": {"id": "26df2dcf-f57c-4dae-8522-0277df741ed3", "bridge": "br-int", "label": "tempest-network-smoke--1584637508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febc:631b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap622b7e17-6a", "ovs_interfaceid": "622b7e17-6a86-4876-8e8a-6b40367f483e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:26:31 np0005466012 nova_compute[192063]: 2025-10-02 12:26:31.916 2 DEBUG oslo_concurrency.lockutils [req-228b403e-308c-468f-9b29-039735811b25 req-c455a825-5757-417f-9eab-264042cce72c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-97dd79e2-9bf5-47c4-8c3f-fa70335c3d37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:26:32 np0005466012 nova_compute[192063]: 2025-10-02 12:26:32.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:33 np0005466012 nova_compute[192063]: 2025-10-02 12:26:33.768 2 DEBUG nova.compute.manager [req-83aab4e4-e7b4-4ec8-990a-2f8f7817a9a6 req-b1a85451-1fcb-481f-80a4-44b004b86261 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Received event network-vif-deleted-622b7e17-6a86-4876-8e8a-6b40367f483e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:33 np0005466012 nova_compute[192063]: 2025-10-02 12:26:33.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:33 np0005466012 nova_compute[192063]: 2025-10-02 12:26:33.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:33 np0005466012 nova_compute[192063]: 2025-10-02 12:26:33.821 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:26:34 np0005466012 nova_compute[192063]: 2025-10-02 12:26:34.176 2 DEBUG nova.network.neutron [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Updating instance_info_cache with network_info: [{"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:26:34 np0005466012 nova_compute[192063]: 2025-10-02 12:26:34.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:35 np0005466012 nova_compute[192063]: 2025-10-02 12:26:35.079 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Releasing lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:26:35 np0005466012 podman[239312]: 2025-10-02 12:26:35.130596967 +0000 UTC m=+0.047927608 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  2 08:26:35 np0005466012 podman[239311]: 2025-10-02 12:26:35.137889185 +0000 UTC m=+0.058978003 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Oct  2 08:26:35 np0005466012 nova_compute[192063]: 2025-10-02 12:26:35.392 2 DEBUG nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:26:35 np0005466012 nova_compute[192063]: 2025-10-02 12:26:35.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:35 np0005466012 nova_compute[192063]: 2025-10-02 12:26:35.869 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:35 np0005466012 nova_compute[192063]: 2025-10-02 12:26:35.869 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:35 np0005466012 nova_compute[192063]: 2025-10-02 12:26:35.869 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:35 np0005466012 nova_compute[192063]: 2025-10-02 12:26:35.869 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.027 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.092 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.094 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.149 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.290 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.291 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5526MB free_disk=73.28334045410156GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.291 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.291 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.414 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance de7f4178-00ba-409b-81ad-f6096e9ed144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.414 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.414 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.461 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.502 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.544 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.544 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.831 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407981.8297899, 595aea98-0c3e-45c9-81fe-4643f44fe8d3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.831 2 INFO nova.compute.manager [-] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:26:36 np0005466012 nova_compute[192063]: 2025-10-02 12:26:36.870 2 DEBUG nova.compute.manager [None req-861accea-58e6-4bb3-87a6-95fe1aed01a4 - - - - - -] [instance: 595aea98-0c3e-45c9-81fe-4643f44fe8d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:37 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:37Z|00472|binding|INFO|Releasing lport c9a9afa9-78de-46a4-a649-8b8779924189 from this chassis (sb_readonly=0)
Oct  2 08:26:37 np0005466012 nova_compute[192063]: 2025-10-02 12:26:37.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:37 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:37Z|00473|binding|INFO|Releasing lport c9a9afa9-78de-46a4-a649-8b8779924189 from this chassis (sb_readonly=0)
Oct  2 08:26:37 np0005466012 nova_compute[192063]: 2025-10-02 12:26:37.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:37 np0005466012 nova_compute[192063]: 2025-10-02 12:26:37.545 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:37 np0005466012 kernel: tap23e93cfb-aa (unregistering): left promiscuous mode
Oct  2 08:26:37 np0005466012 NetworkManager[51207]: <info>  [1759407997.5676] device (tap23e93cfb-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:26:37 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:37Z|00474|binding|INFO|Releasing lport 23e93cfb-aa99-4427-8af6-c199677a54ec from this chassis (sb_readonly=0)
Oct  2 08:26:37 np0005466012 nova_compute[192063]: 2025-10-02 12:26:37.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:37 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:37Z|00475|binding|INFO|Setting lport 23e93cfb-aa99-4427-8af6-c199677a54ec down in Southbound
Oct  2 08:26:37 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:37Z|00476|binding|INFO|Removing iface tap23e93cfb-aa ovn-installed in OVS
Oct  2 08:26:37 np0005466012 nova_compute[192063]: 2025-10-02 12:26:37.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:37 np0005466012 nova_compute[192063]: 2025-10-02 12:26:37.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.599 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:a5:17 10.100.0.13'], port_security=['fa:16:3e:ca:a5:17 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=23e93cfb-aa99-4427-8af6-c199677a54ec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.601 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 23e93cfb-aa99-4427-8af6-c199677a54ec in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a unbound from our chassis#033[00m
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.603 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aa4ebb90-ef5e-4974-a53d-2aabd696731a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.604 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1c126886-0ec0-4b57-b6b7-b6492cd81b22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.604 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a namespace which is not needed anymore#033[00m
Oct  2 08:26:37 np0005466012 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Oct  2 08:26:37 np0005466012 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007c.scope: Consumed 14.653s CPU time.
Oct  2 08:26:37 np0005466012 systemd-machined[152114]: Machine qemu-57-instance-0000007c terminated.
Oct  2 08:26:37 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238836]: [NOTICE]   (238840) : haproxy version is 2.8.14-c23fe91
Oct  2 08:26:37 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238836]: [NOTICE]   (238840) : path to executable is /usr/sbin/haproxy
Oct  2 08:26:37 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238836]: [WARNING]  (238840) : Exiting Master process...
Oct  2 08:26:37 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238836]: [ALERT]    (238840) : Current worker (238842) exited with code 143 (Terminated)
Oct  2 08:26:37 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[238836]: [WARNING]  (238840) : All workers exited. Exiting... (0)
Oct  2 08:26:37 np0005466012 systemd[1]: libpod-bbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d.scope: Deactivated successfully.
Oct  2 08:26:37 np0005466012 podman[239382]: 2025-10-02 12:26:37.748296816 +0000 UTC m=+0.049895555 container died bbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:26:37 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d-userdata-shm.mount: Deactivated successfully.
Oct  2 08:26:37 np0005466012 systemd[1]: var-lib-containers-storage-overlay-4435793fb3cf8586cfcbcd08ff3d7dbb714a9c6d85e7af6d60dd0d1b32dbdf7d-merged.mount: Deactivated successfully.
Oct  2 08:26:37 np0005466012 podman[239382]: 2025-10-02 12:26:37.786502304 +0000 UTC m=+0.088101043 container cleanup bbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:26:37 np0005466012 systemd[1]: libpod-conmon-bbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d.scope: Deactivated successfully.
Oct  2 08:26:37 np0005466012 podman[239412]: 2025-10-02 12:26:37.883617904 +0000 UTC m=+0.073989141 container remove bbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.892 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ab09a27f-a0e1-4a84-924d-9dae3622deb8]: (4, ('Thu Oct  2 12:26:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a (bbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d)\nbbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d\nThu Oct  2 12:26:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a (bbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d)\nbbc9742ae91038d0e567e32e602c7617f2ea02dbb1ae80a41c8edce3208fe17d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.894 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9fe8500c-7695-442c-9062-f73932d6529f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.895 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:37 np0005466012 nova_compute[192063]: 2025-10-02 12:26:37.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:37 np0005466012 kernel: tapaa4ebb90-e0: left promiscuous mode
Oct  2 08:26:37 np0005466012 nova_compute[192063]: 2025-10-02 12:26:37.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.918 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[998bfcd6-3516-4f67-819b-647383a8ad19]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.949 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[31ac7873-4134-48ea-93f9-41fc2bb0a891]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.951 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b74027ef-6763-4c9d-98f7-4b2d6095fd80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.964 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6fddbc25-cc41-4e98-affb-3d4c97be79ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595504, 'reachable_time': 33697, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239447, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.969 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:26:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:37.970 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[85424cbd-edfd-4518-b15e-4cfb9d855c55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:37 np0005466012 systemd[1]: run-netns-ovnmeta\x2daa4ebb90\x2def5e\x2d4974\x2da53d\x2d2aabd696731a.mount: Deactivated successfully.
Oct  2 08:26:38 np0005466012 nova_compute[192063]: 2025-10-02 12:26:38.408 2 INFO nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:26:38 np0005466012 nova_compute[192063]: 2025-10-02 12:26:38.413 2 INFO nova.virt.libvirt.driver [-] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Instance destroyed successfully.#033[00m
Oct  2 08:26:38 np0005466012 nova_compute[192063]: 2025-10-02 12:26:38.413 2 DEBUG nova.objects.instance [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'numa_topology' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:41 np0005466012 nova_compute[192063]: 2025-10-02 12:26:41.762 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407986.7605166, 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:41 np0005466012 nova_compute[192063]: 2025-10-02 12:26:41.762 2 INFO nova.compute.manager [-] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:26:41 np0005466012 nova_compute[192063]: 2025-10-02 12:26:41.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:42 np0005466012 nova_compute[192063]: 2025-10-02 12:26:42.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:42 np0005466012 nova_compute[192063]: 2025-10-02 12:26:42.243 2 INFO nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Attempting a stable device rescue#033[00m
Oct  2 08:26:42 np0005466012 nova_compute[192063]: 2025-10-02 12:26:42.667 2 DEBUG nova.compute.manager [req-83b4d88d-2c41-477f-8619-634b7a6b6947 req-f5742659-24a9-4ce1-958e-d7c90c65f894 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-unplugged-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:42 np0005466012 nova_compute[192063]: 2025-10-02 12:26:42.668 2 DEBUG oslo_concurrency.lockutils [req-83b4d88d-2c41-477f-8619-634b7a6b6947 req-f5742659-24a9-4ce1-958e-d7c90c65f894 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:42 np0005466012 nova_compute[192063]: 2025-10-02 12:26:42.668 2 DEBUG oslo_concurrency.lockutils [req-83b4d88d-2c41-477f-8619-634b7a6b6947 req-f5742659-24a9-4ce1-958e-d7c90c65f894 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:42 np0005466012 nova_compute[192063]: 2025-10-02 12:26:42.668 2 DEBUG oslo_concurrency.lockutils [req-83b4d88d-2c41-477f-8619-634b7a6b6947 req-f5742659-24a9-4ce1-958e-d7c90c65f894 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:42 np0005466012 nova_compute[192063]: 2025-10-02 12:26:42.668 2 DEBUG nova.compute.manager [req-83b4d88d-2c41-477f-8619-634b7a6b6947 req-f5742659-24a9-4ce1-958e-d7c90c65f894 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] No waiting events found dispatching network-vif-unplugged-23e93cfb-aa99-4427-8af6-c199677a54ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:42 np0005466012 nova_compute[192063]: 2025-10-02 12:26:42.669 2 WARNING nova.compute.manager [req-83b4d88d-2c41-477f-8619-634b7a6b6947 req-f5742659-24a9-4ce1-958e-d7c90c65f894 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received unexpected event network-vif-unplugged-23e93cfb-aa99-4427-8af6-c199677a54ec for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:26:42 np0005466012 nova_compute[192063]: 2025-10-02 12:26:42.681 2 DEBUG nova.compute.manager [None req-592e7047-6b7c-449c-926c-5f2dd7fc57c6 - - - - - -] [instance: 97dd79e2-9bf5-47c4-8c3f-fa70335c3d37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:42 np0005466012 nova_compute[192063]: 2025-10-02 12:26:42.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:26:42 np0005466012 nova_compute[192063]: 2025-10-02 12:26:42.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:26:42 np0005466012 nova_compute[192063]: 2025-10-02 12:26:42.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:26:43 np0005466012 podman[239448]: 2025-10-02 12:26:43.148837154 +0000 UTC m=+0.063908982 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  2 08:26:43 np0005466012 podman[239449]: 2025-10-02 12:26:43.179216002 +0000 UTC m=+0.094416775 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=)
Oct  2 08:26:43 np0005466012 nova_compute[192063]: 2025-10-02 12:26:43.349 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:26:43 np0005466012 nova_compute[192063]: 2025-10-02 12:26:43.350 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:26:43 np0005466012 nova_compute[192063]: 2025-10-02 12:26:43.351 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:26:43 np0005466012 nova_compute[192063]: 2025-10-02 12:26:43.351 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:46 np0005466012 podman[239486]: 2025-10-02 12:26:46.159365877 +0000 UTC m=+0.075976836 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, container_name=iscsid)
Oct  2 08:26:46 np0005466012 podman[239487]: 2025-10-02 12:26:46.167650825 +0000 UTC m=+0.083244296 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:26:46 np0005466012 nova_compute[192063]: 2025-10-02 12:26:46.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:47 np0005466012 nova_compute[192063]: 2025-10-02 12:26:47.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.272 2 DEBUG nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.278 2 DEBUG nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.279 2 INFO nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Creating image(s)#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.280 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.280 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.281 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.281 2 DEBUG nova.objects.instance [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'trusted_certs' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.323 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "1e3186a2e8f789d59bb3974b363889023224d3d8" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.324 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "1e3186a2e8f789d59bb3974b363889023224d3d8" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.746 2 DEBUG nova.compute.manager [req-9af824d5-f78b-41c8-b849-a642ec065d65 req-47099a3d-bb6b-4c2b-84cf-c4d255cb465c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.747 2 DEBUG oslo_concurrency.lockutils [req-9af824d5-f78b-41c8-b849-a642ec065d65 req-47099a3d-bb6b-4c2b-84cf-c4d255cb465c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.747 2 DEBUG oslo_concurrency.lockutils [req-9af824d5-f78b-41c8-b849-a642ec065d65 req-47099a3d-bb6b-4c2b-84cf-c4d255cb465c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.748 2 DEBUG oslo_concurrency.lockutils [req-9af824d5-f78b-41c8-b849-a642ec065d65 req-47099a3d-bb6b-4c2b-84cf-c4d255cb465c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.748 2 DEBUG nova.compute.manager [req-9af824d5-f78b-41c8-b849-a642ec065d65 req-47099a3d-bb6b-4c2b-84cf-c4d255cb465c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] No waiting events found dispatching network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:48 np0005466012 nova_compute[192063]: 2025-10-02 12:26:48.749 2 WARNING nova.compute.manager [req-9af824d5-f78b-41c8-b849-a642ec065d65 req-47099a3d-bb6b-4c2b-84cf-c4d255cb465c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received unexpected event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec for instance with vm_state active and task_state rescuing.#033[00m
Oct  2 08:26:50 np0005466012 nova_compute[192063]: 2025-10-02 12:26:50.520 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Updating instance_info_cache with network_info: [{"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:26:50 np0005466012 nova_compute[192063]: 2025-10-02 12:26:50.790 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:26:50 np0005466012 nova_compute[192063]: 2025-10-02 12:26:50.791 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:26:51 np0005466012 nova_compute[192063]: 2025-10-02 12:26:51.501 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:51 np0005466012 nova_compute[192063]: 2025-10-02 12:26:51.563 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8.part --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:51 np0005466012 nova_compute[192063]: 2025-10-02 12:26:51.564 2 DEBUG nova.virt.images [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] c999e5c3-e7c0-4dcf-b6a7-7651e0e24a3c was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  2 08:26:51 np0005466012 nova_compute[192063]: 2025-10-02 12:26:51.566 2 DEBUG nova.privsep.utils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:26:51 np0005466012 nova_compute[192063]: 2025-10-02 12:26:51.567 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8.part /var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:51 np0005466012 nova_compute[192063]: 2025-10-02 12:26:51.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:51 np0005466012 nova_compute[192063]: 2025-10-02 12:26:51.825 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8.part /var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8.converted" returned: 0 in 0.258s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:51 np0005466012 nova_compute[192063]: 2025-10-02 12:26:51.835 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:51 np0005466012 nova_compute[192063]: 2025-10-02 12:26:51.907 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8.converted --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:51 np0005466012 nova_compute[192063]: 2025-10-02 12:26:51.909 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "1e3186a2e8f789d59bb3974b363889023224d3d8" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:51 np0005466012 nova_compute[192063]: 2025-10-02 12:26:51.936 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "1e3186a2e8f789d59bb3974b363889023224d3d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:51 np0005466012 nova_compute[192063]: 2025-10-02 12:26:51.938 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "1e3186a2e8f789d59bb3974b363889023224d3d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:51 np0005466012 nova_compute[192063]: 2025-10-02 12:26:51.953 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.024 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.026 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8,backing_fmt=raw /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.087 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8,backing_fmt=raw /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.rescue" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.089 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "1e3186a2e8f789d59bb3974b363889023224d3d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.090 2 DEBUG nova.objects.instance [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'migration_context' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.212 2 DEBUG nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.218 2 DEBUG nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Start _get_guest_xml network_info=[{"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "vif_mac": "fa:16:3e:ca:a5:17"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'c999e5c3-e7c0-4dcf-b6a7-7651e0e24a3c', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.260 2 DEBUG nova.objects.instance [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'resources' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.317 2 WARNING nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.324 2 DEBUG nova.virt.libvirt.host [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.325 2 DEBUG nova.virt.libvirt.host [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.329 2 DEBUG nova.virt.libvirt.host [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.330 2 DEBUG nova.virt.libvirt.host [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.331 2 DEBUG nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.332 2 DEBUG nova.virt.hardware [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.332 2 DEBUG nova.virt.hardware [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.332 2 DEBUG nova.virt.hardware [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.333 2 DEBUG nova.virt.hardware [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.333 2 DEBUG nova.virt.hardware [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.333 2 DEBUG nova.virt.hardware [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.333 2 DEBUG nova.virt.hardware [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.334 2 DEBUG nova.virt.hardware [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.334 2 DEBUG nova.virt.hardware [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.334 2 DEBUG nova.virt.hardware [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.334 2 DEBUG nova.virt.hardware [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.335 2 DEBUG nova.objects.instance [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'vcpu_model' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.380 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.476 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.config --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.477 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.478 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.479 2 DEBUG oslo_concurrency.lockutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.480 2 DEBUG nova.virt.libvirt.vif [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:25:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-934202400',display_name='tempest-ServerStableDeviceRescueTest-server-934202400',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-934202400',id=124,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:26:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='88e90c16adec46069b539d4f1431ab4d',ramdisk_id='',reservation_id='r-hf1wm990',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-232864240',owner_user_name='tempest-ServerStableDeviceRescueTest-232864240-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:26:26Z,user_data=None,user_id='abb9f220716e48d79dbe2f97622937c4',uuid=de7f4178-00ba-409b-81ad-f6096e9ed144,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "vif_mac": "fa:16:3e:ca:a5:17"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.481 2 DEBUG nova.network.os_vif_util [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converting VIF {"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "vif_mac": "fa:16:3e:ca:a5:17"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.482 2 DEBUG nova.network.os_vif_util [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ca:a5:17,bridge_name='br-int',has_traffic_filtering=True,id=23e93cfb-aa99-4427-8af6-c199677a54ec,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23e93cfb-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.483 2 DEBUG nova.objects.instance [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'pci_devices' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.532 2 DEBUG nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  <uuid>de7f4178-00ba-409b-81ad-f6096e9ed144</uuid>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  <name>instance-0000007c</name>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-934202400</nova:name>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:26:52</nova:creationTime>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:        <nova:user uuid="abb9f220716e48d79dbe2f97622937c4">tempest-ServerStableDeviceRescueTest-232864240-project-member</nova:user>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:        <nova:project uuid="88e90c16adec46069b539d4f1431ab4d">tempest-ServerStableDeviceRescueTest-232864240</nova:project>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:        <nova:port uuid="23e93cfb-aa99-4427-8af6-c199677a54ec">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <entry name="serial">de7f4178-00ba-409b-81ad-f6096e9ed144</entry>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <entry name="uuid">de7f4178-00ba-409b-81ad-f6096e9ed144</entry>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.config"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.rescue"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <target dev="sdb" bus="usb"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <boot order="1"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:ca:a5:17"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <target dev="tap23e93cfb-aa"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/console.log" append="off"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:26:52 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:26:52 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:26:52 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:26:52 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.542 2 INFO nova.virt.libvirt.driver [-] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Instance destroyed successfully.#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.625 2 DEBUG nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.626 2 DEBUG nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.626 2 DEBUG nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.626 2 DEBUG nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] No VIF found with MAC fa:16:3e:ca:a5:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.627 2 INFO nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Using config drive#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.647 2 DEBUG nova.objects.instance [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'ec2_ids' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.695 2 DEBUG nova.objects.instance [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'keypairs' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.862 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759407997.861187, de7f4178-00ba-409b-81ad-f6096e9ed144 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.863 2 INFO nova.compute.manager [-] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.883 2 DEBUG nova.compute.manager [None req-8cb48914-e4f0-4281-a675-ea7d8aff742a - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.888 2 DEBUG nova.compute.manager [None req-8cb48914-e4f0-4281-a675-ea7d8aff742a - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:26:52 np0005466012 nova_compute[192063]: 2025-10-02 12:26:52.912 2 INFO nova.compute.manager [None req-8cb48914-e4f0-4281-a675-ea7d8aff742a - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Oct  2 08:26:53 np0005466012 nova_compute[192063]: 2025-10-02 12:26:53.495 2 INFO nova.virt.libvirt.driver [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Creating config drive at /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.config.rescue#033[00m
Oct  2 08:26:53 np0005466012 nova_compute[192063]: 2025-10-02 12:26:53.506 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpouuj5cts execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:26:53 np0005466012 nova_compute[192063]: 2025-10-02 12:26:53.660 2 DEBUG oslo_concurrency.processutils [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpouuj5cts" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:26:53 np0005466012 kernel: tap23e93cfb-aa: entered promiscuous mode
Oct  2 08:26:53 np0005466012 NetworkManager[51207]: <info>  [1759408013.7523] manager: (tap23e93cfb-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/213)
Oct  2 08:26:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:53Z|00477|binding|INFO|Claiming lport 23e93cfb-aa99-4427-8af6-c199677a54ec for this chassis.
Oct  2 08:26:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:53Z|00478|binding|INFO|23e93cfb-aa99-4427-8af6-c199677a54ec: Claiming fa:16:3e:ca:a5:17 10.100.0.13
Oct  2 08:26:53 np0005466012 nova_compute[192063]: 2025-10-02 12:26:53.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:53Z|00479|binding|INFO|Setting lport 23e93cfb-aa99-4427-8af6-c199677a54ec ovn-installed in OVS
Oct  2 08:26:53 np0005466012 nova_compute[192063]: 2025-10-02 12:26:53.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:53 np0005466012 nova_compute[192063]: 2025-10-02 12:26:53.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:53 np0005466012 systemd-udevd[239568]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:26:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:53Z|00480|binding|INFO|Setting lport 23e93cfb-aa99-4427-8af6-c199677a54ec up in Southbound
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.791 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:a5:17 10.100.0.13'], port_security=['fa:16:3e:ca:a5:17 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=23e93cfb-aa99-4427-8af6-c199677a54ec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.793 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 23e93cfb-aa99-4427-8af6-c199677a54ec in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a bound to our chassis#033[00m
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.795 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aa4ebb90-ef5e-4974-a53d-2aabd696731a#033[00m
Oct  2 08:26:53 np0005466012 NetworkManager[51207]: <info>  [1759408013.8011] device (tap23e93cfb-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:26:53 np0005466012 NetworkManager[51207]: <info>  [1759408013.8022] device (tap23e93cfb-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:26:53 np0005466012 systemd-machined[152114]: New machine qemu-58-instance-0000007c.
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.818 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b74ff8-c776-4bb1-a6be-21414d5febab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.819 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaa4ebb90-e1 in ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.821 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaa4ebb90-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.821 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a7c35def-2951-4cd5-9244-c4ccc271192c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.823 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[27765993-a362-4018-8157-1834fdecbf8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:53 np0005466012 systemd[1]: Started Virtual Machine qemu-58-instance-0000007c.
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.859 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[266560ee-9c35-475c-922f-dd51c0e44667]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.879 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5774eb4a-1696-4697-a707-065e2dee649b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.926 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[0389372f-f636-4b29-96b5-2d5093f3390b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.932 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[54845892-162d-4268-aa8c-b1cef6044f97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:53 np0005466012 NetworkManager[51207]: <info>  [1759408013.9339] manager: (tapaa4ebb90-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/214)
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.967 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[2aaee8be-e32b-4c11-ac6b-672c55f3888d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:53 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:53.971 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[90f3271e-2eeb-40ce-b707-3489eea38f3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:53 np0005466012 NetworkManager[51207]: <info>  [1759408013.9939] device (tapaa4ebb90-e0): carrier: link connected
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.001 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[78ff9177-c476-47cc-bf70-b96fce89c8b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.024 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5b87932d-b3fc-4edd-97a4-8976b6e12730]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600760, 'reachable_time': 39678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239604, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.040 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f84ac97a-2b24-49aa-9341-96d43e088062]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febc:898e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600760, 'tstamp': 600760}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239605, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.059 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3a76ff58-ecf5-4fac-b15a-003d1323bdd0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600760, 'reachable_time': 39678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239606, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.084 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[40f59e5c-5ebf-4430-a653-6171770463b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.154 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7304752c-3d97-4b9b-b575-044ddd2ef24a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.156 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.157 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.158 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa4ebb90-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:54 np0005466012 nova_compute[192063]: 2025-10-02 12:26:54.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:54 np0005466012 NetworkManager[51207]: <info>  [1759408014.1609] manager: (tapaa4ebb90-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Oct  2 08:26:54 np0005466012 kernel: tapaa4ebb90-e0: entered promiscuous mode
Oct  2 08:26:54 np0005466012 nova_compute[192063]: 2025-10-02 12:26:54.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.164 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaa4ebb90-e0, col_values=(('external_ids', {'iface-id': 'c9a9afa9-78de-46a4-a649-8b8779924189'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:26:54 np0005466012 nova_compute[192063]: 2025-10-02 12:26:54.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:54 np0005466012 ovn_controller[94284]: 2025-10-02T12:26:54Z|00481|binding|INFO|Releasing lport c9a9afa9-78de-46a4-a649-8b8779924189 from this chassis (sb_readonly=0)
Oct  2 08:26:54 np0005466012 nova_compute[192063]: 2025-10-02 12:26:54.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:54 np0005466012 nova_compute[192063]: 2025-10-02 12:26:54.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.187 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.188 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b6d97eeb-72d4-4c81-b981-6a6cf18cd65c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.189 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-aa4ebb90-ef5e-4974-a53d-2aabd696731a
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID aa4ebb90-ef5e-4974-a53d-2aabd696731a
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:26:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:26:54.190 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'env', 'PROCESS_TAG=haproxy-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aa4ebb90-ef5e-4974-a53d-2aabd696731a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:26:54 np0005466012 podman[239645]: 2025-10-02 12:26:54.577112714 +0000 UTC m=+0.060105418 container create a25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  2 08:26:54 np0005466012 systemd[1]: Started libpod-conmon-a25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb.scope.
Oct  2 08:26:54 np0005466012 podman[239645]: 2025-10-02 12:26:54.549998646 +0000 UTC m=+0.032991380 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:26:54 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:26:54 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdc873afd7053dfed696b1075b152c48a816a25a74904cf34cfff0c31082da69/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:26:54 np0005466012 podman[239645]: 2025-10-02 12:26:54.66983095 +0000 UTC m=+0.152823704 container init a25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:26:54 np0005466012 podman[239645]: 2025-10-02 12:26:54.675137426 +0000 UTC m=+0.158130140 container start a25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:26:54 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239660]: [NOTICE]   (239664) : New worker (239666) forked
Oct  2 08:26:54 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239660]: [NOTICE]   (239664) : Loading success.
Oct  2 08:26:54 np0005466012 nova_compute[192063]: 2025-10-02 12:26:54.760 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408014.760087, de7f4178-00ba-409b-81ad-f6096e9ed144 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:54 np0005466012 nova_compute[192063]: 2025-10-02 12:26:54.761 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:26:54 np0005466012 nova_compute[192063]: 2025-10-02 12:26:54.952 2 DEBUG nova.compute.manager [None req-e26bcb9f-4b21-444f-9584-41f0fff98823 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:55 np0005466012 nova_compute[192063]: 2025-10-02 12:26:55.012 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:55 np0005466012 nova_compute[192063]: 2025-10-02 12:26:55.016 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:26:55 np0005466012 nova_compute[192063]: 2025-10-02 12:26:55.074 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Oct  2 08:26:55 np0005466012 nova_compute[192063]: 2025-10-02 12:26:55.074 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408014.7612352, de7f4178-00ba-409b-81ad-f6096e9ed144 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:26:55 np0005466012 nova_compute[192063]: 2025-10-02 12:26:55.075 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] VM Started (Lifecycle Event)#033[00m
Oct  2 08:26:55 np0005466012 nova_compute[192063]: 2025-10-02 12:26:55.162 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:26:55 np0005466012 nova_compute[192063]: 2025-10-02 12:26:55.167 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:26:56 np0005466012 nova_compute[192063]: 2025-10-02 12:26:56.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:56 np0005466012 nova_compute[192063]: 2025-10-02 12:26:56.824 2 DEBUG nova.compute.manager [req-7b94502b-3e4d-4fb2-b9dd-5e142a99bad8 req-411d30b7-d01a-44ab-8d56-893f20b3fa58 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:56 np0005466012 nova_compute[192063]: 2025-10-02 12:26:56.824 2 DEBUG oslo_concurrency.lockutils [req-7b94502b-3e4d-4fb2-b9dd-5e142a99bad8 req-411d30b7-d01a-44ab-8d56-893f20b3fa58 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:56 np0005466012 nova_compute[192063]: 2025-10-02 12:26:56.824 2 DEBUG oslo_concurrency.lockutils [req-7b94502b-3e4d-4fb2-b9dd-5e142a99bad8 req-411d30b7-d01a-44ab-8d56-893f20b3fa58 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:56 np0005466012 nova_compute[192063]: 2025-10-02 12:26:56.824 2 DEBUG oslo_concurrency.lockutils [req-7b94502b-3e4d-4fb2-b9dd-5e142a99bad8 req-411d30b7-d01a-44ab-8d56-893f20b3fa58 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:56 np0005466012 nova_compute[192063]: 2025-10-02 12:26:56.825 2 DEBUG nova.compute.manager [req-7b94502b-3e4d-4fb2-b9dd-5e142a99bad8 req-411d30b7-d01a-44ab-8d56-893f20b3fa58 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] No waiting events found dispatching network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:56 np0005466012 nova_compute[192063]: 2025-10-02 12:26:56.825 2 WARNING nova.compute.manager [req-7b94502b-3e4d-4fb2-b9dd-5e142a99bad8 req-411d30b7-d01a-44ab-8d56-893f20b3fa58 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received unexpected event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec for instance with vm_state rescued and task_state None.#033[00m
Oct  2 08:26:57 np0005466012 nova_compute[192063]: 2025-10-02 12:26:57.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:26:58 np0005466012 nova_compute[192063]: 2025-10-02 12:26:58.733 2 INFO nova.compute.manager [None req-fa3e9859-61dd-463d-afca-db2b419493e3 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Unrescuing#033[00m
Oct  2 08:26:58 np0005466012 nova_compute[192063]: 2025-10-02 12:26:58.734 2 DEBUG oslo_concurrency.lockutils [None req-fa3e9859-61dd-463d-afca-db2b419493e3 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:26:58 np0005466012 nova_compute[192063]: 2025-10-02 12:26:58.734 2 DEBUG oslo_concurrency.lockutils [None req-fa3e9859-61dd-463d-afca-db2b419493e3 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquired lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:26:58 np0005466012 nova_compute[192063]: 2025-10-02 12:26:58.735 2 DEBUG nova.network.neutron [None req-fa3e9859-61dd-463d-afca-db2b419493e3 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:26:59 np0005466012 nova_compute[192063]: 2025-10-02 12:26:59.157 2 DEBUG nova.compute.manager [req-09f96041-9ffe-438b-9a70-1b01115c3607 req-d440931f-31f4-40df-81a9-7ba7e6ccfa68 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:26:59 np0005466012 nova_compute[192063]: 2025-10-02 12:26:59.158 2 DEBUG oslo_concurrency.lockutils [req-09f96041-9ffe-438b-9a70-1b01115c3607 req-d440931f-31f4-40df-81a9-7ba7e6ccfa68 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:26:59 np0005466012 nova_compute[192063]: 2025-10-02 12:26:59.158 2 DEBUG oslo_concurrency.lockutils [req-09f96041-9ffe-438b-9a70-1b01115c3607 req-d440931f-31f4-40df-81a9-7ba7e6ccfa68 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:26:59 np0005466012 nova_compute[192063]: 2025-10-02 12:26:59.159 2 DEBUG oslo_concurrency.lockutils [req-09f96041-9ffe-438b-9a70-1b01115c3607 req-d440931f-31f4-40df-81a9-7ba7e6ccfa68 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:26:59 np0005466012 nova_compute[192063]: 2025-10-02 12:26:59.159 2 DEBUG nova.compute.manager [req-09f96041-9ffe-438b-9a70-1b01115c3607 req-d440931f-31f4-40df-81a9-7ba7e6ccfa68 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] No waiting events found dispatching network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:26:59 np0005466012 nova_compute[192063]: 2025-10-02 12:26:59.159 2 WARNING nova.compute.manager [req-09f96041-9ffe-438b-9a70-1b01115c3607 req-d440931f-31f4-40df-81a9-7ba7e6ccfa68 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received unexpected event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec for instance with vm_state rescued and task_state unrescuing.#033[00m
Oct  2 08:27:01 np0005466012 nova_compute[192063]: 2025-10-02 12:27:01.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:02.136 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:02.137 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:02.137 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:02 np0005466012 podman[239675]: 2025-10-02 12:27:02.168575101 +0000 UTC m=+0.070642408 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:27:02 np0005466012 podman[239676]: 2025-10-02 12:27:02.232139213 +0000 UTC m=+0.133170991 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller)
Oct  2 08:27:02 np0005466012 nova_compute[192063]: 2025-10-02 12:27:02.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.267 2 DEBUG nova.network.neutron [None req-fa3e9859-61dd-463d-afca-db2b419493e3 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Updating instance_info_cache with network_info: [{"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.379 2 DEBUG oslo_concurrency.lockutils [None req-fa3e9859-61dd-463d-afca-db2b419493e3 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Releasing lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.380 2 DEBUG nova.objects.instance [None req-fa3e9859-61dd-463d-afca-db2b419493e3 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'flavor' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:04 np0005466012 kernel: tap23e93cfb-aa (unregistering): left promiscuous mode
Oct  2 08:27:04 np0005466012 NetworkManager[51207]: <info>  [1759408024.4614] device (tap23e93cfb-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:27:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:27:04Z|00482|binding|INFO|Releasing lport 23e93cfb-aa99-4427-8af6-c199677a54ec from this chassis (sb_readonly=0)
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:27:04Z|00483|binding|INFO|Setting lport 23e93cfb-aa99-4427-8af6-c199677a54ec down in Southbound
Oct  2 08:27:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:27:04Z|00484|binding|INFO|Removing iface tap23e93cfb-aa ovn-installed in OVS
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.512 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:a5:17 10.100.0.13'], port_security=['fa:16:3e:ca:a5:17 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=23e93cfb-aa99-4427-8af6-c199677a54ec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.513 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 23e93cfb-aa99-4427-8af6-c199677a54ec in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a unbound from our chassis#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.515 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aa4ebb90-ef5e-4974-a53d-2aabd696731a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.516 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f2dae9-c4b8-4910-80c3-e07bb0612b05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.517 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a namespace which is not needed anymore#033[00m
Oct  2 08:27:04 np0005466012 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Oct  2 08:27:04 np0005466012 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007c.scope: Consumed 10.720s CPU time.
Oct  2 08:27:04 np0005466012 systemd-machined[152114]: Machine qemu-58-instance-0000007c terminated.
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:04 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239660]: [NOTICE]   (239664) : haproxy version is 2.8.14-c23fe91
Oct  2 08:27:04 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239660]: [NOTICE]   (239664) : path to executable is /usr/sbin/haproxy
Oct  2 08:27:04 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239660]: [WARNING]  (239664) : Exiting Master process...
Oct  2 08:27:04 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239660]: [ALERT]    (239664) : Current worker (239666) exited with code 143 (Terminated)
Oct  2 08:27:04 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239660]: [WARNING]  (239664) : All workers exited. Exiting... (0)
Oct  2 08:27:04 np0005466012 systemd[1]: libpod-a25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb.scope: Deactivated successfully.
Oct  2 08:27:04 np0005466012 conmon[239660]: conmon a25b2db7390e1179a64f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb.scope/container/memory.events
Oct  2 08:27:04 np0005466012 podman[239750]: 2025-10-02 12:27:04.717116228 +0000 UTC m=+0.107909726 container died a25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:27:04 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb-userdata-shm.mount: Deactivated successfully.
Oct  2 08:27:04 np0005466012 systemd[1]: var-lib-containers-storage-overlay-bdc873afd7053dfed696b1075b152c48a816a25a74904cf34cfff0c31082da69-merged.mount: Deactivated successfully.
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.759 2 INFO nova.virt.libvirt.driver [-] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Instance destroyed successfully.#033[00m
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.759 2 DEBUG nova.objects.instance [None req-fa3e9859-61dd-463d-afca-db2b419493e3 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'numa_topology' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:04 np0005466012 podman[239750]: 2025-10-02 12:27:04.764956617 +0000 UTC m=+0.155750085 container cleanup a25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:27:04 np0005466012 systemd[1]: libpod-conmon-a25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb.scope: Deactivated successfully.
Oct  2 08:27:04 np0005466012 podman[239797]: 2025-10-02 12:27:04.832113169 +0000 UTC m=+0.043873611 container remove a25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.839 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c81a652e-4224-49eb-9409-9c2c6959faea]: (4, ('Thu Oct  2 12:27:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a (a25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb)\na25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb\nThu Oct  2 12:27:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a (a25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb)\na25b2db7390e1179a64fe9d145b43ab2eb449af9c3377a5b9f04575a6c94f7cb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.841 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e45e9738-eeb6-41b5-a610-d7130bebe08b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.842 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:04 np0005466012 kernel: tapaa4ebb90-e0: left promiscuous mode
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.862 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b30ea087-6b85-48ce-80dc-a76c317edfcd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:04 np0005466012 kernel: tap23e93cfb-aa: entered promiscuous mode
Oct  2 08:27:04 np0005466012 systemd-udevd[239728]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:27:04 np0005466012 NetworkManager[51207]: <info>  [1759408024.8874] manager: (tap23e93cfb-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/216)
Oct  2 08:27:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:27:04Z|00485|binding|INFO|Claiming lport 23e93cfb-aa99-4427-8af6-c199677a54ec for this chassis.
Oct  2 08:27:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:27:04Z|00486|binding|INFO|23e93cfb-aa99-4427-8af6-c199677a54ec: Claiming fa:16:3e:ca:a5:17 10.100.0.13
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:04 np0005466012 NetworkManager[51207]: <info>  [1759408024.9002] device (tap23e93cfb-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:27:04 np0005466012 NetworkManager[51207]: <info>  [1759408024.9013] device (tap23e93cfb-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.901 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:a5:17 10.100.0.13'], port_security=['fa:16:3e:ca:a5:17 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=23e93cfb-aa99-4427-8af6-c199677a54ec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:27:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:27:04Z|00487|binding|INFO|Setting lport 23e93cfb-aa99-4427-8af6-c199677a54ec ovn-installed in OVS
Oct  2 08:27:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:27:04Z|00488|binding|INFO|Setting lport 23e93cfb-aa99-4427-8af6-c199677a54ec up in Southbound
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.906 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d1f17928-c366-40bb-809b-937e1923e576]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.907 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[faef4313-413d-4e1f-990f-16134fe3e888]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:04 np0005466012 nova_compute[192063]: 2025-10-02 12:27:04.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.924 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7c1e64ec-367d-4d63-83d0-0bfbf45240d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600752, 'reachable_time': 41991, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239828, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:04 np0005466012 systemd[1]: run-netns-ovnmeta\x2daa4ebb90\x2def5e\x2d4974\x2da53d\x2d2aabd696731a.mount: Deactivated successfully.
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.929 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.929 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a59381-5678-46b6-95fd-c9bd01377b97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.930 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 23e93cfb-aa99-4427-8af6-c199677a54ec in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a unbound from our chassis#033[00m
Oct  2 08:27:04 np0005466012 systemd-machined[152114]: New machine qemu-59-instance-0000007c.
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.938 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aa4ebb90-ef5e-4974-a53d-2aabd696731a#033[00m
Oct  2 08:27:04 np0005466012 systemd[1]: Started Virtual Machine qemu-59-instance-0000007c.
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.950 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[db81bc44-216f-405b-8ab8-ea6b4d3396f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.952 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaa4ebb90-e1 in ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.953 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaa4ebb90-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.953 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6bce2be8-9fb5-48c7-acb9-ae5084a82949]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.954 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0eaf587b-98cc-4819-859c-3d3ef83302a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:04.968 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[a393a2dd-6ae7-4f07-b67c-3f1643458240]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.001 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[88113f39-c4c0-4c55-83e8-ef83d65fdd3c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.038 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[16abe38e-9a3f-48cb-9baa-9beb601ccb98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:05 np0005466012 NetworkManager[51207]: <info>  [1759408025.0488] manager: (tapaa4ebb90-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/217)
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.047 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[710b49cf-488a-4dad-a690-8eaf20fa6f65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.085 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[1982e5b3-c791-4707-9448-07c7390861ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.089 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[52da2d3c-93aa-49da-aafa-f244df7d9ca7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:05 np0005466012 NetworkManager[51207]: <info>  [1759408025.1128] device (tapaa4ebb90-e0): carrier: link connected
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.120 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[1b72b790-5c13-48ed-b0a4-1bfcd2552f2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.148 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4d97d464-d7a1-46d9-ad87-e27ffe88f587]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 601872, 'reachable_time': 30599, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239861, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.169 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4b353786-386c-4e7e-8809-e16532506425]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febc:898e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 601872, 'tstamp': 601872}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239867, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.194 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aaf0b73e-7d52-4799-a425-efe8424d3f45]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaa4ebb90-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:89:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 601872, 'reachable_time': 30599, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239869, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.234 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[787fb8d3-32e9-4e26-a203-34925d7750de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.318 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ba5aff28-1e8d-4d13-84e4-06535cdfc9f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.320 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.320 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.321 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa4ebb90-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:05 np0005466012 NetworkManager[51207]: <info>  [1759408025.3248] manager: (tapaa4ebb90-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Oct  2 08:27:05 np0005466012 kernel: tapaa4ebb90-e0: entered promiscuous mode
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.330 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaa4ebb90-e0, col_values=(('external_ids', {'iface-id': 'c9a9afa9-78de-46a4-a649-8b8779924189'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:05 np0005466012 ovn_controller[94284]: 2025-10-02T12:27:05Z|00489|binding|INFO|Releasing lport c9a9afa9-78de-46a4-a649-8b8779924189 from this chassis (sb_readonly=0)
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.354 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.354 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[63c78173-b5b8-4d52-9d27-e9ccdb2e75ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.355 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-aa4ebb90-ef5e-4974-a53d-2aabd696731a
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/aa4ebb90-ef5e-4974-a53d-2aabd696731a.pid.haproxy
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID aa4ebb90-ef5e-4974-a53d-2aabd696731a
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:27:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:05.356 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'env', 'PROCESS_TAG=haproxy-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aa4ebb90-ef5e-4974-a53d-2aabd696731a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.703 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for de7f4178-00ba-409b-81ad-f6096e9ed144 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.705 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408025.7033346, de7f4178-00ba-409b-81ad-f6096e9ed144 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.705 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.720 2 DEBUG nova.compute.manager [None req-fa3e9859-61dd-463d-afca-db2b419493e3 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:27:05 np0005466012 podman[239902]: 2025-10-02 12:27:05.751482403 +0000 UTC m=+0.053302250 container create f82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.763 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.766 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:27:05 np0005466012 systemd[1]: Started libpod-conmon-f82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c.scope.
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.807 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.807 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408025.71948, de7f4178-00ba-409b-81ad-f6096e9ed144 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.808 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] VM Started (Lifecycle Event)#033[00m
Oct  2 08:27:05 np0005466012 podman[239902]: 2025-10-02 12:27:05.729670302 +0000 UTC m=+0.031490169 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:27:05 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.828 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:27:05 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce9492ea39d6991d47069935cca738bf884288ee8431e6232f59649ef975e0e0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:27:05 np0005466012 nova_compute[192063]: 2025-10-02 12:27:05.835 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:27:05 np0005466012 podman[239902]: 2025-10-02 12:27:05.843405437 +0000 UTC m=+0.145225284 container init f82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:27:05 np0005466012 podman[239902]: 2025-10-02 12:27:05.849176807 +0000 UTC m=+0.150996644 container start f82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:27:05 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239924]: [NOTICE]   (239951) : New worker (239959) forked
Oct  2 08:27:05 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239924]: [NOTICE]   (239951) : Loading success.
Oct  2 08:27:05 np0005466012 podman[239917]: 2025-10-02 12:27:05.883317648 +0000 UTC m=+0.091981667 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:27:05 np0005466012 podman[239914]: 2025-10-02 12:27:05.902225339 +0000 UTC m=+0.102703083 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 08:27:06 np0005466012 nova_compute[192063]: 2025-10-02 12:27:06.344 2 DEBUG nova.compute.manager [req-007cab5b-67e9-4ff9-96a6-83f98b214ef1 req-a275fa2f-7d3d-49d3-814f-cf61ed942eb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-unplugged-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:06 np0005466012 nova_compute[192063]: 2025-10-02 12:27:06.345 2 DEBUG oslo_concurrency.lockutils [req-007cab5b-67e9-4ff9-96a6-83f98b214ef1 req-a275fa2f-7d3d-49d3-814f-cf61ed942eb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:06 np0005466012 nova_compute[192063]: 2025-10-02 12:27:06.346 2 DEBUG oslo_concurrency.lockutils [req-007cab5b-67e9-4ff9-96a6-83f98b214ef1 req-a275fa2f-7d3d-49d3-814f-cf61ed942eb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:06 np0005466012 nova_compute[192063]: 2025-10-02 12:27:06.346 2 DEBUG oslo_concurrency.lockutils [req-007cab5b-67e9-4ff9-96a6-83f98b214ef1 req-a275fa2f-7d3d-49d3-814f-cf61ed942eb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:06 np0005466012 nova_compute[192063]: 2025-10-02 12:27:06.346 2 DEBUG nova.compute.manager [req-007cab5b-67e9-4ff9-96a6-83f98b214ef1 req-a275fa2f-7d3d-49d3-814f-cf61ed942eb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] No waiting events found dispatching network-vif-unplugged-23e93cfb-aa99-4427-8af6-c199677a54ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:27:06 np0005466012 nova_compute[192063]: 2025-10-02 12:27:06.346 2 WARNING nova.compute.manager [req-007cab5b-67e9-4ff9-96a6-83f98b214ef1 req-a275fa2f-7d3d-49d3-814f-cf61ed942eb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received unexpected event network-vif-unplugged-23e93cfb-aa99-4427-8af6-c199677a54ec for instance with vm_state active and task_state None.#033[00m
Oct  2 08:27:06 np0005466012 nova_compute[192063]: 2025-10-02 12:27:06.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:07 np0005466012 nova_compute[192063]: 2025-10-02 12:27:07.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.488 2 DEBUG nova.compute.manager [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.489 2 DEBUG oslo_concurrency.lockutils [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.490 2 DEBUG oslo_concurrency.lockutils [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.490 2 DEBUG oslo_concurrency.lockutils [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.491 2 DEBUG nova.compute.manager [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] No waiting events found dispatching network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.491 2 WARNING nova.compute.manager [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received unexpected event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec for instance with vm_state active and task_state None.#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.492 2 DEBUG nova.compute.manager [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.492 2 DEBUG oslo_concurrency.lockutils [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.493 2 DEBUG oslo_concurrency.lockutils [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.493 2 DEBUG oslo_concurrency.lockutils [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.494 2 DEBUG nova.compute.manager [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] No waiting events found dispatching network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.495 2 WARNING nova.compute.manager [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received unexpected event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec for instance with vm_state active and task_state None.#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.495 2 DEBUG nova.compute.manager [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.496 2 DEBUG oslo_concurrency.lockutils [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.496 2 DEBUG oslo_concurrency.lockutils [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.497 2 DEBUG oslo_concurrency.lockutils [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.497 2 DEBUG nova.compute.manager [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] No waiting events found dispatching network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:27:08 np0005466012 nova_compute[192063]: 2025-10-02 12:27:08.498 2 WARNING nova.compute.manager [req-bee3d89f-f65a-445a-ab8b-9726d6c4c749 req-aa4bdbcc-ac19-4129-8bf5-98a130aae0ca 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received unexpected event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec for instance with vm_state active and task_state None.#033[00m
Oct  2 08:27:11 np0005466012 nova_compute[192063]: 2025-10-02 12:27:11.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:12 np0005466012 nova_compute[192063]: 2025-10-02 12:27:12.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:14 np0005466012 podman[239970]: 2025-10-02 12:27:14.13700522 +0000 UTC m=+0.054164354 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64)
Oct  2 08:27:14 np0005466012 podman[239969]: 2025-10-02 12:27:14.1373636 +0000 UTC m=+0.056764086 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:27:16 np0005466012 nova_compute[192063]: 2025-10-02 12:27:16.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.925 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000007c', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '88e90c16adec46069b539d4f1431ab4d', 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'hostId': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.929 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for de7f4178-00ba-409b-81ad-f6096e9ed144 / tap23e93cfb-aa inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.929 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08b6af08-f799-4324-8984-a34af1ed1b9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-0000007c-de7f4178-00ba-409b-81ad-f6096e9ed144-tap23e93cfb-aa', 'timestamp': '2025-10-02T12:27:16.926520', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'tap23e93cfb-aa', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:a5:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23e93cfb-aa'}, 'message_id': '22110f78-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.602896832, 'message_signature': '5517900ffe34baf9967b36c35f17329341d03ca8920aa43e251a5f8765bf6731'}]}, 'timestamp': '2025-10-02 12:27:16.930118', '_unique_id': '462f19cf8a5e485194920299e54074ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.931 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.932 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.932 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30b3ab83-0784-40f5-8598-b0c4ab95e394', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-0000007c-de7f4178-00ba-409b-81ad-f6096e9ed144-tap23e93cfb-aa', 'timestamp': '2025-10-02T12:27:16.932663', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'tap23e93cfb-aa', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:a5:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23e93cfb-aa'}, 'message_id': '22118534-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.602896832, 'message_signature': 'b700562e0b935ab973af9e131664fb86bdcd62110cf35af859c8628d4e1744dd'}]}, 'timestamp': '2025-10-02 12:27:16.933126', '_unique_id': 'c2c48e8c7c974656ac56c0d89b9e9788'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.934 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.935 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.953 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.read.bytes volume: 23816192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.954 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e0d94b7-f102-4bf8-9f94-9e8894dfb070', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23816192, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-vda', 'timestamp': '2025-10-02T12:27:16.935491', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2214c8d4-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.61190561, 'message_signature': 'fa453d10365e75d4ad7f980c82cfa2b426d84e4e62386a570663caa22a74127f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-sda', 'timestamp': '2025-10-02T12:27:16.935491', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2214d798-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.61190561, 'message_signature': 'ea6bcee2248d524b301ac5053c67a9f6ac4baca59db0eb446ebd0bcc5f45357d'}]}, 'timestamp': '2025-10-02 12:27:16.954855', '_unique_id': 'a4729ceb48644a3799285151e7b5b70c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.955 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.957 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.972 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/cpu volume: 10760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5f13aa4-ffbf-4b1f-998a-e4eb9182f752', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10760000000, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'timestamp': '2025-10-02T12:27:16.957174', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '22178e48-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.648288273, 'message_signature': 'a1382190f323d1bfe20d3cf5a5ff9e7328c14366695a7ed754536d66e0975aa4'}]}, 'timestamp': '2025-10-02 12:27:16.972675', '_unique_id': '18f74e2c541d4ed3aa0677c86d0f54da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.973 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.974 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.974 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.975 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-934202400>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-934202400>]
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.975 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.975 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/memory.usage volume: 40.4140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80d7a4e0-463c-40e2-9ec1-40312c02f2c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4140625, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'timestamp': '2025-10-02T12:27:16.975393', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '2218077e-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.648288273, 'message_signature': '69cb74539fab7fca31446450d51c5ca282a5d793ebb095a7a4a6107b2a7d19b7'}]}, 'timestamp': '2025-10-02 12:27:16.975772', '_unique_id': 'a3063bb1ca194b5a8da9ccb8f4e33f0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.976 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.977 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.977 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1b81053-8cf5-4f47-9f92-2b48f8a28e24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-0000007c-de7f4178-00ba-409b-81ad-f6096e9ed144-tap23e93cfb-aa', 'timestamp': '2025-10-02T12:27:16.977775', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'tap23e93cfb-aa', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:a5:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23e93cfb-aa'}, 'message_id': '22186430-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.602896832, 'message_signature': 'a76c5b88ecb4f3a7b31962bf6c12b3ca09ee3e7c2588ccffb8d15aa43579f721'}]}, 'timestamp': '2025-10-02 12:27:16.978128', '_unique_id': '9c749983ee214cb293a52a04ddc80a0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.978 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.979 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.980 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.read.latency volume: 527186428 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.980 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.read.latency volume: 558476 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19e8d639-60bf-4ddb-bd67-b469170cacc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 527186428, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-vda', 'timestamp': '2025-10-02T12:27:16.980046', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2218bd18-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.61190561, 'message_signature': 'dceca5af84cda9fb656eb0ac74b2fdac6fbf6df24a60c2185ba581b4cf0df74c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 558476, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-sda', 'timestamp': '2025-10-02T12:27:16.980046', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2218c916-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.61190561, 'message_signature': 'b71471423c30c224d20fc0c46e55d42a778c0ce3964bb1a6df787fde9e9164aa'}]}, 'timestamp': '2025-10-02 12:27:16.980721', '_unique_id': '95524bedbe4748f2a82b7a933bb41b15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.981 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.982 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.983 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.983 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-934202400>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-934202400>]
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.983 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.983 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.984 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5351dc3d-6334-4e7e-bb1a-6f4cbfa8fe1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-vda', 'timestamp': '2025-10-02T12:27:16.983653', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '22194b84-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.61190561, 'message_signature': 'c45c9899e919bc066843a80d7020de5e2e97f335dec03d804dab9c07d8f69d60'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-sda', 'timestamp': '2025-10-02T12:27:16.983653', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '221958d6-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.61190561, 'message_signature': '3f38ebe3c9960f19305940bbbc673f3abcb065cc749fbf61f47c64f28a259c7c'}]}, 'timestamp': '2025-10-02 12:27:16.984395', '_unique_id': '6724ddd393bb4cc4b7567c193f36d80a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.985 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.986 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.987 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.987 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-934202400>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-934202400>]
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.987 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.987 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '325c64ab-4784-46d5-88e0-e830b19dbf0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-0000007c-de7f4178-00ba-409b-81ad-f6096e9ed144-tap23e93cfb-aa', 'timestamp': '2025-10-02T12:27:16.987533', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'tap23e93cfb-aa', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:a5:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23e93cfb-aa'}, 'message_id': '2219e38c-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.602896832, 'message_signature': '667cdc465fde399ac17792d11c5c923a76a8fed30c0e9d5891959dce20ae2d1b'}]}, 'timestamp': '2025-10-02 12:27:16.987999', '_unique_id': '471a55eafee347cf8390805363167bf9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.988 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.990 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.990 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa2608d3-57e7-4a67-8c97-61afb51ea5e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-0000007c-de7f4178-00ba-409b-81ad-f6096e9ed144-tap23e93cfb-aa', 'timestamp': '2025-10-02T12:27:16.990396', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'tap23e93cfb-aa', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:a5:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23e93cfb-aa'}, 'message_id': '221a52d6-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.602896832, 'message_signature': '52bf013c4f3b597fa12ab5e8c5c6e89f26e4956690d8e5f1322435eeba47051c'}]}, 'timestamp': '2025-10-02 12:27:16.990844', '_unique_id': '21ed2a5fc5c14fc8a51c2216637e274a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.992 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.993 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e37b22a-92f1-4f48-af39-0342a0cd6037', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-0000007c-de7f4178-00ba-409b-81ad-f6096e9ed144-tap23e93cfb-aa', 'timestamp': '2025-10-02T12:27:16.993361', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'tap23e93cfb-aa', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:a5:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23e93cfb-aa'}, 'message_id': '221acfd6-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.602896832, 'message_signature': '493a352a5064f123456f120193df0d74b612bf0badd76b7de0e6848f6aa53f64'}]}, 'timestamp': '2025-10-02 12:27:16.994078', '_unique_id': '3b6ff0125f81417791d885307fdf4077'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:16.996 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.008 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.009 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06ae1e9b-0368-4503-a99a-bb9d347e923c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-vda', 'timestamp': '2025-10-02T12:27:16.996939', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '221d21e6-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.673361044, 'message_signature': '74df7626c05e797412a8a43d7eae6f7223dd141a97b1dc378226630f69d41da1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-sda', 'timestamp': '2025-10-02T12:27:16.996939', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '221d321c-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.673361044, 'message_signature': 'f770c6022eabbb21464314df8d224fd6372f0a6ef64023cbdc81d30b47902ab0'}]}, 'timestamp': '2025-10-02 12:27:17.009666', '_unique_id': '500b9b6ac8504c6b83322ad4e4307b1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.010 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.012 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.012 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.012 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13cbcbfc-a402-46fc-99ae-423922bde9ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-vda', 'timestamp': '2025-10-02T12:27:17.012119', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '221da058-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.61190561, 'message_signature': '6c1809d40031301fd5d82775aacdc038a919f3cdefb0522614cd4bc54e103303'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 
'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-sda', 'timestamp': '2025-10-02T12:27:17.012119', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '221da9c2-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.61190561, 'message_signature': '24a050db990754953b2160ba6b4e4ccb1b712f90377295ea217368bd705f27c2'}]}, 'timestamp': '2025-10-02 12:27:17.012625', '_unique_id': '6548ef57e16544adbceb2e00c1052df3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.013 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.014 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.014 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.014 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7c8177a-5853-4aac-af7e-1c91ab4a743a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-vda', 'timestamp': '2025-10-02T12:27:17.014586', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '221e0282-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.673361044, 'message_signature': 'f1b9e0f79a287ad38bf5966cb439614bbd67bf4d13ec3775b55afe5830e87af4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 
'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-sda', 'timestamp': '2025-10-02T12:27:17.014586', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '221e0d22-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.673361044, 'message_signature': '8227451afb943ac9e1ee0633b72edd587ebc76defbd13431c0cd47c8541d92dc'}]}, 'timestamp': '2025-10-02 12:27:17.015167', '_unique_id': '78d84c2795ad4bcf819df2845780a2a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.015 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.017 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9eaf356-76b9-4008-b5d9-078bb74fd202', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-0000007c-de7f4178-00ba-409b-81ad-f6096e9ed144-tap23e93cfb-aa', 'timestamp': '2025-10-02T12:27:17.017413', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'tap23e93cfb-aa', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:a5:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23e93cfb-aa'}, 'message_id': '221e70fa-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.602896832, 'message_signature': 'c1a5f719b60cf36caad1ba06adc3a9ee873901b91cdc342f08ad2e417c3ce728'}]}, 'timestamp': '2025-10-02 12:27:17.017755', '_unique_id': '65956887592d46039616710df20ef248'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.018 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.019 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.019 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f605554-e3d7-4c1a-b1e9-1d9d7f5ce47d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-0000007c-de7f4178-00ba-409b-81ad-f6096e9ed144-tap23e93cfb-aa', 'timestamp': '2025-10-02T12:27:17.019311', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'tap23e93cfb-aa', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:a5:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23e93cfb-aa'}, 'message_id': '221ebbb4-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.602896832, 'message_signature': 'e8566095a982bc15b6f571d04d088226f75165376bebef84551f1978f4985070'}]}, 'timestamp': '2025-10-02 12:27:17.019650', '_unique_id': 'fdb2a57cf5274ba88648842d3baee0eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:17 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.020 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.021 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.021 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.021 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '910c8111-823b-4ee9-9a66-0c11e012c6b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-vda', 'timestamp': '2025-10-02T12:27:17.021154', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '221f0128-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.673361044, 'message_signature': 'c2d0318b052dd95a22a9b9bdca4e68cb63ef67e3054856ffa82b418494d470fc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-sda', 'timestamp': '2025-10-02T12:27:17.021154', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '221f0ab0-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.673361044, 'message_signature': 'd5d505d221304d42549ea2c58b65d6cca579498df68dc4b99de0f31fd9447a61'}]}, 'timestamp': '2025-10-02 12:27:17.021685', '_unique_id': '30d11171f6514463808bcea7c5a6c582'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.023 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.023 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75a65ca4-dd65-4c44-a83b-db8c0baae3bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-0000007c-de7f4178-00ba-409b-81ad-f6096e9ed144-tap23e93cfb-aa', 'timestamp': '2025-10-02T12:27:17.023597', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'tap23e93cfb-aa', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:a5:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23e93cfb-aa'}, 'message_id': '221f62e4-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.602896832, 'message_signature': '6f107fe5ebf18159f8632c9da9932d3bba5971319ac3d800f1f889b875283406'}]}, 'timestamp': '2025-10-02 12:27:17.023988', '_unique_id': 'd7e27096bca64215b2386344351af37e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.024 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.025 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.025 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa64fa42-794f-40b3-8b5f-b45fb9bf580f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'instance-0000007c-de7f4178-00ba-409b-81ad-f6096e9ed144-tap23e93cfb-aa', 'timestamp': '2025-10-02T12:27:17.025783', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'tap23e93cfb-aa', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:a5:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23e93cfb-aa'}, 'message_id': '221fb47e-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.602896832, 'message_signature': 'd2b3ab1347ecbf37298aac4dd340580a2edd5d8763b78d5fd1fcc0bce764d628'}]}, 'timestamp': '2025-10-02 12:27:17.026042', '_unique_id': 'c795b24d3c1242c4a7c2273879819016'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.026 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.027 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.027 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-934202400>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-934202400>]
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.027 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.027 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9783399-3670-48d7-975e-b0f59b1eeef2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-vda', 'timestamp': '2025-10-02T12:27:17.027530', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '221ff88a-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.61190561, 'message_signature': '76966228f781d73ec91d0f00e169fc1c067580dbf73701f9d4e232f5afa965de'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 
'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-sda', 'timestamp': '2025-10-02T12:27:17.027530', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '222003ac-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.61190561, 'message_signature': '630e655010fb9c8eb6402324dd7b2aab592cf67933053247dfa5dfab87ae64fe'}]}, 'timestamp': '2025-10-02 12:27:17.028079', '_unique_id': 'c3419410bb454cf29716e31ddfe5c3b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.028 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.029 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.029 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.read.requests volume: 770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.029 12 DEBUG ceilometer.compute.pollsters [-] de7f4178-00ba-409b-81ad-f6096e9ed144/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4aa6cc17-0695-4786-b774-9a412f8a7231', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 770, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-vda', 'timestamp': '2025-10-02T12:27:17.029310', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '22203e80-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.61190561, 'message_signature': '7d7b6da55832a3249b10bee1e421dfcaf278fbc4adffd97c0fd51948ca273652'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'abb9f220716e48d79dbe2f97622937c4', 'user_name': None, 'project_id': '88e90c16adec46069b539d4f1431ab4d', 
'project_name': None, 'resource_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144-sda', 'timestamp': '2025-10-02T12:27:17.029310', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-934202400', 'name': 'instance-0000007c', 'instance_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'instance_type': 'm1.nano', 'host': '34440c0e10575aa29afe1ecdff502f5ea723d68c58362f7fa6c2e04b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '22204786-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6030.61190561, 'message_signature': '4e07d32d3e3d9f99be1f89e321b18cf6cf04e828bd9de194cef08e6c2c9fced2'}]}, 'timestamp': '2025-10-02 12:27:17.029804', '_unique_id': '28f19bca783440d9b1c6f07d31a38b7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:27:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:27:17.030 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:27:17 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:27:17 np0005466012 podman[240011]: 2025-10-02 12:27:17.129114375 +0000 UTC m=+0.047499761 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:27:17 np0005466012 podman[240010]: 2025-10-02 12:27:17.131457539 +0000 UTC m=+0.053210597 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:27:17 np0005466012 nova_compute[192063]: 2025-10-02 12:27:17.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:27:19Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ca:a5:17 10.100.0.13
Oct  2 08:27:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:20.987 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:27:20 np0005466012 nova_compute[192063]: 2025-10-02 12:27:20.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:20.989 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:27:21 np0005466012 nova_compute[192063]: 2025-10-02 12:27:21.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:22 np0005466012 nova_compute[192063]: 2025-10-02 12:27:22.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:26 np0005466012 nova_compute[192063]: 2025-10-02 12:27:26.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:27 np0005466012 nova_compute[192063]: 2025-10-02 12:27:27.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:27 np0005466012 nova_compute[192063]: 2025-10-02 12:27:27.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:29 np0005466012 nova_compute[192063]: 2025-10-02 12:27:29.825 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:27:30.990 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:27:31 np0005466012 nova_compute[192063]: 2025-10-02 12:27:31.824 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:31 np0005466012 nova_compute[192063]: 2025-10-02 12:27:31.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:32 np0005466012 nova_compute[192063]: 2025-10-02 12:27:32.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:32 np0005466012 nova_compute[192063]: 2025-10-02 12:27:32.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:33 np0005466012 podman[240060]: 2025-10-02 12:27:33.154364033 +0000 UTC m=+0.056067507 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:27:33 np0005466012 podman[240061]: 2025-10-02 12:27:33.169598353 +0000 UTC m=+0.082544806 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:27:34 np0005466012 nova_compute[192063]: 2025-10-02 12:27:34.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:35 np0005466012 nova_compute[192063]: 2025-10-02 12:27:35.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:35 np0005466012 nova_compute[192063]: 2025-10-02 12:27:35.821 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:27:36 np0005466012 podman[240113]: 2025-10-02 12:27:36.145521411 +0000 UTC m=+0.053847545 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:27:36 np0005466012 podman[240112]: 2025-10-02 12:27:36.179504538 +0000 UTC m=+0.086962799 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:27:36 np0005466012 nova_compute[192063]: 2025-10-02 12:27:36.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:36 np0005466012 nova_compute[192063]: 2025-10-02 12:27:36.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:36 np0005466012 nova_compute[192063]: 2025-10-02 12:27:36.907 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:36 np0005466012 nova_compute[192063]: 2025-10-02 12:27:36.907 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:36 np0005466012 nova_compute[192063]: 2025-10-02 12:27:36.908 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:36 np0005466012 nova_compute[192063]: 2025-10-02 12:27:36.908 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:27:37 np0005466012 nova_compute[192063]: 2025-10-02 12:27:37.242 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:37 np0005466012 nova_compute[192063]: 2025-10-02 12:27:37.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:37 np0005466012 nova_compute[192063]: 2025-10-02 12:27:37.320 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:37 np0005466012 nova_compute[192063]: 2025-10-02 12:27:37.321 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:27:37 np0005466012 nova_compute[192063]: 2025-10-02 12:27:37.410 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:27:37 np0005466012 nova_compute[192063]: 2025-10-02 12:27:37.561 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:27:37 np0005466012 nova_compute[192063]: 2025-10-02 12:27:37.562 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5543MB free_disk=73.2171401977539GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:27:37 np0005466012 nova_compute[192063]: 2025-10-02 12:27:37.562 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:27:37 np0005466012 nova_compute[192063]: 2025-10-02 12:27:37.563 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:27:37 np0005466012 nova_compute[192063]: 2025-10-02 12:27:37.932 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance de7f4178-00ba-409b-81ad-f6096e9ed144 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:27:37 np0005466012 nova_compute[192063]: 2025-10-02 12:27:37.933 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:27:37 np0005466012 nova_compute[192063]: 2025-10-02 12:27:37.933 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:27:38 np0005466012 nova_compute[192063]: 2025-10-02 12:27:38.006 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:27:38 np0005466012 nova_compute[192063]: 2025-10-02 12:27:38.237 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:27:39 np0005466012 nova_compute[192063]: 2025-10-02 12:27:39.159 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:27:39 np0005466012 nova_compute[192063]: 2025-10-02 12:27:39.159 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:27:41 np0005466012 nova_compute[192063]: 2025-10-02 12:27:41.161 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:41 np0005466012 nova_compute[192063]: 2025-10-02 12:27:41.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:42 np0005466012 nova_compute[192063]: 2025-10-02 12:27:42.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:42 np0005466012 nova_compute[192063]: 2025-10-02 12:27:42.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:27:42 np0005466012 nova_compute[192063]: 2025-10-02 12:27:42.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:27:42 np0005466012 nova_compute[192063]: 2025-10-02 12:27:42.824 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:27:43 np0005466012 nova_compute[192063]: 2025-10-02 12:27:43.756 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:27:43 np0005466012 nova_compute[192063]: 2025-10-02 12:27:43.757 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:27:43 np0005466012 nova_compute[192063]: 2025-10-02 12:27:43.758 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:27:43 np0005466012 nova_compute[192063]: 2025-10-02 12:27:43.759 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:27:45 np0005466012 podman[240162]: 2025-10-02 12:27:45.138496615 +0000 UTC m=+0.058655688 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm)
Oct  2 08:27:45 np0005466012 podman[240161]: 2025-10-02 12:27:45.139243206 +0000 UTC m=+0.059353807 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible)
Oct  2 08:27:46 np0005466012 nova_compute[192063]: 2025-10-02 12:27:46.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:47 np0005466012 nova_compute[192063]: 2025-10-02 12:27:47.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:48 np0005466012 podman[240205]: 2025-10-02 12:27:48.135345831 +0000 UTC m=+0.052684693 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:27:48 np0005466012 podman[240204]: 2025-10-02 12:27:48.135390282 +0000 UTC m=+0.055193002 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  2 08:27:49 np0005466012 nova_compute[192063]: 2025-10-02 12:27:49.621 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Updating instance_info_cache with network_info: [{"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:27:49 np0005466012 nova_compute[192063]: 2025-10-02 12:27:49.758 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-de7f4178-00ba-409b-81ad-f6096e9ed144" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:27:49 np0005466012 nova_compute[192063]: 2025-10-02 12:27:49.759 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:27:51 np0005466012 nova_compute[192063]: 2025-10-02 12:27:51.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:52 np0005466012 nova_compute[192063]: 2025-10-02 12:27:52.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:56 np0005466012 nova_compute[192063]: 2025-10-02 12:27:56.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:27:57 np0005466012 nova_compute[192063]: 2025-10-02 12:27:57.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:00 np0005466012 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  2 08:28:01 np0005466012 nova_compute[192063]: 2025-10-02 12:28:01.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:02.142 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:02.142 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:02.144 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:02 np0005466012 nova_compute[192063]: 2025-10-02 12:28:02.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:04 np0005466012 podman[240250]: 2025-10-02 12:28:04.146841999 +0000 UTC m=+0.056033625 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:28:04 np0005466012 podman[240251]: 2025-10-02 12:28:04.178670816 +0000 UTC m=+0.081224180 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251001)
Oct  2 08:28:06 np0005466012 nova_compute[192063]: 2025-10-02 12:28:06.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:07 np0005466012 podman[240299]: 2025-10-02 12:28:07.135406427 +0000 UTC m=+0.054874554 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:28:07 np0005466012 podman[240300]: 2025-10-02 12:28:07.136019353 +0000 UTC m=+0.050842482 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:28:07 np0005466012 nova_compute[192063]: 2025-10-02 12:28:07.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:11 np0005466012 nova_compute[192063]: 2025-10-02 12:28:11.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:12 np0005466012 nova_compute[192063]: 2025-10-02 12:28:12.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:15.441 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:15.442 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:28:15 np0005466012 nova_compute[192063]: 2025-10-02 12:28:15.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:16 np0005466012 podman[240338]: 2025-10-02 12:28:16.168741011 +0000 UTC m=+0.078086653 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public)
Oct  2 08:28:16 np0005466012 podman[240337]: 2025-10-02 12:28:16.179294003 +0000 UTC m=+0.094697592 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Oct  2 08:28:16 np0005466012 nova_compute[192063]: 2025-10-02 12:28:16.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:17 np0005466012 nova_compute[192063]: 2025-10-02 12:28:17.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:19 np0005466012 podman[240392]: 2025-10-02 12:28:19.132543616 +0000 UTC m=+0.045335301 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:28:19 np0005466012 podman[240391]: 2025-10-02 12:28:19.135624271 +0000 UTC m=+0.053000483 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:28:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:20.445 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:21 np0005466012 nova_compute[192063]: 2025-10-02 12:28:21.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:22 np0005466012 nova_compute[192063]: 2025-10-02 12:28:22.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:26 np0005466012 nova_compute[192063]: 2025-10-02 12:28:26.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:27 np0005466012 nova_compute[192063]: 2025-10-02 12:28:27.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.600 2 DEBUG oslo_concurrency.lockutils [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.600 2 DEBUG oslo_concurrency.lockutils [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.601 2 DEBUG oslo_concurrency.lockutils [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.602 2 DEBUG oslo_concurrency.lockutils [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.602 2 DEBUG oslo_concurrency.lockutils [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.624 2 INFO nova.compute.manager [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Terminating instance#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.645 2 DEBUG nova.compute.manager [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:28:28 np0005466012 kernel: tap23e93cfb-aa (unregistering): left promiscuous mode
Oct  2 08:28:28 np0005466012 NetworkManager[51207]: <info>  [1759408108.7180] device (tap23e93cfb-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:28:28Z|00490|binding|INFO|Releasing lport 23e93cfb-aa99-4427-8af6-c199677a54ec from this chassis (sb_readonly=0)
Oct  2 08:28:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:28:28Z|00491|binding|INFO|Setting lport 23e93cfb-aa99-4427-8af6-c199677a54ec down in Southbound
Oct  2 08:28:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:28:28Z|00492|binding|INFO|Removing iface tap23e93cfb-aa ovn-installed in OVS
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:28.757 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:a5:17 10.100.0.13'], port_security=['fa:16:3e:ca:a5:17 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'de7f4178-00ba-409b-81ad-f6096e9ed144', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88e90c16adec46069b539d4f1431ab4d', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ed50fd5d-92ed-497e-8f4f-4653533c5a19', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53026845-594b-430c-a1e8-d879cf008d46, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=23e93cfb-aa99-4427-8af6-c199677a54ec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:28:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:28.759 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 23e93cfb-aa99-4427-8af6-c199677a54ec in datapath aa4ebb90-ef5e-4974-a53d-2aabd696731a unbound from our chassis#033[00m
Oct  2 08:28:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:28.762 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aa4ebb90-ef5e-4974-a53d-2aabd696731a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:28:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:28.763 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1f8e6bb6-e6e8-4204-865b-5fd6c2af95e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:28.764 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a namespace which is not needed anymore#033[00m
Oct  2 08:28:28 np0005466012 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Oct  2 08:28:28 np0005466012 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007c.scope: Consumed 15.910s CPU time.
Oct  2 08:28:28 np0005466012 systemd-machined[152114]: Machine qemu-59-instance-0000007c terminated.
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.931 2 INFO nova.virt.libvirt.driver [-] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Instance destroyed successfully.#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.932 2 DEBUG nova.objects.instance [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lazy-loading 'resources' on Instance uuid de7f4178-00ba-409b-81ad-f6096e9ed144 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:28:28 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239924]: [NOTICE]   (239951) : haproxy version is 2.8.14-c23fe91
Oct  2 08:28:28 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239924]: [NOTICE]   (239951) : path to executable is /usr/sbin/haproxy
Oct  2 08:28:28 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239924]: [WARNING]  (239951) : Exiting Master process...
Oct  2 08:28:28 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239924]: [ALERT]    (239951) : Current worker (239959) exited with code 143 (Terminated)
Oct  2 08:28:28 np0005466012 neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a[239924]: [WARNING]  (239951) : All workers exited. Exiting... (0)
Oct  2 08:28:28 np0005466012 systemd[1]: libpod-f82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c.scope: Deactivated successfully.
Oct  2 08:28:28 np0005466012 podman[240460]: 2025-10-02 12:28:28.955309487 +0000 UTC m=+0.076150652 container died f82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.968 2 DEBUG nova.virt.libvirt.vif [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:25:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-934202400',display_name='tempest-ServerStableDeviceRescueTest-server-934202400',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-934202400',id=124,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:26:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='88e90c16adec46069b539d4f1431ab4d',ramdisk_id='',reservation_id='r-hf1wm990',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-232864240',owner_user_name='tempest-ServerStableDeviceRescueTest-232864240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:27:05Z,user_data=None,user_id='abb9f220716e48d79dbe2f97622937c4',uuid=de7f4178-00ba-409b-81ad-f6096e9ed144,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.969 2 DEBUG nova.network.os_vif_util [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converting VIF {"id": "23e93cfb-aa99-4427-8af6-c199677a54ec", "address": "fa:16:3e:ca:a5:17", "network": {"id": "aa4ebb90-ef5e-4974-a53d-2aabd696731a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-483132528-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88e90c16adec46069b539d4f1431ab4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23e93cfb-aa", "ovs_interfaceid": "23e93cfb-aa99-4427-8af6-c199677a54ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.971 2 DEBUG nova.network.os_vif_util [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ca:a5:17,bridge_name='br-int',has_traffic_filtering=True,id=23e93cfb-aa99-4427-8af6-c199677a54ec,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23e93cfb-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.971 2 DEBUG os_vif [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:a5:17,bridge_name='br-int',has_traffic_filtering=True,id=23e93cfb-aa99-4427-8af6-c199677a54ec,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23e93cfb-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.975 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23e93cfb-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.985 2 INFO os_vif [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:a5:17,bridge_name='br-int',has_traffic_filtering=True,id=23e93cfb-aa99-4427-8af6-c199677a54ec,network=Network(aa4ebb90-ef5e-4974-a53d-2aabd696731a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23e93cfb-aa')#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.986 2 INFO nova.virt.libvirt.driver [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Deleting instance files /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144_del#033[00m
Oct  2 08:28:28 np0005466012 nova_compute[192063]: 2025-10-02 12:28:28.989 2 INFO nova.virt.libvirt.driver [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Deletion of /var/lib/nova/instances/de7f4178-00ba-409b-81ad-f6096e9ed144_del complete#033[00m
Oct  2 08:28:28 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c-userdata-shm.mount: Deactivated successfully.
Oct  2 08:28:28 np0005466012 systemd[1]: var-lib-containers-storage-overlay-ce9492ea39d6991d47069935cca738bf884288ee8431e6232f59649ef975e0e0-merged.mount: Deactivated successfully.
Oct  2 08:28:29 np0005466012 podman[240460]: 2025-10-02 12:28:29.005366567 +0000 UTC m=+0.126207772 container cleanup f82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:28:29 np0005466012 systemd[1]: libpod-conmon-f82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c.scope: Deactivated successfully.
Oct  2 08:28:29 np0005466012 podman[240506]: 2025-10-02 12:28:29.102648438 +0000 UTC m=+0.058318259 container remove f82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:28:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:29.110 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[509c3bc9-7ca2-4912-9bf6-a6710e0c2a53]: (4, ('Thu Oct  2 12:28:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a (f82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c)\nf82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c\nThu Oct  2 12:28:29 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a (f82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c)\nf82a17d9bd65508cbf7706a6d15373cf9feea9e840595402b673d5a4edbeb12c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:29.112 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c074a0a8-476e-4a25-97e6-6f086ca1d177]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:29.113 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa4ebb90-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:28:29 np0005466012 nova_compute[192063]: 2025-10-02 12:28:29.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:29 np0005466012 kernel: tapaa4ebb90-e0: left promiscuous mode
Oct  2 08:28:29 np0005466012 nova_compute[192063]: 2025-10-02 12:28:29.123 2 INFO nova.compute.manager [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:28:29 np0005466012 nova_compute[192063]: 2025-10-02 12:28:29.124 2 DEBUG oslo.service.loopingcall [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:28:29 np0005466012 nova_compute[192063]: 2025-10-02 12:28:29.124 2 DEBUG nova.compute.manager [-] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:28:29 np0005466012 nova_compute[192063]: 2025-10-02 12:28:29.124 2 DEBUG nova.network.neutron [-] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:28:29 np0005466012 nova_compute[192063]: 2025-10-02 12:28:29.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:29.131 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3246f643-8f0a-471e-bcc0-5f7cb9925673]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:29.157 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ae40de27-04bc-4c0b-856d-d4cf043f8a1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:29.159 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4a144c3e-4f41-4fa7-8eb3-73c6c3495983]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:29.175 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[572d0a2a-759a-4bdf-bcd8-87be40f9d990]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 601864, 'reachable_time': 39970, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240522, 'error': None, 'target': 'ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:29 np0005466012 systemd[1]: run-netns-ovnmeta\x2daa4ebb90\x2def5e\x2d4974\x2da53d\x2d2aabd696731a.mount: Deactivated successfully.
Oct  2 08:28:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:29.178 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aa4ebb90-ef5e-4974-a53d-2aabd696731a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:28:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:28:29.178 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[45aadf73-2bec-459a-9392-1e532315a2f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:28:29 np0005466012 nova_compute[192063]: 2025-10-02 12:28:29.445 2 DEBUG nova.compute.manager [req-6b5edcc7-d16f-49cc-9385-15e1d21112bf req-7e438bc4-a549-4731-a704-dfb458dc8b2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-unplugged-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:29 np0005466012 nova_compute[192063]: 2025-10-02 12:28:29.446 2 DEBUG oslo_concurrency.lockutils [req-6b5edcc7-d16f-49cc-9385-15e1d21112bf req-7e438bc4-a549-4731-a704-dfb458dc8b2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:29 np0005466012 nova_compute[192063]: 2025-10-02 12:28:29.446 2 DEBUG oslo_concurrency.lockutils [req-6b5edcc7-d16f-49cc-9385-15e1d21112bf req-7e438bc4-a549-4731-a704-dfb458dc8b2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:29 np0005466012 nova_compute[192063]: 2025-10-02 12:28:29.446 2 DEBUG oslo_concurrency.lockutils [req-6b5edcc7-d16f-49cc-9385-15e1d21112bf req-7e438bc4-a549-4731-a704-dfb458dc8b2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:29 np0005466012 nova_compute[192063]: 2025-10-02 12:28:29.447 2 DEBUG nova.compute.manager [req-6b5edcc7-d16f-49cc-9385-15e1d21112bf req-7e438bc4-a549-4731-a704-dfb458dc8b2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] No waiting events found dispatching network-vif-unplugged-23e93cfb-aa99-4427-8af6-c199677a54ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:29 np0005466012 nova_compute[192063]: 2025-10-02 12:28:29.447 2 DEBUG nova.compute.manager [req-6b5edcc7-d16f-49cc-9385-15e1d21112bf req-7e438bc4-a549-4731-a704-dfb458dc8b2f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-unplugged-23e93cfb-aa99-4427-8af6-c199677a54ec for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:28:30 np0005466012 nova_compute[192063]: 2025-10-02 12:28:30.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:31 np0005466012 nova_compute[192063]: 2025-10-02 12:28:31.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:32 np0005466012 nova_compute[192063]: 2025-10-02 12:28:32.045 2 DEBUG nova.compute.manager [req-2b428de2-60f7-4b3f-9466-e49c4a131eef req-716fff08-c653-4c7c-8ce5-2744687da469 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:32 np0005466012 nova_compute[192063]: 2025-10-02 12:28:32.045 2 DEBUG oslo_concurrency.lockutils [req-2b428de2-60f7-4b3f-9466-e49c4a131eef req-716fff08-c653-4c7c-8ce5-2744687da469 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:32 np0005466012 nova_compute[192063]: 2025-10-02 12:28:32.046 2 DEBUG oslo_concurrency.lockutils [req-2b428de2-60f7-4b3f-9466-e49c4a131eef req-716fff08-c653-4c7c-8ce5-2744687da469 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:32 np0005466012 nova_compute[192063]: 2025-10-02 12:28:32.046 2 DEBUG oslo_concurrency.lockutils [req-2b428de2-60f7-4b3f-9466-e49c4a131eef req-716fff08-c653-4c7c-8ce5-2744687da469 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:32 np0005466012 nova_compute[192063]: 2025-10-02 12:28:32.047 2 DEBUG nova.compute.manager [req-2b428de2-60f7-4b3f-9466-e49c4a131eef req-716fff08-c653-4c7c-8ce5-2744687da469 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] No waiting events found dispatching network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:28:32 np0005466012 nova_compute[192063]: 2025-10-02 12:28:32.047 2 WARNING nova.compute.manager [req-2b428de2-60f7-4b3f-9466-e49c4a131eef req-716fff08-c653-4c7c-8ce5-2744687da469 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received unexpected event network-vif-plugged-23e93cfb-aa99-4427-8af6-c199677a54ec for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:28:32 np0005466012 nova_compute[192063]: 2025-10-02 12:28:32.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:33 np0005466012 nova_compute[192063]: 2025-10-02 12:28:33.015 2 DEBUG nova.network.neutron [-] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:28:33 np0005466012 nova_compute[192063]: 2025-10-02 12:28:33.063 2 INFO nova.compute.manager [-] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Took 3.94 seconds to deallocate network for instance.#033[00m
Oct  2 08:28:33 np0005466012 nova_compute[192063]: 2025-10-02 12:28:33.155 2 DEBUG nova.compute.manager [req-7a62da7a-c3e6-4b65-9069-13a967c03bb6 req-a33e6397-8cc2-4182-9ee1-1419c1289240 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Received event network-vif-deleted-23e93cfb-aa99-4427-8af6-c199677a54ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:33 np0005466012 nova_compute[192063]: 2025-10-02 12:28:33.182 2 DEBUG oslo_concurrency.lockutils [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:33 np0005466012 nova_compute[192063]: 2025-10-02 12:28:33.182 2 DEBUG oslo_concurrency.lockutils [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:33 np0005466012 nova_compute[192063]: 2025-10-02 12:28:33.271 2 DEBUG nova.compute.provider_tree [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:28:33 np0005466012 nova_compute[192063]: 2025-10-02 12:28:33.322 2 DEBUG nova.scheduler.client.report [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:28:33 np0005466012 nova_compute[192063]: 2025-10-02 12:28:33.366 2 DEBUG oslo_concurrency.lockutils [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:33 np0005466012 nova_compute[192063]: 2025-10-02 12:28:33.414 2 INFO nova.scheduler.client.report [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Deleted allocations for instance de7f4178-00ba-409b-81ad-f6096e9ed144#033[00m
Oct  2 08:28:33 np0005466012 nova_compute[192063]: 2025-10-02 12:28:33.624 2 DEBUG oslo_concurrency.lockutils [None req-edf08da8-50fd-4c5b-a946-afe70f6ca032 abb9f220716e48d79dbe2f97622937c4 88e90c16adec46069b539d4f1431ab4d - - default default] Lock "de7f4178-00ba-409b-81ad-f6096e9ed144" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:33 np0005466012 nova_compute[192063]: 2025-10-02 12:28:33.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:34 np0005466012 nova_compute[192063]: 2025-10-02 12:28:34.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:35 np0005466012 podman[240523]: 2025-10-02 12:28:35.147416897 +0000 UTC m=+0.059440250 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:28:35 np0005466012 podman[240524]: 2025-10-02 12:28:35.181539457 +0000 UTC m=+0.091746900 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct  2 08:28:35 np0005466012 nova_compute[192063]: 2025-10-02 12:28:35.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:37 np0005466012 nova_compute[192063]: 2025-10-02 12:28:37.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:37 np0005466012 nova_compute[192063]: 2025-10-02 12:28:37.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:37 np0005466012 nova_compute[192063]: 2025-10-02 12:28:37.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:37 np0005466012 nova_compute[192063]: 2025-10-02 12:28:37.821 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:28:38 np0005466012 podman[240575]: 2025-10-02 12:28:38.148968411 +0000 UTC m=+0.065188238 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:28:38 np0005466012 podman[240574]: 2025-10-02 12:28:38.152008196 +0000 UTC m=+0.072342426 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:28:38 np0005466012 nova_compute[192063]: 2025-10-02 12:28:38.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:38 np0005466012 nova_compute[192063]: 2025-10-02 12:28:38.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:38 np0005466012 nova_compute[192063]: 2025-10-02 12:28:38.846 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:38 np0005466012 nova_compute[192063]: 2025-10-02 12:28:38.847 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:38 np0005466012 nova_compute[192063]: 2025-10-02 12:28:38.848 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:38 np0005466012 nova_compute[192063]: 2025-10-02 12:28:38.848 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:28:38 np0005466012 nova_compute[192063]: 2025-10-02 12:28:38.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:39 np0005466012 nova_compute[192063]: 2025-10-02 12:28:39.070 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:28:39 np0005466012 nova_compute[192063]: 2025-10-02 12:28:39.071 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5725MB free_disk=73.24610137939453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:28:39 np0005466012 nova_compute[192063]: 2025-10-02 12:28:39.072 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:39 np0005466012 nova_compute[192063]: 2025-10-02 12:28:39.072 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:39 np0005466012 nova_compute[192063]: 2025-10-02 12:28:39.173 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:28:39 np0005466012 nova_compute[192063]: 2025-10-02 12:28:39.173 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:28:39 np0005466012 nova_compute[192063]: 2025-10-02 12:28:39.209 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:28:39 np0005466012 nova_compute[192063]: 2025-10-02 12:28:39.230 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:28:39 np0005466012 nova_compute[192063]: 2025-10-02 12:28:39.259 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:28:39 np0005466012 nova_compute[192063]: 2025-10-02 12:28:39.260 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:42 np0005466012 nova_compute[192063]: 2025-10-02 12:28:42.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:43 np0005466012 nova_compute[192063]: 2025-10-02 12:28:43.929 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408108.928098, de7f4178-00ba-409b-81ad-f6096e9ed144 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:28:43 np0005466012 nova_compute[192063]: 2025-10-02 12:28:43.929 2 INFO nova.compute.manager [-] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:28:43 np0005466012 nova_compute[192063]: 2025-10-02 12:28:43.954 2 DEBUG nova.compute.manager [None req-cd3a91ab-017d-41ab-ae6b-0a8834caea43 - - - - - -] [instance: de7f4178-00ba-409b-81ad-f6096e9ed144] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:28:43 np0005466012 nova_compute[192063]: 2025-10-02 12:28:43.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:44 np0005466012 nova_compute[192063]: 2025-10-02 12:28:44.260 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:28:44 np0005466012 nova_compute[192063]: 2025-10-02 12:28:44.261 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:28:44 np0005466012 nova_compute[192063]: 2025-10-02 12:28:44.261 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:28:44 np0005466012 nova_compute[192063]: 2025-10-02 12:28:44.300 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:28:46 np0005466012 nova_compute[192063]: 2025-10-02 12:28:46.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:47 np0005466012 podman[240614]: 2025-10-02 12:28:47.165905595 +0000 UTC m=+0.072655934 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:28:47 np0005466012 podman[240615]: 2025-10-02 12:28:47.178730319 +0000 UTC m=+0.080514610 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:28:47 np0005466012 nova_compute[192063]: 2025-10-02 12:28:47.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:48 np0005466012 nova_compute[192063]: 2025-10-02 12:28:48.393 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:48 np0005466012 nova_compute[192063]: 2025-10-02 12:28:48.393 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:48 np0005466012 nova_compute[192063]: 2025-10-02 12:28:48.447 2 DEBUG nova.compute.manager [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:28:48 np0005466012 nova_compute[192063]: 2025-10-02 12:28:48.904 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:48 np0005466012 nova_compute[192063]: 2025-10-02 12:28:48.905 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:48 np0005466012 nova_compute[192063]: 2025-10-02 12:28:48.912 2 DEBUG nova.virt.hardware [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:28:48 np0005466012 nova_compute[192063]: 2025-10-02 12:28:48.912 2 INFO nova.compute.claims [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:28:48 np0005466012 nova_compute[192063]: 2025-10-02 12:28:48.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:49 np0005466012 nova_compute[192063]: 2025-10-02 12:28:49.279 2 DEBUG nova.compute.provider_tree [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:28:49 np0005466012 nova_compute[192063]: 2025-10-02 12:28:49.463 2 DEBUG nova.scheduler.client.report [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:28:49 np0005466012 nova_compute[192063]: 2025-10-02 12:28:49.738 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.833s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:49 np0005466012 nova_compute[192063]: 2025-10-02 12:28:49.739 2 DEBUG nova.compute.manager [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:28:50 np0005466012 nova_compute[192063]: 2025-10-02 12:28:50.158 2 DEBUG nova.compute.manager [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:28:50 np0005466012 nova_compute[192063]: 2025-10-02 12:28:50.159 2 DEBUG nova.network.neutron [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:28:50 np0005466012 podman[240655]: 2025-10-02 12:28:50.17903395 +0000 UTC m=+0.093520840 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:28:50 np0005466012 podman[240656]: 2025-10-02 12:28:50.181345704 +0000 UTC m=+0.082541687 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:28:50 np0005466012 nova_compute[192063]: 2025-10-02 12:28:50.267 2 INFO nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:28:50 np0005466012 nova_compute[192063]: 2025-10-02 12:28:50.648 2 DEBUG nova.compute.manager [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:28:50 np0005466012 nova_compute[192063]: 2025-10-02 12:28:50.886 2 DEBUG nova.policy [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.571 2 DEBUG nova.compute.manager [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.572 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.573 2 INFO nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Creating image(s)#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.573 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "/var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.574 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.574 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.587 2 DEBUG oslo_concurrency.processutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.642 2 DEBUG oslo_concurrency.processutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.643 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.644 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.659 2 DEBUG oslo_concurrency.processutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.735 2 DEBUG oslo_concurrency.processutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.736 2 DEBUG oslo_concurrency.processutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.890 2 DEBUG oslo_concurrency.processutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk 1073741824" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.891 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.891 2 DEBUG oslo_concurrency.processutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.948 2 DEBUG oslo_concurrency.processutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.950 2 DEBUG nova.virt.disk.api [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:28:51 np0005466012 nova_compute[192063]: 2025-10-02 12:28:51.950 2 DEBUG oslo_concurrency.processutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:28:52 np0005466012 nova_compute[192063]: 2025-10-02 12:28:52.005 2 DEBUG oslo_concurrency.processutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:28:52 np0005466012 nova_compute[192063]: 2025-10-02 12:28:52.007 2 DEBUG nova.virt.disk.api [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Cannot resize image /var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:28:52 np0005466012 nova_compute[192063]: 2025-10-02 12:28:52.007 2 DEBUG nova.objects.instance [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid 619c3e72-5be3-417a-8ae9-25ef4b63a50d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:28:52 np0005466012 nova_compute[192063]: 2025-10-02 12:28:52.146 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:28:52 np0005466012 nova_compute[192063]: 2025-10-02 12:28:52.146 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Ensure instance console log exists: /var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:28:52 np0005466012 nova_compute[192063]: 2025-10-02 12:28:52.147 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:28:52 np0005466012 nova_compute[192063]: 2025-10-02 12:28:52.147 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:28:52 np0005466012 nova_compute[192063]: 2025-10-02 12:28:52.148 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:28:52 np0005466012 nova_compute[192063]: 2025-10-02 12:28:52.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:53 np0005466012 nova_compute[192063]: 2025-10-02 12:28:53.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:54 np0005466012 nova_compute[192063]: 2025-10-02 12:28:54.051 2 DEBUG nova.network.neutron [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Successfully created port: ff0fff12-a049-4dbc-b470-6f1a5654244c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:28:56 np0005466012 nova_compute[192063]: 2025-10-02 12:28:56.444 2 DEBUG nova.network.neutron [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Successfully created port: 7a540c18-fe1e-41b7-b421-d10a6e2d3f73 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:28:57 np0005466012 nova_compute[192063]: 2025-10-02 12:28:57.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:58 np0005466012 nova_compute[192063]: 2025-10-02 12:28:58.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:28:59 np0005466012 nova_compute[192063]: 2025-10-02 12:28:59.065 2 DEBUG nova.network.neutron [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Successfully updated port: ff0fff12-a049-4dbc-b470-6f1a5654244c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:28:59 np0005466012 nova_compute[192063]: 2025-10-02 12:28:59.244 2 DEBUG nova.compute.manager [req-9849cf45-d040-4856-bb33-df41b20a6989 req-789d4bd2-7148-48c1-b491-5f0e0cdaf68a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-changed-ff0fff12-a049-4dbc-b470-6f1a5654244c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:28:59 np0005466012 nova_compute[192063]: 2025-10-02 12:28:59.245 2 DEBUG nova.compute.manager [req-9849cf45-d040-4856-bb33-df41b20a6989 req-789d4bd2-7148-48c1-b491-5f0e0cdaf68a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Refreshing instance network info cache due to event network-changed-ff0fff12-a049-4dbc-b470-6f1a5654244c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:28:59 np0005466012 nova_compute[192063]: 2025-10-02 12:28:59.245 2 DEBUG oslo_concurrency.lockutils [req-9849cf45-d040-4856-bb33-df41b20a6989 req-789d4bd2-7148-48c1-b491-5f0e0cdaf68a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:28:59 np0005466012 nova_compute[192063]: 2025-10-02 12:28:59.245 2 DEBUG oslo_concurrency.lockutils [req-9849cf45-d040-4856-bb33-df41b20a6989 req-789d4bd2-7148-48c1-b491-5f0e0cdaf68a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:28:59 np0005466012 nova_compute[192063]: 2025-10-02 12:28:59.245 2 DEBUG nova.network.neutron [req-9849cf45-d040-4856-bb33-df41b20a6989 req-789d4bd2-7148-48c1-b491-5f0e0cdaf68a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Refreshing network info cache for port ff0fff12-a049-4dbc-b470-6f1a5654244c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:28:59 np0005466012 nova_compute[192063]: 2025-10-02 12:28:59.958 2 DEBUG nova.network.neutron [req-9849cf45-d040-4856-bb33-df41b20a6989 req-789d4bd2-7148-48c1-b491-5f0e0cdaf68a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:29:00 np0005466012 nova_compute[192063]: 2025-10-02 12:29:00.459 2 DEBUG nova.network.neutron [req-9849cf45-d040-4856-bb33-df41b20a6989 req-789d4bd2-7148-48c1-b491-5f0e0cdaf68a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:00 np0005466012 nova_compute[192063]: 2025-10-02 12:29:00.518 2 DEBUG oslo_concurrency.lockutils [req-9849cf45-d040-4856-bb33-df41b20a6989 req-789d4bd2-7148-48c1-b491-5f0e0cdaf68a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:02.143 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:02.144 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:02.144 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:02 np0005466012 nova_compute[192063]: 2025-10-02 12:29:02.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:02 np0005466012 nova_compute[192063]: 2025-10-02 12:29:02.560 2 DEBUG nova.network.neutron [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Successfully updated port: 7a540c18-fe1e-41b7-b421-d10a6e2d3f73 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:29:02 np0005466012 nova_compute[192063]: 2025-10-02 12:29:02.589 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:02 np0005466012 nova_compute[192063]: 2025-10-02 12:29:02.589 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquired lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:02 np0005466012 nova_compute[192063]: 2025-10-02 12:29:02.590 2 DEBUG nova.network.neutron [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:29:02 np0005466012 nova_compute[192063]: 2025-10-02 12:29:02.912 2 DEBUG nova.compute.manager [req-464b67da-5781-4f5a-adac-3bfca6c1f54a req-9ddcd4e9-c633-4e60-94d7-d85ac627f6ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-changed-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:02 np0005466012 nova_compute[192063]: 2025-10-02 12:29:02.913 2 DEBUG nova.compute.manager [req-464b67da-5781-4f5a-adac-3bfca6c1f54a req-9ddcd4e9-c633-4e60-94d7-d85ac627f6ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Refreshing instance network info cache due to event network-changed-7a540c18-fe1e-41b7-b421-d10a6e2d3f73. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:29:02 np0005466012 nova_compute[192063]: 2025-10-02 12:29:02.913 2 DEBUG oslo_concurrency.lockutils [req-464b67da-5781-4f5a-adac-3bfca6c1f54a req-9ddcd4e9-c633-4e60-94d7-d85ac627f6ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:03 np0005466012 nova_compute[192063]: 2025-10-02 12:29:03.737 2 DEBUG nova.network.neutron [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:29:03 np0005466012 nova_compute[192063]: 2025-10-02 12:29:03.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:06 np0005466012 podman[240715]: 2025-10-02 12:29:06.144193128 +0000 UTC m=+0.065749844 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:29:06 np0005466012 podman[240716]: 2025-10-02 12:29:06.207536124 +0000 UTC m=+0.125911942 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0)
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.667 2 DEBUG nova.network.neutron [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Updating instance_info_cache with network_info: [{"id": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "address": "fa:16:3e:9f:b6:e1", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff0fff12-a0", "ovs_interfaceid": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "address": "fa:16:3e:a5:6c:ea", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea5:6cea", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a540c18-fe", "ovs_interfaceid": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.954 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Releasing lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.955 2 DEBUG nova.compute.manager [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Instance network_info: |[{"id": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "address": "fa:16:3e:9f:b6:e1", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff0fff12-a0", "ovs_interfaceid": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "address": "fa:16:3e:a5:6c:ea", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea5:6cea", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a540c18-fe", "ovs_interfaceid": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.956 2 DEBUG oslo_concurrency.lockutils [req-464b67da-5781-4f5a-adac-3bfca6c1f54a req-9ddcd4e9-c633-4e60-94d7-d85ac627f6ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.957 2 DEBUG nova.network.neutron [req-464b67da-5781-4f5a-adac-3bfca6c1f54a req-9ddcd4e9-c633-4e60-94d7-d85ac627f6ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Refreshing network info cache for port 7a540c18-fe1e-41b7-b421-d10a6e2d3f73 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.964 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Start _get_guest_xml network_info=[{"id": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "address": "fa:16:3e:9f:b6:e1", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff0fff12-a0", "ovs_interfaceid": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "address": "fa:16:3e:a5:6c:ea", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea5:6cea", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a540c18-fe", "ovs_interfaceid": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.971 2 WARNING nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.976 2 DEBUG nova.virt.libvirt.host [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.977 2 DEBUG nova.virt.libvirt.host [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.983 2 DEBUG nova.virt.libvirt.host [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.984 2 DEBUG nova.virt.libvirt.host [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.986 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.987 2 DEBUG nova.virt.hardware [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.988 2 DEBUG nova.virt.hardware [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.988 2 DEBUG nova.virt.hardware [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.989 2 DEBUG nova.virt.hardware [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.989 2 DEBUG nova.virt.hardware [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.990 2 DEBUG nova.virt.hardware [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.990 2 DEBUG nova.virt.hardware [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.991 2 DEBUG nova.virt.hardware [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.991 2 DEBUG nova.virt.hardware [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.991 2 DEBUG nova.virt.hardware [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.992 2 DEBUG nova.virt.hardware [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:29:07 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.999 2 DEBUG nova.virt.libvirt.vif [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:28:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-743984661',display_name='tempest-TestGettingAddress-server-743984661',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-743984661',id=129,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNuPkj9Dxk75qecKTcAv5uyglWO7VJAWxaaepcSp1a1k0dudaz78GBONfraj5VI3+kTW1O5IZNXJG+3u7RZIiYOa8MpOz6jMbMxt8apN5oZQHFIm8Zmccy1CmZp9PKlQBg==',key_name='tempest-TestGettingAddress-1196931169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-bfop6dze',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:28:50Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=619c3e72-5be3-417a-8ae9-25ef4b63a50d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "address": "fa:16:3e:9f:b6:e1", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff0fff12-a0", "ovs_interfaceid": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:07.999 2 DEBUG nova.network.os_vif_util [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "address": "fa:16:3e:9f:b6:e1", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff0fff12-a0", "ovs_interfaceid": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.001 2 DEBUG nova.network.os_vif_util [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:b6:e1,bridge_name='br-int',has_traffic_filtering=True,id=ff0fff12-a049-4dbc-b470-6f1a5654244c,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff0fff12-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.003 2 DEBUG nova.virt.libvirt.vif [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:28:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-743984661',display_name='tempest-TestGettingAddress-server-743984661',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-743984661',id=129,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNuPkj9Dxk75qecKTcAv5uyglWO7VJAWxaaepcSp1a1k0dudaz78GBONfraj5VI3+kTW1O5IZNXJG+3u7RZIiYOa8MpOz6jMbMxt8apN5oZQHFIm8Zmccy1CmZp9PKlQBg==',key_name='tempest-TestGettingAddress-1196931169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-bfop6dze',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:28:50Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=619c3e72-5be3-417a-8ae9-25ef4b63a50d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "address": "fa:16:3e:a5:6c:ea", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea5:6cea", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a540c18-fe", "ovs_interfaceid": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.004 2 DEBUG nova.network.os_vif_util [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "address": "fa:16:3e:a5:6c:ea", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea5:6cea", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a540c18-fe", "ovs_interfaceid": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.005 2 DEBUG nova.network.os_vif_util [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:6c:ea,bridge_name='br-int',has_traffic_filtering=True,id=7a540c18-fe1e-41b7-b421-d10a6e2d3f73,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a540c18-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.007 2 DEBUG nova.objects.instance [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 619c3e72-5be3-417a-8ae9-25ef4b63a50d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.159 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  <uuid>619c3e72-5be3-417a-8ae9-25ef4b63a50d</uuid>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  <name>instance-00000081</name>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestGettingAddress-server-743984661</nova:name>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:29:07</nova:creationTime>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:        <nova:user uuid="97ce9f1898484e0e9a1f7c84a9f0dfe3">tempest-TestGettingAddress-1355720650-project-member</nova:user>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:        <nova:project uuid="fd801958556f4c8aab047ecdef6b5ee8">tempest-TestGettingAddress-1355720650</nova:project>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:        <nova:port uuid="ff0fff12-a049-4dbc-b470-6f1a5654244c">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:        <nova:port uuid="7a540c18-fe1e-41b7-b421-d10a6e2d3f73">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fea5:6cea" ipVersion="6"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <entry name="serial">619c3e72-5be3-417a-8ae9-25ef4b63a50d</entry>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <entry name="uuid">619c3e72-5be3-417a-8ae9-25ef4b63a50d</entry>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.config"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:9f:b6:e1"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <target dev="tapff0fff12-a0"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:a5:6c:ea"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <target dev="tap7a540c18-fe"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/console.log" append="off"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:29:08 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:29:08 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:29:08 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:29:08 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.161 2 DEBUG nova.compute.manager [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Preparing to wait for external event network-vif-plugged-ff0fff12-a049-4dbc-b470-6f1a5654244c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.161 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.161 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.162 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.162 2 DEBUG nova.compute.manager [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Preparing to wait for external event network-vif-plugged-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.163 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.163 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.164 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.165 2 DEBUG nova.virt.libvirt.vif [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:28:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-743984661',display_name='tempest-TestGettingAddress-server-743984661',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-743984661',id=129,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNuPkj9Dxk75qecKTcAv5uyglWO7VJAWxaaepcSp1a1k0dudaz78GBONfraj5VI3+kTW1O5IZNXJG+3u7RZIiYOa8MpOz6jMbMxt8apN5oZQHFIm8Zmccy1CmZp9PKlQBg==',key_name='tempest-TestGettingAddress-1196931169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-bfop6dze',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:28:50Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=619c3e72-5be3-417a-8ae9-25ef4b63a50d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "address": "fa:16:3e:9f:b6:e1", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff0fff12-a0", "ovs_interfaceid": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.166 2 DEBUG nova.network.os_vif_util [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "address": "fa:16:3e:9f:b6:e1", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff0fff12-a0", "ovs_interfaceid": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.167 2 DEBUG nova.network.os_vif_util [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:b6:e1,bridge_name='br-int',has_traffic_filtering=True,id=ff0fff12-a049-4dbc-b470-6f1a5654244c,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff0fff12-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.168 2 DEBUG os_vif [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:b6:e1,bridge_name='br-int',has_traffic_filtering=True,id=ff0fff12-a049-4dbc-b470-6f1a5654244c,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff0fff12-a0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.170 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.171 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.177 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff0fff12-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.178 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapff0fff12-a0, col_values=(('external_ids', {'iface-id': 'ff0fff12-a049-4dbc-b470-6f1a5654244c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:b6:e1', 'vm-uuid': '619c3e72-5be3-417a-8ae9-25ef4b63a50d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:08 np0005466012 NetworkManager[51207]: <info>  [1759408148.1822] manager: (tapff0fff12-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/219)
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.190 2 INFO os_vif [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:b6:e1,bridge_name='br-int',has_traffic_filtering=True,id=ff0fff12-a049-4dbc-b470-6f1a5654244c,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff0fff12-a0')#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.191 2 DEBUG nova.virt.libvirt.vif [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:28:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-743984661',display_name='tempest-TestGettingAddress-server-743984661',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-743984661',id=129,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNuPkj9Dxk75qecKTcAv5uyglWO7VJAWxaaepcSp1a1k0dudaz78GBONfraj5VI3+kTW1O5IZNXJG+3u7RZIiYOa8MpOz6jMbMxt8apN5oZQHFIm8Zmccy1CmZp9PKlQBg==',key_name='tempest-TestGettingAddress-1196931169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-bfop6dze',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:28:50Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=619c3e72-5be3-417a-8ae9-25ef4b63a50d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "address": "fa:16:3e:a5:6c:ea", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea5:6cea", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a540c18-fe", "ovs_interfaceid": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.192 2 DEBUG nova.network.os_vif_util [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "address": "fa:16:3e:a5:6c:ea", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea5:6cea", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a540c18-fe", "ovs_interfaceid": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.193 2 DEBUG nova.network.os_vif_util [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:6c:ea,bridge_name='br-int',has_traffic_filtering=True,id=7a540c18-fe1e-41b7-b421-d10a6e2d3f73,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a540c18-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.193 2 DEBUG os_vif [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:6c:ea,bridge_name='br-int',has_traffic_filtering=True,id=7a540c18-fe1e-41b7-b421-d10a6e2d3f73,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a540c18-fe') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.194 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.195 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.199 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a540c18-fe, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.200 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7a540c18-fe, col_values=(('external_ids', {'iface-id': '7a540c18-fe1e-41b7-b421-d10a6e2d3f73', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a5:6c:ea', 'vm-uuid': '619c3e72-5be3-417a-8ae9-25ef4b63a50d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:08 np0005466012 NetworkManager[51207]: <info>  [1759408148.2036] manager: (tap7a540c18-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.211 2 INFO os_vif [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:6c:ea,bridge_name='br-int',has_traffic_filtering=True,id=7a540c18-fe1e-41b7-b421-d10a6e2d3f73,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a540c18-fe')#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.379 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.380 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.380 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:9f:b6:e1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.380 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:a5:6c:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:29:08 np0005466012 nova_compute[192063]: 2025-10-02 12:29:08.381 2 INFO nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Using config drive#033[00m
Oct  2 08:29:09 np0005466012 podman[240765]: 2025-10-02 12:29:09.171098862 +0000 UTC m=+0.077248010 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  2 08:29:09 np0005466012 podman[240764]: 2025-10-02 12:29:09.181120678 +0000 UTC m=+0.094736483 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, 
org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.231 2 INFO nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Creating config drive at /var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.config#033[00m
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.238 2 DEBUG oslo_concurrency.processutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp98iu9pxz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.372 2 DEBUG oslo_concurrency.processutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp98iu9pxz" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:09 np0005466012 kernel: tapff0fff12-a0: entered promiscuous mode
Oct  2 08:29:09 np0005466012 NetworkManager[51207]: <info>  [1759408149.4501] manager: (tapff0fff12-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/221)
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:09Z|00493|binding|INFO|Claiming lport ff0fff12-a049-4dbc-b470-6f1a5654244c for this chassis.
Oct  2 08:29:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:09Z|00494|binding|INFO|ff0fff12-a049-4dbc-b470-6f1a5654244c: Claiming fa:16:3e:9f:b6:e1 10.100.0.4
Oct  2 08:29:09 np0005466012 kernel: tap7a540c18-fe: entered promiscuous mode
Oct  2 08:29:09 np0005466012 NetworkManager[51207]: <info>  [1759408149.4667] manager: (tap7a540c18-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/222)
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:09 np0005466012 NetworkManager[51207]: <info>  [1759408149.4709] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Oct  2 08:29:09 np0005466012 NetworkManager[51207]: <info>  [1759408149.4723] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Oct  2 08:29:09 np0005466012 systemd-udevd[240824]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:29:09 np0005466012 systemd-udevd[240823]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:29:09 np0005466012 NetworkManager[51207]: <info>  [1759408149.4998] device (tap7a540c18-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:29:09 np0005466012 NetworkManager[51207]: <info>  [1759408149.5019] device (tapff0fff12-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:29:09 np0005466012 NetworkManager[51207]: <info>  [1759408149.5032] device (tap7a540c18-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:29:09 np0005466012 NetworkManager[51207]: <info>  [1759408149.5040] device (tapff0fff12-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:29:09 np0005466012 systemd-machined[152114]: New machine qemu-60-instance-00000081.
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.556 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:b6:e1 10.100.0.4'], port_security=['fa:16:3e:9f:b6:e1 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1049901a-232c-40d0-9fe6-646c9d087089', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a731a9f5-9e55-440a-a95e-a9a819598de7, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=ff0fff12-a049-4dbc-b470-6f1a5654244c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.557 103246 INFO neutron.agent.ovn.metadata.agent [-] Port ff0fff12-a049-4dbc-b470-6f1a5654244c in datapath d68eafa7-b35f-4bd9-ba11-e28a73bc7849 bound to our chassis#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.558 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d68eafa7-b35f-4bd9-ba11-e28a73bc7849#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.573 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d85246ee-bead-427f-b284-f637aedd3ea7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.575 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd68eafa7-b1 in ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.579 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd68eafa7-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.580 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7db11c9c-df2b-4480-bb40-b5e5ca49f828]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.580 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a0850857-50f8-410a-9994-1d30edb96030]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.592 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[18da5fcc-7a85-45d3-827f-793eb6dd0837]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 systemd[1]: Started Virtual Machine qemu-60-instance-00000081.
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:09Z|00495|binding|INFO|Claiming lport 7a540c18-fe1e-41b7-b421-d10a6e2d3f73 for this chassis.
Oct  2 08:29:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:09Z|00496|binding|INFO|7a540c18-fe1e-41b7-b421-d10a6e2d3f73: Claiming fa:16:3e:a5:6c:ea 2001:db8::f816:3eff:fea5:6cea
Oct  2 08:29:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:09Z|00497|binding|INFO|Setting lport ff0fff12-a049-4dbc-b470-6f1a5654244c ovn-installed in OVS
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.620 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[17d941ef-0e2a-49c1-b3ca-494367eaded7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:09Z|00498|binding|INFO|Setting lport 7a540c18-fe1e-41b7-b421-d10a6e2d3f73 ovn-installed in OVS
Oct  2 08:29:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:09Z|00499|binding|INFO|Setting lport 7a540c18-fe1e-41b7-b421-d10a6e2d3f73 up in Southbound
Oct  2 08:29:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:09Z|00500|binding|INFO|Setting lport ff0fff12-a049-4dbc-b470-6f1a5654244c up in Southbound
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.642 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:6c:ea 2001:db8::f816:3eff:fea5:6cea'], port_security=['fa:16:3e:a5:6c:ea 2001:db8::f816:3eff:fea5:6cea'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fea5:6cea/64', 'neutron:device_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85092873-751b-414a-a9a1-112c2e61cb13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1049901a-232c-40d0-9fe6-646c9d087089', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b1cb8f94-a0b5-458e-a15a-45916ae4369f, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=7a540c18-fe1e-41b7-b421-d10a6e2d3f73) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.652 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[df73c8a7-c1ea-4146-a45d-d0ffe5394f67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.661 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6e46269e-c64a-4cb8-8a4f-7c5a756aaa48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 NetworkManager[51207]: <info>  [1759408149.6620] manager: (tapd68eafa7-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/225)
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.705 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[b7f97d14-909c-4903-ac69-350a9e9e8db8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.710 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[024f173b-caf1-415b-b57e-3eb65c8e0510]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 NetworkManager[51207]: <info>  [1759408149.7413] device (tapd68eafa7-b0): carrier: link connected
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.749 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[aa597786-a676-45dd-be8d-0db2e2abbbf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.773 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a64302-bbaf-4a00-b0f6-9a04ee6ec626]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd68eafa7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:32:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614335, 'reachable_time': 27321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240860, 'error': None, 'target': 'ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.799 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a6af818f-686d-4643-9236-80c5a11d2d4f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe10:32d2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 614335, 'tstamp': 614335}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240861, 'error': None, 'target': 'ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.826 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[63ab9760-5e2a-445f-9f2c-845dc9fceb1e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd68eafa7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:32:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614335, 'reachable_time': 27321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240862, 'error': None, 'target': 'ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.875 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[706df65d-6dbc-4dc6-a86e-19c7dd268775]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.959 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c3cb4347-7a5f-446f-bf12-071080242191]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.964 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd68eafa7-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.965 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.966 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd68eafa7-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:09 np0005466012 NetworkManager[51207]: <info>  [1759408149.9692] manager: (tapd68eafa7-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Oct  2 08:29:09 np0005466012 kernel: tapd68eafa7-b0: entered promiscuous mode
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.973 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd68eafa7-b0, col_values=(('external_ids', {'iface-id': '41abd6d0-501e-436d-9123-d8936335a0b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:09Z|00501|binding|INFO|Releasing lport 41abd6d0-501e-436d-9123-d8936335a0b5 from this chassis (sb_readonly=0)
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.978 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d68eafa7-b35f-4bd9-ba11-e28a73bc7849.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d68eafa7-b35f-4bd9-ba11-e28a73bc7849.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.979 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6a87f8b8-09ca-4551-b091-538c65adbfd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.981 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-d68eafa7-b35f-4bd9-ba11-e28a73bc7849
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/d68eafa7-b35f-4bd9-ba11-e28a73bc7849.pid.haproxy
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID d68eafa7-b35f-4bd9-ba11-e28a73bc7849
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:29:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:09.983 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'env', 'PROCESS_TAG=haproxy-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d68eafa7-b35f-4bd9-ba11-e28a73bc7849.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:29:09 np0005466012 nova_compute[192063]: 2025-10-02 12:29:09.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.116 2 DEBUG nova.compute.manager [req-a27bdb35-cc64-44c7-a3ec-4ebfcbea8c5a req-1e6bfc34-f181-4fbd-b344-cf594887b717 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-vif-plugged-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.116 2 DEBUG oslo_concurrency.lockutils [req-a27bdb35-cc64-44c7-a3ec-4ebfcbea8c5a req-1e6bfc34-f181-4fbd-b344-cf594887b717 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.116 2 DEBUG oslo_concurrency.lockutils [req-a27bdb35-cc64-44c7-a3ec-4ebfcbea8c5a req-1e6bfc34-f181-4fbd-b344-cf594887b717 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.117 2 DEBUG oslo_concurrency.lockutils [req-a27bdb35-cc64-44c7-a3ec-4ebfcbea8c5a req-1e6bfc34-f181-4fbd-b344-cf594887b717 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.117 2 DEBUG nova.compute.manager [req-a27bdb35-cc64-44c7-a3ec-4ebfcbea8c5a req-1e6bfc34-f181-4fbd-b344-cf594887b717 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Processing event network-vif-plugged-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.248 2 DEBUG nova.compute.manager [req-ee8cee0f-fce3-44de-9796-7b0c390065c1 req-a1e74193-af05-4e68-a32b-9b7cb162d76d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-vif-plugged-ff0fff12-a049-4dbc-b470-6f1a5654244c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.249 2 DEBUG oslo_concurrency.lockutils [req-ee8cee0f-fce3-44de-9796-7b0c390065c1 req-a1e74193-af05-4e68-a32b-9b7cb162d76d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.249 2 DEBUG oslo_concurrency.lockutils [req-ee8cee0f-fce3-44de-9796-7b0c390065c1 req-a1e74193-af05-4e68-a32b-9b7cb162d76d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.250 2 DEBUG oslo_concurrency.lockutils [req-ee8cee0f-fce3-44de-9796-7b0c390065c1 req-a1e74193-af05-4e68-a32b-9b7cb162d76d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.250 2 DEBUG nova.compute.manager [req-ee8cee0f-fce3-44de-9796-7b0c390065c1 req-a1e74193-af05-4e68-a32b-9b7cb162d76d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Processing event network-vif-plugged-ff0fff12-a049-4dbc-b470-6f1a5654244c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:29:10 np0005466012 podman[240892]: 2025-10-02 12:29:10.362418104 +0000 UTC m=+0.060644713 container create 4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 08:29:10 np0005466012 systemd[1]: Started libpod-conmon-4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86.scope.
Oct  2 08:29:10 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:29:10 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e60832e9edb6745632176569bbc436f8e8dd0cc9790dc7596f2fddd6acc50ca1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:29:10 np0005466012 podman[240892]: 2025-10-02 12:29:10.331544953 +0000 UTC m=+0.029771602 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:29:10 np0005466012 podman[240892]: 2025-10-02 12:29:10.437143543 +0000 UTC m=+0.135370182 container init 4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:29:10 np0005466012 podman[240892]: 2025-10-02 12:29:10.443315163 +0000 UTC m=+0.141541772 container start 4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:29:10 np0005466012 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[240908]: [NOTICE]   (240918) : New worker (240920) forked
Oct  2 08:29:10 np0005466012 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[240908]: [NOTICE]   (240918) : Loading success.
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.572 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 7a540c18-fe1e-41b7-b421-d10a6e2d3f73 in datapath 85092873-751b-414a-a9a1-112c2e61cb13 unbound from our chassis#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.575 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 85092873-751b-414a-a9a1-112c2e61cb13#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.590 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b483e811-69c1-4e8a-add2-7077665fec8f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.592 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap85092873-71 in ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.595 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap85092873-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.596 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fe0d0769-a437-4394-9fe3-94557fe10e9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.598 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[73d3f7fb-ffa3-4b51-8c66-bb954128c37c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.614 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[e0bdabea-71b7-4603-9603-fd303f79b3a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.630 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[96f66b43-51b8-4bc7-8700-1b71806b92c8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.674 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ec28983e-6d28-409e-ba53-b4e5d2789c73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 NetworkManager[51207]: <info>  [1759408150.6825] manager: (tap85092873-70): new Veth device (/org/freedesktop/NetworkManager/Devices/227)
Oct  2 08:29:10 np0005466012 systemd-udevd[240853]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.681 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[06e7a51f-a4ae-46de-be1e-73e6b7dcbb4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.724 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[b7ae39d7-14cc-4d7b-9cdf-e7064fcb21fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.727 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[5aeaf644-8aa3-49e9-86f8-b2b64bc3e251]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 NetworkManager[51207]: <info>  [1759408150.7664] device (tap85092873-70): carrier: link connected
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.771 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[3025397e-2e20-4e7a-9142-9ad837390875]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.799 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e3dd7403-23e5-4cc1-8ae0-8705be0acc06]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85092873-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:05:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614437, 'reachable_time': 40747, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240940, 'error': None, 'target': 'ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.822 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0e6d895c-ca0d-47d5-b683-8f00a094f629]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe77:5f4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 614437, 'tstamp': 614437}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240941, 'error': None, 'target': 'ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.843 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7f27d0cd-c380-439b-b333-3f5de8432755]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85092873-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:05:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614437, 'reachable_time': 40747, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240942, 'error': None, 'target': 'ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.875 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5b6671be-5a6b-4c3f-acbb-dbb37010e57c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.920 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dae65d2f-5170-43f1-9228-04971a5667d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.922 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85092873-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.922 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.923 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85092873-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:10 np0005466012 NetworkManager[51207]: <info>  [1759408150.9251] manager: (tap85092873-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/228)
Oct  2 08:29:10 np0005466012 kernel: tap85092873-70: entered promiscuous mode
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.931 2 DEBUG nova.compute.manager [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.931 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap85092873-70, col_values=(('external_ids', {'iface-id': '8baac968-8bfc-4d6e-93cc-be5861eaa459'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:10Z|00502|binding|INFO|Releasing lport 8baac968-8bfc-4d6e-93cc-be5861eaa459 from this chassis (sb_readonly=0)
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.932 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408150.930872, 619c3e72-5be3-417a-8ae9-25ef4b63a50d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.935 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] VM Started (Lifecycle Event)#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.938 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/85092873-751b-414a-a9a1-112c2e61cb13.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/85092873-751b-414a-a9a1-112c2e61cb13.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.938 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[409d9e9f-c482-4b80-a953-797619eb0f7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.939 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-85092873-751b-414a-a9a1-112c2e61cb13
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/85092873-751b-414a-a9a1-112c2e61cb13.pid.haproxy
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 85092873-751b-414a-a9a1-112c2e61cb13
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:29:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:10.939 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13', 'env', 'PROCESS_TAG=haproxy-85092873-751b-414a-a9a1-112c2e61cb13', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/85092873-751b-414a-a9a1-112c2e61cb13.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.940 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.946 2 INFO nova.virt.libvirt.driver [-] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Instance spawned successfully.#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.947 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:29:10 np0005466012 nova_compute[192063]: 2025-10-02 12:29:10.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.014 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.019 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.073 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.074 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.074 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.074 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.075 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.075 2 DEBUG nova.virt.libvirt.driver [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.079 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.080 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408150.9325023, 619c3e72-5be3-417a-8ae9-25ef4b63a50d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.080 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.155 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.158 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408150.9376676, 619c3e72-5be3-417a-8ae9-25ef4b63a50d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.159 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.268 2 DEBUG nova.network.neutron [req-464b67da-5781-4f5a-adac-3bfca6c1f54a req-9ddcd4e9-c633-4e60-94d7-d85ac627f6ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Updated VIF entry in instance network info cache for port 7a540c18-fe1e-41b7-b421-d10a6e2d3f73. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.269 2 DEBUG nova.network.neutron [req-464b67da-5781-4f5a-adac-3bfca6c1f54a req-9ddcd4e9-c633-4e60-94d7-d85ac627f6ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Updating instance_info_cache with network_info: [{"id": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "address": "fa:16:3e:9f:b6:e1", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff0fff12-a0", "ovs_interfaceid": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "address": "fa:16:3e:a5:6c:ea", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea5:6cea", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a540c18-fe", "ovs_interfaceid": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.280 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.285 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:29:11 np0005466012 podman[240972]: 2025-10-02 12:29:11.393254782 +0000 UTC m=+0.073226620 container create 71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.425 2 DEBUG oslo_concurrency.lockutils [req-464b67da-5781-4f5a-adac-3bfca6c1f54a req-9ddcd4e9-c633-4e60-94d7-d85ac627f6ab 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:11 np0005466012 podman[240972]: 2025-10-02 12:29:11.350093352 +0000 UTC m=+0.030065240 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:29:11 np0005466012 systemd[1]: Started libpod-conmon-71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963.scope.
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.468 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.486 2 INFO nova.compute.manager [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Took 19.91 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.486 2 DEBUG nova.compute.manager [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:11 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:29:11 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/174aafcc9d5d50d6757543c8314645826d04dc686ea822120ce859f4516ce5d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:29:11 np0005466012 podman[240972]: 2025-10-02 12:29:11.513662001 +0000 UTC m=+0.193633829 container init 71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:29:11 np0005466012 podman[240972]: 2025-10-02 12:29:11.525202999 +0000 UTC m=+0.205174807 container start 71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  2 08:29:11 np0005466012 neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13[240987]: [NOTICE]   (240991) : New worker (240993) forked
Oct  2 08:29:11 np0005466012 neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13[240987]: [NOTICE]   (240991) : Loading success.
Oct  2 08:29:11 np0005466012 nova_compute[192063]: 2025-10-02 12:29:11.950 2 INFO nova.compute.manager [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Took 23.20 seconds to build instance.#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.310 2 DEBUG oslo_concurrency.lockutils [None req-caed997e-95d4-4bd8-9d83-a63a29bcc237 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.916s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.340 2 DEBUG nova.compute.manager [req-a11ba80f-44c5-4002-8c13-df766f3d3e8d req-e639c0b2-6551-4399-ac27-7282b654018e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-vif-plugged-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.340 2 DEBUG oslo_concurrency.lockutils [req-a11ba80f-44c5-4002-8c13-df766f3d3e8d req-e639c0b2-6551-4399-ac27-7282b654018e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.341 2 DEBUG oslo_concurrency.lockutils [req-a11ba80f-44c5-4002-8c13-df766f3d3e8d req-e639c0b2-6551-4399-ac27-7282b654018e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.341 2 DEBUG oslo_concurrency.lockutils [req-a11ba80f-44c5-4002-8c13-df766f3d3e8d req-e639c0b2-6551-4399-ac27-7282b654018e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.341 2 DEBUG nova.compute.manager [req-a11ba80f-44c5-4002-8c13-df766f3d3e8d req-e639c0b2-6551-4399-ac27-7282b654018e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] No waiting events found dispatching network-vif-plugged-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.341 2 WARNING nova.compute.manager [req-a11ba80f-44c5-4002-8c13-df766f3d3e8d req-e639c0b2-6551-4399-ac27-7282b654018e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received unexpected event network-vif-plugged-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.452 2 DEBUG nova.compute.manager [req-7c651813-024f-432c-923f-63a9cecff486 req-a5d793e2-31cb-424a-b867-20471144108f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-vif-plugged-ff0fff12-a049-4dbc-b470-6f1a5654244c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.453 2 DEBUG oslo_concurrency.lockutils [req-7c651813-024f-432c-923f-63a9cecff486 req-a5d793e2-31cb-424a-b867-20471144108f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.453 2 DEBUG oslo_concurrency.lockutils [req-7c651813-024f-432c-923f-63a9cecff486 req-a5d793e2-31cb-424a-b867-20471144108f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.454 2 DEBUG oslo_concurrency.lockutils [req-7c651813-024f-432c-923f-63a9cecff486 req-a5d793e2-31cb-424a-b867-20471144108f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.454 2 DEBUG nova.compute.manager [req-7c651813-024f-432c-923f-63a9cecff486 req-a5d793e2-31cb-424a-b867-20471144108f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] No waiting events found dispatching network-vif-plugged-ff0fff12-a049-4dbc-b470-6f1a5654244c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:12 np0005466012 nova_compute[192063]: 2025-10-02 12:29:12.455 2 WARNING nova.compute.manager [req-7c651813-024f-432c-923f-63a9cecff486 req-a5d793e2-31cb-424a-b867-20471144108f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received unexpected event network-vif-plugged-ff0fff12-a049-4dbc-b470-6f1a5654244c for instance with vm_state active and task_state None.#033[00m
Oct  2 08:29:13 np0005466012 nova_compute[192063]: 2025-10-02 12:29:13.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:16 np0005466012 nova_compute[192063]: 2025-10-02 12:29:16.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.926 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'name': 'tempest-TestGettingAddress-server-743984661', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000081', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'hostId': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.944 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.945 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7260306-0814-4beb-b337-b208002ef2a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-vda', 'timestamp': '2025-10-02T12:29:16.927046', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '699a046c-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.603406025, 'message_signature': '66d5b20f8c86e4285f30821748c789f3461cacfff24c9fff072b9e35aa611c4f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-sda', 'timestamp': '2025-10-02T12:29:16.927046', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '699a1204-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.603406025, 'message_signature': '8044f201b0ffc3a0534a9f8f79ec5316dd147987087729b620017261d217a36d'}]}, 'timestamp': '2025-10-02 12:29:16.946218', '_unique_id': '0639fb7aa5774941aad5c7b90ebc5224'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.947 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.948 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.951 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 619c3e72-5be3-417a-8ae9-25ef4b63a50d / tapff0fff12-a0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.952 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 619c3e72-5be3-417a-8ae9-25ef4b63a50d / tap7a540c18-fe inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.952 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.952 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b66d4f94-4498-400c-b7a8-57f6405884dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tapff0fff12-a0', 'timestamp': '2025-10-02T12:29:16.948588', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tapff0fff12-a0', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:b6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff0fff12-a0'}, 'message_id': '699b0e5c-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': 'a6b4ebcc111a10faced36c04122b2040429f97e97aeabb5e14832ccbcb7c6bcd'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tap7a540c18-fe', 'timestamp': '2025-10-02T12:29:16.948588', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tap7a540c18-fe', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:6c:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a540c18-fe'}, 'message_id': '699b1bae-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': 'd9a37b4697476c09b6e84172e7c85901cfcada0a320ded316cfe3cae993aa23a'}]}, 'timestamp': '2025-10-02 12:29:16.952998', '_unique_id': '3596e5582c6b479a91f5774718d3644f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.954 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.955 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.955 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.955 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43d6c1ee-83ff-4ddc-b683-ba5f52241534', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tapff0fff12-a0', 'timestamp': '2025-10-02T12:29:16.955150', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tapff0fff12-a0', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:b6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff0fff12-a0'}, 'message_id': '699b7ba8-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': 'b250863701dc26a3eb37c596e4e443ac279cb4fbcf55c6f091d25194a8a3fe0e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tap7a540c18-fe', 'timestamp': '2025-10-02T12:29:16.955150', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tap7a540c18-fe', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:6c:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a540c18-fe'}, 'message_id': '699b85f8-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '810c62f4df188bc54c8816c40dba09a81fa1292afa3e7927e802b4765f9cc9c1'}]}, 'timestamp': '2025-10-02 12:29:16.955717', '_unique_id': '9bd1f693e2724e1093ea38b13bb24be2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.956 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.957 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.957 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.957 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-743984661>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-743984661>]
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.957 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.957 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43d8e712-dbcc-462f-8067-a0163baf7be2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tapff0fff12-a0', 'timestamp': '2025-10-02T12:29:16.957753', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tapff0fff12-a0', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:b6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff0fff12-a0'}, 'message_id': '699be0de-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '63573dc53fb003377e7349d6d14c30fa93ed522bdb30dc88004cd3656b4a09ac'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tap7a540c18-fe', 'timestamp': '2025-10-02T12:29:16.957753', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tap7a540c18-fe', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:6c:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a540c18-fe'}, 'message_id': '699beac0-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '04e613ab6eaa8abf54a74fa801fa6b2b3bbb5ec2488dd9427d7c8ab759486643'}]}, 'timestamp': '2025-10-02 12:29:16.958280', '_unique_id': '02e6d7b2240d42d9a705b5decbd2b239'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.958 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.959 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.959 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd437195-6ee0-4d22-abd7-df225617cb57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tapff0fff12-a0', 'timestamp': '2025-10-02T12:29:16.959859', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tapff0fff12-a0', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:b6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff0fff12-a0'}, 'message_id': '699c32fa-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '77ec9e4d7487857bbb300a2467a2c4401aca7eb7bbecf787dfb0416de869a8f0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tap7a540c18-fe', 'timestamp': '2025-10-02T12:29:16.959859', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tap7a540c18-fe', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:6c:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a540c18-fe'}, 'message_id': '699c3c96-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '01127ade88c405110fedb1ddc1834806d9c9609cf0f70ec800a501c343e043a4'}]}, 'timestamp': '2025-10-02 12:29:16.960374', '_unique_id': '98b6b95e49fd45f7bc194cab90e24865'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.960 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.961 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.961 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e355f37-803b-4b59-9ab4-ed2870d7e7c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-vda', 'timestamp': '2025-10-02T12:29:16.961926', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '699c8368-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.603406025, 'message_signature': '842dc93f2a8cbb6922fef763c2c6ff5c63203974e6da521f5eb04db257e91e06'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 
'619c3e72-5be3-417a-8ae9-25ef4b63a50d-sda', 'timestamp': '2025-10-02T12:29:16.961926', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '699c8db8-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.603406025, 'message_signature': 'ef2b99ce3ecd7c253a75d7114175d4e85cf47ce9b0571bf46d7356a14ee1344f'}]}, 'timestamp': '2025-10-02 12:29:16.962446', '_unique_id': '3ce1bfa07b5b49de8bada2a35602d715'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.962 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.963 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.963 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.964 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '989ce424-659f-43d2-9f9c-10997055de93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tapff0fff12-a0', 'timestamp': '2025-10-02T12:29:16.963942', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tapff0fff12-a0', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:b6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff0fff12-a0'}, 'message_id': '699cd322-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '0c491ee463d73fae186fd0597f47620ee740230f6342faa93957cac330685008'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tap7a540c18-fe', 'timestamp': '2025-10-02T12:29:16.963942', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tap7a540c18-fe', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:6c:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a540c18-fe'}, 'message_id': '699cdf84-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': 'bde807c99eb84144a57f8f9699ebdda4d881c886c96fe859b65d86c4f585dfaa'}]}, 'timestamp': '2025-10-02 12:29:16.964587', '_unique_id': '7372a9ef6c884149990558ccb4971830'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.965 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.966 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.966 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.966 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25bacb3e-7e4e-4234-a7f5-61442c0b43be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-vda', 'timestamp': '2025-10-02T12:29:16.966110', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '699d275a-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.603406025, 'message_signature': '47bef82cb19a5660a8617a8f93f6b5b73e92fb60fd15c4eadeed2e846b13aae7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-sda', 'timestamp': '2025-10-02T12:29:16.966110', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '699d30d8-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.603406025, 'message_signature': 'e6ff4590c251754d92c10074776325e23b39acadd01af9125eb0e2ca1e095ea8'}]}, 'timestamp': '2025-10-02 12:29:16.966622', '_unique_id': '9479131ffbb64e0693398a56779c6e96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.967 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.968 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.968 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.968 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c237fdaf-0bf6-483b-99a0-8c663b630c30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tapff0fff12-a0', 'timestamp': '2025-10-02T12:29:16.968321', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tapff0fff12-a0', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:b6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff0fff12-a0'}, 'message_id': '699d7eee-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': 'b003eebcb21ecb557675acfacf8ca7190cd5cd0a423746459c861d20ebbda44f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tap7a540c18-fe', 'timestamp': '2025-10-02T12:29:16.968321', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tap7a540c18-fe', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:6c:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a540c18-fe'}, 'message_id': '699d8a1a-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '3f0f41d8ebf263e6e245630daa2c5bdfab9bde9bf52cf4ef6a44a839b1a10f2b'}]}, 'timestamp': '2025-10-02 12:29:16.968914', '_unique_id': '65b0fcf3af0a43b29d81992195994f60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.969 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.970 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.970 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.970 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24edfc61-6723-41f6-8702-748a7ea903ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tapff0fff12-a0', 'timestamp': '2025-10-02T12:29:16.970394', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tapff0fff12-a0', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:b6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff0fff12-a0'}, 'message_id': '699dce08-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '150ebc304bba6eaaad6efcf8cb786b29e5634dfae43efceac9c5de33dffa13c9'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tap7a540c18-fe', 'timestamp': '2025-10-02T12:29:16.970394', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tap7a540c18-fe', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:6c:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a540c18-fe'}, 'message_id': '699dd84e-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '50719beb76fafda28557e6802832e84bc9fd4098c0e886ad1bcc343fbb52599b'}]}, 'timestamp': '2025-10-02 12:29:16.970925', '_unique_id': '9380e9b544c24ae8bbf57be8c10c432b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.971 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.972 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.981 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.981 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e54ec607-3ef4-4516-b5cc-0abccb6f1e48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-vda', 'timestamp': '2025-10-02T12:29:16.972420', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '699f7bb8-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.648797817, 'message_signature': '798349ae8a3273363cf799009b310f1df1a8b5acbb33f56084c7b32c6cf111e0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-sda', 'timestamp': '2025-10-02T12:29:16.972420', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '699f86b2-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.648797817, 'message_signature': 'b43e9651f70ef6743d64ad95bbb7692c449d078a13ec42cb6c1b54e8793382fc'}]}, 'timestamp': '2025-10-02 12:29:16.981930', '_unique_id': 'c923bd5e07d34260a3594111fae105bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.982 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.983 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.983 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.984 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-743984661>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-743984661>]
Oct  2 08:29:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:16.984 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.000 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.000 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 619c3e72-5be3-417a-8ae9-25ef4b63a50d: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.001 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.001 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.001 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-743984661>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-743984661>]
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.001 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.001 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.read.latency volume: 433068866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.002 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.read.latency volume: 2275522 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '860cb86c-f552-4167-839e-8802fd0a8e74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 433068866, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-vda', 'timestamp': '2025-10-02T12:29:17.001906', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69a2a126-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.603406025, 'message_signature': '29dd581763be9260fa5f0379fcd785bc05d819f1d19e7be9f7d42aef1007085a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2275522, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-sda', 'timestamp': '2025-10-02T12:29:17.001906', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69a2aeaa-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.603406025, 'message_signature': '44e49eb5b91a91b43774e54a6f0059b5eb65da656803583ddbd502f65c85e777'}]}, 'timestamp': '2025-10-02 12:29:17.002637', '_unique_id': 'a67bdc34aa8a424292c16476b9299e95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.003 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.004 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.004 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.005 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-743984661>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-743984661>]
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.005 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.005 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.005 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96378d95-a99e-4f04-965b-5b609c8f8d3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-vda', 'timestamp': '2025-10-02T12:29:17.005398', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69a32768-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.603406025, 'message_signature': 'ab1ec7325698eade7b45089c116257146a2aeb2f1d13e36f024688ba13845ca5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-sda', 'timestamp': '2025-10-02T12:29:17.005398', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69a3355a-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.603406025, 'message_signature': '55580841673ee53f8fa9ab746f8485e8b07a6bbc8934861cbb1100dae8effe5f'}]}, 'timestamp': '2025-10-02 12:29:17.006103', '_unique_id': 'e0ba9e3c245d4b819ac4450764eb64f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.006 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.007 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.008 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.008 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69ac7198-d3e7-466a-84b2-9bb3a9807ca7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tapff0fff12-a0', 'timestamp': '2025-10-02T12:29:17.008011', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tapff0fff12-a0', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:b6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff0fff12-a0'}, 'message_id': '69a38ec4-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '1375f0bced8dac05f345a65ccdc65fd835eec2add724be869f7d64229391f20b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tap7a540c18-fe', 'timestamp': '2025-10-02T12:29:17.008011', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tap7a540c18-fe', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:6c:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a540c18-fe'}, 'message_id': '69a39c0c-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '8e7bdda65c120e147924f5627adcb41539984f1dcb2d42b17974113562ee51bc'}]}, 'timestamp': '2025-10-02 12:29:17.008760', '_unique_id': 'e858b159ae1f46f1a653661a2341baee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.009 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.010 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.010 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/cpu volume: 5790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90cdec32-91f4-47fa-b45f-ec605ee11016', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5790000000, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'timestamp': '2025-10-02T12:29:17.010681', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '69a3f72e-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.67685027, 'message_signature': 'a302f9e2009c4c48331a940300da867eb06741a6ad64af65d3f52e9c368d2681'}]}, 'timestamp': '2025-10-02 12:29:17.011068', '_unique_id': '0c7bc500fb254af2be63e5d5186c6de2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.011 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.013 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.013 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '521fec50-8ffd-479f-894a-90a8d603cb10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tapff0fff12-a0', 'timestamp': '2025-10-02T12:29:17.013126', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tapff0fff12-a0', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:b6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff0fff12-a0'}, 'message_id': '69a4555c-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '75cb70962666d472beb4733175736fd3d66311fa66ffebfce511072aedececc9'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tap7a540c18-fe', 'timestamp': '2025-10-02T12:29:17.013126', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tap7a540c18-fe', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:6c:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a540c18-fe'}, 'message_id': '69a46286-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': 'c118f4be6568a51e40e5df8bb2057bc21fe92ceff053efb59cf53fffc4a838f5'}]}, 'timestamp': '2025-10-02 12:29:17.013839', '_unique_id': 'b7423ce7a4a149b5bb57c66ebd0cb08f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.014 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.015 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.015 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.016 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de3067b8-a292-4ad5-878d-78ae493eca69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tapff0fff12-a0', 'timestamp': '2025-10-02T12:29:17.015918', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tapff0fff12-a0', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:b6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff0fff12-a0'}, 'message_id': '69a4c636-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '530d743958b5d679ccaa62bbe2495f7c601d491981ad644fb8e82e9951952463'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-00000081-619c3e72-5be3-417a-8ae9-25ef4b63a50d-tap7a540c18-fe', 'timestamp': '2025-10-02T12:29:17.015918', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'tap7a540c18-fe', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:6c:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a540c18-fe'}, 'message_id': '69a4d3b0-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.62496791, 'message_signature': '553cdbedf5022379dc7e4d2402abc8edf1603f2a6b13e9e0d86366930d3c54cd'}]}, 'timestamp': '2025-10-02 12:29:17.016740', '_unique_id': 'a3515fd0716048fbaf208f889cdc536b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.017 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.018 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.018 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.019 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91f3ff4a-ced1-4e40-9790-eecfa3e271ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-vda', 'timestamp': '2025-10-02T12:29:17.018868', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69a535c6-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.648797817, 'message_signature': '8dac805bdf7ff90caa1c67d6fef2a1a334f650230e11158b6b560fd952617c25'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 
'619c3e72-5be3-417a-8ae9-25ef4b63a50d-sda', 'timestamp': '2025-10-02T12:29:17.018868', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69a5426e-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.648797817, 'message_signature': '117f4a11b4b210501d20d67f6baa7f63ff5f0ab05d7ec6242c6ec9223c46ca31'}]}, 'timestamp': '2025-10-02 12:29:17.019542', '_unique_id': '234edf1854504a2889a2e4c6404aa95d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.020 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.021 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.021 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.021 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ec0ae99-7180-4652-8a3c-43489931b434', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-vda', 'timestamp': '2025-10-02T12:29:17.021521', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69a59e1c-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.648797817, 'message_signature': 'acbbb171018de8c007466e3423f0a18fb5a60ae00103e1a4e7e2d0c31324552e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 
'619c3e72-5be3-417a-8ae9-25ef4b63a50d-sda', 'timestamp': '2025-10-02T12:29:17.021521', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69a5ab1e-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.648797817, 'message_signature': '7e17d8a2665411d69e4416b0ca7669ff89918abd899101c83352471cac46f4df'}]}, 'timestamp': '2025-10-02 12:29:17.022223', '_unique_id': 'ab73db47dc354a66adf930d6c3c56d6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.024 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.024 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.024 12 DEBUG ceilometer.compute.pollsters [-] 619c3e72-5be3-417a-8ae9-25ef4b63a50d/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd638b09d-b83a-4d8f-bbfc-cb3b5ded0c65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-vda', 'timestamp': '2025-10-02T12:29:17.024325', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69a60af0-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.603406025, 'message_signature': 'f301684c2f17d01b4a1cab116583addaeb4500119b1a72dec52b0e49ef787fb7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d-sda', 'timestamp': '2025-10-02T12:29:17.024325', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-743984661', 'name': 'instance-00000081', 'instance_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69a618ba-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6150.603406025, 'message_signature': '384085c6126585e6b86c03a2178d2817aa31a20b69a0267a84a5fb4765e2c2f5'}]}, 'timestamp': '2025-10-02 12:29:17.025029', '_unique_id': 'dc29d425038243c4879a43411d9212af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:29:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:29:17.025 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:29:17 np0005466012 nova_compute[192063]: 2025-10-02 12:29:17.174 2 DEBUG nova.compute.manager [req-34fde0c7-030d-4f4c-bf0a-3eb33541ca74 req-97471af0-b84d-4975-9385-47314a9dd0bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-changed-ff0fff12-a049-4dbc-b470-6f1a5654244c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:17 np0005466012 nova_compute[192063]: 2025-10-02 12:29:17.175 2 DEBUG nova.compute.manager [req-34fde0c7-030d-4f4c-bf0a-3eb33541ca74 req-97471af0-b84d-4975-9385-47314a9dd0bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Refreshing instance network info cache due to event network-changed-ff0fff12-a049-4dbc-b470-6f1a5654244c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:29:17 np0005466012 nova_compute[192063]: 2025-10-02 12:29:17.175 2 DEBUG oslo_concurrency.lockutils [req-34fde0c7-030d-4f4c-bf0a-3eb33541ca74 req-97471af0-b84d-4975-9385-47314a9dd0bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:17 np0005466012 nova_compute[192063]: 2025-10-02 12:29:17.176 2 DEBUG oslo_concurrency.lockutils [req-34fde0c7-030d-4f4c-bf0a-3eb33541ca74 req-97471af0-b84d-4975-9385-47314a9dd0bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:17 np0005466012 nova_compute[192063]: 2025-10-02 12:29:17.176 2 DEBUG nova.network.neutron [req-34fde0c7-030d-4f4c-bf0a-3eb33541ca74 req-97471af0-b84d-4975-9385-47314a9dd0bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Refreshing network info cache for port ff0fff12-a049-4dbc-b470-6f1a5654244c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:29:17 np0005466012 nova_compute[192063]: 2025-10-02 12:29:17.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:18 np0005466012 podman[241002]: 2025-10-02 12:29:18.152096805 +0000 UTC m=+0.068418406 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:29:18 np0005466012 podman[241003]: 2025-10-02 12:29:18.17039906 +0000 UTC m=+0.077558799 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, config_id=edpm)
Oct  2 08:29:18 np0005466012 nova_compute[192063]: 2025-10-02 12:29:18.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:18 np0005466012 nova_compute[192063]: 2025-10-02 12:29:18.998 2 DEBUG nova.network.neutron [req-34fde0c7-030d-4f4c-bf0a-3eb33541ca74 req-97471af0-b84d-4975-9385-47314a9dd0bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Updated VIF entry in instance network info cache for port ff0fff12-a049-4dbc-b470-6f1a5654244c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:29:19 np0005466012 nova_compute[192063]: 2025-10-02 12:29:19.000 2 DEBUG nova.network.neutron [req-34fde0c7-030d-4f4c-bf0a-3eb33541ca74 req-97471af0-b84d-4975-9385-47314a9dd0bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Updating instance_info_cache with network_info: [{"id": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "address": "fa:16:3e:9f:b6:e1", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff0fff12-a0", "ovs_interfaceid": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "address": "fa:16:3e:a5:6c:ea", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea5:6cea", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a540c18-fe", "ovs_interfaceid": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:19 np0005466012 nova_compute[192063]: 2025-10-02 12:29:19.024 2 DEBUG oslo_concurrency.lockutils [req-34fde0c7-030d-4f4c-bf0a-3eb33541ca74 req-97471af0-b84d-4975-9385-47314a9dd0bf 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:20 np0005466012 nova_compute[192063]: 2025-10-02 12:29:20.042 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Acquiring lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:20 np0005466012 nova_compute[192063]: 2025-10-02 12:29:20.043 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:20 np0005466012 nova_compute[192063]: 2025-10-02 12:29:20.071 2 DEBUG nova.compute.manager [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:29:20 np0005466012 nova_compute[192063]: 2025-10-02 12:29:20.195 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:20 np0005466012 nova_compute[192063]: 2025-10-02 12:29:20.197 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:20 np0005466012 nova_compute[192063]: 2025-10-02 12:29:20.206 2 DEBUG nova.virt.hardware [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:29:20 np0005466012 nova_compute[192063]: 2025-10-02 12:29:20.207 2 INFO nova.compute.claims [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:29:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:20.251 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:20.253 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:29:20 np0005466012 nova_compute[192063]: 2025-10-02 12:29:20.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:20 np0005466012 nova_compute[192063]: 2025-10-02 12:29:20.414 2 DEBUG nova.compute.provider_tree [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:29:20 np0005466012 nova_compute[192063]: 2025-10-02 12:29:20.548 2 DEBUG nova.scheduler.client.report [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:29:20 np0005466012 nova_compute[192063]: 2025-10-02 12:29:20.704 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.507s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:20 np0005466012 nova_compute[192063]: 2025-10-02 12:29:20.705 2 DEBUG nova.compute.manager [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:29:21 np0005466012 podman[241044]: 2025-10-02 12:29:21.163279806 +0000 UTC m=+0.060971182 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:29:21 np0005466012 podman[241043]: 2025-10-02 12:29:21.169647391 +0000 UTC m=+0.070193296 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:29:21 np0005466012 nova_compute[192063]: 2025-10-02 12:29:21.331 2 DEBUG nova.compute.manager [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:29:21 np0005466012 nova_compute[192063]: 2025-10-02 12:29:21.331 2 DEBUG nova.network.neutron [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:29:21 np0005466012 nova_compute[192063]: 2025-10-02 12:29:21.507 2 INFO nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:29:21 np0005466012 nova_compute[192063]: 2025-10-02 12:29:21.766 2 DEBUG nova.policy [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9b8eff28d6954bb5b8e944809b021cc3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '77d1e2844ab74aec8a73a999db6939a9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:29:21 np0005466012 nova_compute[192063]: 2025-10-02 12:29:21.853 2 DEBUG nova.compute.manager [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:29:22 np0005466012 nova_compute[192063]: 2025-10-02 12:29:22.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:22 np0005466012 nova_compute[192063]: 2025-10-02 12:29:22.865 2 DEBUG nova.compute.manager [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:29:22 np0005466012 nova_compute[192063]: 2025-10-02 12:29:22.869 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:29:22 np0005466012 nova_compute[192063]: 2025-10-02 12:29:22.870 2 INFO nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Creating image(s)#033[00m
Oct  2 08:29:22 np0005466012 nova_compute[192063]: 2025-10-02 12:29:22.871 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Acquiring lock "/var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:22 np0005466012 nova_compute[192063]: 2025-10-02 12:29:22.872 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "/var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:22 np0005466012 nova_compute[192063]: 2025-10-02 12:29:22.873 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "/var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:22 np0005466012 nova_compute[192063]: 2025-10-02 12:29:22.901 2 DEBUG oslo_concurrency.processutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.005 2 DEBUG oslo_concurrency.processutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.008 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.009 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.033 2 DEBUG oslo_concurrency.processutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.118 2 DEBUG oslo_concurrency.processutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.120 2 DEBUG oslo_concurrency.processutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.514 2 DEBUG oslo_concurrency.processutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk 1073741824" returned: 0 in 0.394s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.516 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.507s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.517 2 DEBUG oslo_concurrency.processutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.605 2 DEBUG oslo_concurrency.processutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.607 2 DEBUG nova.virt.disk.api [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Checking if we can resize image /var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.608 2 DEBUG oslo_concurrency.processutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.674 2 DEBUG oslo_concurrency.processutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.675 2 DEBUG nova.virt.disk.api [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Cannot resize image /var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.676 2 DEBUG nova.objects.instance [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lazy-loading 'migration_context' on Instance uuid 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.775 2 DEBUG nova.network.neutron [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Successfully created port: 8763ebde-baaa-4e83-93c4-797a04c46f19 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.804 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.805 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Ensure instance console log exists: /var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.805 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.805 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:23 np0005466012 nova_compute[192063]: 2025-10-02 12:29:23.806 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:25Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9f:b6:e1 10.100.0.4
Oct  2 08:29:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:25Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9f:b6:e1 10.100.0.4
Oct  2 08:29:26 np0005466012 nova_compute[192063]: 2025-10-02 12:29:26.178 2 DEBUG nova.network.neutron [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Successfully updated port: 8763ebde-baaa-4e83-93c4-797a04c46f19 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:29:26 np0005466012 nova_compute[192063]: 2025-10-02 12:29:26.329 2 DEBUG nova.compute.manager [req-df746611-26ef-4692-bd8f-b18dff55b1d8 req-759160f8-2ee1-4778-b922-992d26a988b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Received event network-changed-8763ebde-baaa-4e83-93c4-797a04c46f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:26 np0005466012 nova_compute[192063]: 2025-10-02 12:29:26.329 2 DEBUG nova.compute.manager [req-df746611-26ef-4692-bd8f-b18dff55b1d8 req-759160f8-2ee1-4778-b922-992d26a988b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Refreshing instance network info cache due to event network-changed-8763ebde-baaa-4e83-93c4-797a04c46f19. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:29:26 np0005466012 nova_compute[192063]: 2025-10-02 12:29:26.330 2 DEBUG oslo_concurrency.lockutils [req-df746611-26ef-4692-bd8f-b18dff55b1d8 req-759160f8-2ee1-4778-b922-992d26a988b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:26 np0005466012 nova_compute[192063]: 2025-10-02 12:29:26.330 2 DEBUG oslo_concurrency.lockutils [req-df746611-26ef-4692-bd8f-b18dff55b1d8 req-759160f8-2ee1-4778-b922-992d26a988b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:26 np0005466012 nova_compute[192063]: 2025-10-02 12:29:26.330 2 DEBUG nova.network.neutron [req-df746611-26ef-4692-bd8f-b18dff55b1d8 req-759160f8-2ee1-4778-b922-992d26a988b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Refreshing network info cache for port 8763ebde-baaa-4e83-93c4-797a04c46f19 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:29:26 np0005466012 nova_compute[192063]: 2025-10-02 12:29:26.333 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Acquiring lock "refresh_cache-4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:26 np0005466012 nova_compute[192063]: 2025-10-02 12:29:26.595 2 DEBUG nova.network.neutron [req-df746611-26ef-4692-bd8f-b18dff55b1d8 req-759160f8-2ee1-4778-b922-992d26a988b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:29:27 np0005466012 nova_compute[192063]: 2025-10-02 12:29:27.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:27 np0005466012 nova_compute[192063]: 2025-10-02 12:29:27.700 2 DEBUG nova.network.neutron [req-df746611-26ef-4692-bd8f-b18dff55b1d8 req-759160f8-2ee1-4778-b922-992d26a988b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:27 np0005466012 nova_compute[192063]: 2025-10-02 12:29:27.753 2 DEBUG oslo_concurrency.lockutils [req-df746611-26ef-4692-bd8f-b18dff55b1d8 req-759160f8-2ee1-4778-b922-992d26a988b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:27 np0005466012 nova_compute[192063]: 2025-10-02 12:29:27.754 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Acquired lock "refresh_cache-4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:27 np0005466012 nova_compute[192063]: 2025-10-02 12:29:27.754 2 DEBUG nova.network.neutron [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:29:27 np0005466012 nova_compute[192063]: 2025-10-02 12:29:27.966 2 DEBUG nova.network.neutron [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:29:28 np0005466012 nova_compute[192063]: 2025-10-02 12:29:28.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.050 2 DEBUG nova.network.neutron [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Updating instance_info_cache with network_info: [{"id": "8763ebde-baaa-4e83-93c4-797a04c46f19", "address": "fa:16:3e:18:1c:50", "network": {"id": "1e988aee-a2b2-4397-94b7-24abbf801ed0", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2032824850-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77d1e2844ab74aec8a73a999db6939a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8763ebde-ba", "ovs_interfaceid": "8763ebde-baaa-4e83-93c4-797a04c46f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.080 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Releasing lock "refresh_cache-4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.080 2 DEBUG nova.compute.manager [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Instance network_info: |[{"id": "8763ebde-baaa-4e83-93c4-797a04c46f19", "address": "fa:16:3e:18:1c:50", "network": {"id": "1e988aee-a2b2-4397-94b7-24abbf801ed0", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2032824850-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77d1e2844ab74aec8a73a999db6939a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8763ebde-ba", "ovs_interfaceid": "8763ebde-baaa-4e83-93c4-797a04c46f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.082 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Start _get_guest_xml network_info=[{"id": "8763ebde-baaa-4e83-93c4-797a04c46f19", "address": "fa:16:3e:18:1c:50", "network": {"id": "1e988aee-a2b2-4397-94b7-24abbf801ed0", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2032824850-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77d1e2844ab74aec8a73a999db6939a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8763ebde-ba", "ovs_interfaceid": "8763ebde-baaa-4e83-93c4-797a04c46f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.087 2 WARNING nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.093 2 DEBUG nova.virt.libvirt.host [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.094 2 DEBUG nova.virt.libvirt.host [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.097 2 DEBUG nova.virt.libvirt.host [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.097 2 DEBUG nova.virt.libvirt.host [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.098 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.098 2 DEBUG nova.virt.hardware [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.099 2 DEBUG nova.virt.hardware [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.099 2 DEBUG nova.virt.hardware [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.099 2 DEBUG nova.virt.hardware [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.099 2 DEBUG nova.virt.hardware [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.099 2 DEBUG nova.virt.hardware [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.100 2 DEBUG nova.virt.hardware [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.100 2 DEBUG nova.virt.hardware [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.100 2 DEBUG nova.virt.hardware [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.100 2 DEBUG nova.virt.hardware [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.100 2 DEBUG nova.virt.hardware [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.103 2 DEBUG nova.virt.libvirt.vif [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:29:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1549769307',display_name='tempest-ServerAddressesTestJSON-server-1549769307',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1549769307',id=132,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='77d1e2844ab74aec8a73a999db6939a9',ramdisk_id='',reservation_id='r-ubdhj4cw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-2127548414',owner_user_name='tempest-ServerAddresses
TestJSON-2127548414-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:29:22Z,user_data=None,user_id='9b8eff28d6954bb5b8e944809b021cc3',uuid=4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8763ebde-baaa-4e83-93c4-797a04c46f19", "address": "fa:16:3e:18:1c:50", "network": {"id": "1e988aee-a2b2-4397-94b7-24abbf801ed0", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2032824850-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77d1e2844ab74aec8a73a999db6939a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8763ebde-ba", "ovs_interfaceid": "8763ebde-baaa-4e83-93c4-797a04c46f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.104 2 DEBUG nova.network.os_vif_util [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Converting VIF {"id": "8763ebde-baaa-4e83-93c4-797a04c46f19", "address": "fa:16:3e:18:1c:50", "network": {"id": "1e988aee-a2b2-4397-94b7-24abbf801ed0", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2032824850-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77d1e2844ab74aec8a73a999db6939a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8763ebde-ba", "ovs_interfaceid": "8763ebde-baaa-4e83-93c4-797a04c46f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.104 2 DEBUG nova.network.os_vif_util [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:1c:50,bridge_name='br-int',has_traffic_filtering=True,id=8763ebde-baaa-4e83-93c4-797a04c46f19,network=Network(1e988aee-a2b2-4397-94b7-24abbf801ed0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8763ebde-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.105 2 DEBUG nova.objects.instance [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.158 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  <uuid>4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7</uuid>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  <name>instance-00000084</name>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerAddressesTestJSON-server-1549769307</nova:name>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:29:29</nova:creationTime>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:29:29 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:        <nova:user uuid="9b8eff28d6954bb5b8e944809b021cc3">tempest-ServerAddressesTestJSON-2127548414-project-member</nova:user>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:        <nova:project uuid="77d1e2844ab74aec8a73a999db6939a9">tempest-ServerAddressesTestJSON-2127548414</nova:project>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:        <nova:port uuid="8763ebde-baaa-4e83-93c4-797a04c46f19">
Oct  2 08:29:29 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <entry name="serial">4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7</entry>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <entry name="uuid">4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7</entry>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk.config"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:18:1c:50"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <target dev="tap8763ebde-ba"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/console.log" append="off"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:29:29 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:29:29 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:29:29 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:29:29 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.160 2 DEBUG nova.compute.manager [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Preparing to wait for external event network-vif-plugged-8763ebde-baaa-4e83-93c4-797a04c46f19 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.160 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Acquiring lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.161 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.162 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.163 2 DEBUG nova.virt.libvirt.vif [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:29:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1549769307',display_name='tempest-ServerAddressesTestJSON-server-1549769307',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1549769307',id=132,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='77d1e2844ab74aec8a73a999db6939a9',ramdisk_id='',reservation_id='r-ubdhj4cw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-2127548414',owner_user_name='tempest-ServerAddressesTestJSON-2127548414-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:29:22Z,user_data=None,user_id='9b8eff28d6954bb5b8e944809b021cc3',uuid=4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8763ebde-baaa-4e83-93c4-797a04c46f19", "address": "fa:16:3e:18:1c:50", "network": {"id": "1e988aee-a2b2-4397-94b7-24abbf801ed0", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2032824850-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77d1e2844ab74aec8a73a999db6939a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8763ebde-ba", "ovs_interfaceid": "8763ebde-baaa-4e83-93c4-797a04c46f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.164 2 DEBUG nova.network.os_vif_util [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Converting VIF {"id": "8763ebde-baaa-4e83-93c4-797a04c46f19", "address": "fa:16:3e:18:1c:50", "network": {"id": "1e988aee-a2b2-4397-94b7-24abbf801ed0", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2032824850-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77d1e2844ab74aec8a73a999db6939a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8763ebde-ba", "ovs_interfaceid": "8763ebde-baaa-4e83-93c4-797a04c46f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.165 2 DEBUG nova.network.os_vif_util [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:1c:50,bridge_name='br-int',has_traffic_filtering=True,id=8763ebde-baaa-4e83-93c4-797a04c46f19,network=Network(1e988aee-a2b2-4397-94b7-24abbf801ed0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8763ebde-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.166 2 DEBUG os_vif [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:1c:50,bridge_name='br-int',has_traffic_filtering=True,id=8763ebde-baaa-4e83-93c4-797a04c46f19,network=Network(1e988aee-a2b2-4397-94b7-24abbf801ed0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8763ebde-ba') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.168 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.169 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.174 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8763ebde-ba, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.174 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8763ebde-ba, col_values=(('external_ids', {'iface-id': '8763ebde-baaa-4e83-93c4-797a04c46f19', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:18:1c:50', 'vm-uuid': '4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:29 np0005466012 NetworkManager[51207]: <info>  [1759408169.2273] manager: (tap8763ebde-ba): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.236 2 INFO os_vif [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:1c:50,bridge_name='br-int',has_traffic_filtering=True,id=8763ebde-baaa-4e83-93c4-797a04c46f19,network=Network(1e988aee-a2b2-4397-94b7-24abbf801ed0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8763ebde-ba')#033[00m
Oct  2 08:29:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:29.255 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.368 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.368 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.368 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] No VIF found with MAC fa:16:3e:18:1c:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.369 2 INFO nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Using config drive#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.915 2 INFO nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Creating config drive at /var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk.config#033[00m
Oct  2 08:29:29 np0005466012 nova_compute[192063]: 2025-10-02 12:29:29.924 2 DEBUG oslo_concurrency.processutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxo6k565v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:29:30 np0005466012 nova_compute[192063]: 2025-10-02 12:29:30.065 2 DEBUG oslo_concurrency.processutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxo6k565v" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:29:30 np0005466012 kernel: tap8763ebde-ba: entered promiscuous mode
Oct  2 08:29:30 np0005466012 NetworkManager[51207]: <info>  [1759408170.1399] manager: (tap8763ebde-ba): new Tun device (/org/freedesktop/NetworkManager/Devices/230)
Oct  2 08:29:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:30Z|00503|binding|INFO|Claiming lport 8763ebde-baaa-4e83-93c4-797a04c46f19 for this chassis.
Oct  2 08:29:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:30Z|00504|binding|INFO|8763ebde-baaa-4e83-93c4-797a04c46f19: Claiming fa:16:3e:18:1c:50 10.100.0.13
Oct  2 08:29:30 np0005466012 nova_compute[192063]: 2025-10-02 12:29:30.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.154 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:1c:50 10.100.0.13'], port_security=['fa:16:3e:18:1c:50 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e988aee-a2b2-4397-94b7-24abbf801ed0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '77d1e2844ab74aec8a73a999db6939a9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd224aacb-8a54-4684-ab70-a87b79ba9529', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d24a4df6-9f71-4eba-adf5-cef03523ae79, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=8763ebde-baaa-4e83-93c4-797a04c46f19) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.155 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 8763ebde-baaa-4e83-93c4-797a04c46f19 in datapath 1e988aee-a2b2-4397-94b7-24abbf801ed0 bound to our chassis#033[00m
Oct  2 08:29:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:30Z|00505|binding|INFO|Setting lport 8763ebde-baaa-4e83-93c4-797a04c46f19 ovn-installed in OVS
Oct  2 08:29:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:30Z|00506|binding|INFO|Setting lport 8763ebde-baaa-4e83-93c4-797a04c46f19 up in Southbound
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.157 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1e988aee-a2b2-4397-94b7-24abbf801ed0#033[00m
Oct  2 08:29:30 np0005466012 nova_compute[192063]: 2025-10-02 12:29:30.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:30 np0005466012 nova_compute[192063]: 2025-10-02 12:29:30.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.172 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[094a76ef-51a3-4f56-8b82-b1df6f515ec6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.173 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1e988aee-a1 in ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.174 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1e988aee-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.175 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e13d025a-b973-4e54-9d2d-c09898cb5a61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 systemd-udevd[241137]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.175 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0db3584f-f04e-463b-8136-ed9a7dbfe0dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.189 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[b550bf7e-1aaa-4b9e-833e-9427190931d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 NetworkManager[51207]: <info>  [1759408170.1929] device (tap8763ebde-ba): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:29:30 np0005466012 NetworkManager[51207]: <info>  [1759408170.1935] device (tap8763ebde-ba): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:29:30 np0005466012 systemd-machined[152114]: New machine qemu-61-instance-00000084.
Oct  2 08:29:30 np0005466012 systemd[1]: Started Virtual Machine qemu-61-instance-00000084.
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.222 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[64695aef-33ce-4fde-93cc-949c27d8b897]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.266 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[8ba6f056-5778-4070-a791-ee849ca6f109]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 NetworkManager[51207]: <info>  [1759408170.2726] manager: (tap1e988aee-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/231)
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.272 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a6da78f8-0fd7-4210-80ef-4b46a31e8b37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.314 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[008167f5-4ec0-4b32-b7e5-3c724bb4373f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.317 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[699db6be-66eb-4a88-8860-bc053e09f9d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 NetworkManager[51207]: <info>  [1759408170.3460] device (tap1e988aee-a0): carrier: link connected
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.352 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[fbb8647e-c9cc-46e0-96a6-313525241341]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.384 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[68ad1895-762f-4ef9-b26b-66c4d4f2a47e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1e988aee-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:43:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616395, 'reachable_time': 43388, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241170, 'error': None, 'target': 'ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.408 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[974bf223-4251-40fb-9ad6-73ae3ca651f5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:43b5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 616395, 'tstamp': 616395}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241171, 'error': None, 'target': 'ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.437 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[32ec1ac2-6199-4596-bd51-dbd507aafaf9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1e988aee-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:43:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616395, 'reachable_time': 43388, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241173, 'error': None, 'target': 'ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.476 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9cd75587-bcf7-40f1-bae8-6e68966cba45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.549 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[30682d16-4756-4bac-a66c-b158a18b6094]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.551 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e988aee-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.552 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.552 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1e988aee-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:30 np0005466012 NetworkManager[51207]: <info>  [1759408170.5551] manager: (tap1e988aee-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Oct  2 08:29:30 np0005466012 nova_compute[192063]: 2025-10-02 12:29:30.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:30 np0005466012 kernel: tap1e988aee-a0: entered promiscuous mode
Oct  2 08:29:30 np0005466012 nova_compute[192063]: 2025-10-02 12:29:30.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.558 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1e988aee-a0, col_values=(('external_ids', {'iface-id': '41a11db5-3fd0-4e46-8c80-eb5874b85d36'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:30Z|00507|binding|INFO|Releasing lport 41a11db5-3fd0-4e46-8c80-eb5874b85d36 from this chassis (sb_readonly=0)
Oct  2 08:29:30 np0005466012 nova_compute[192063]: 2025-10-02 12:29:30.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:30 np0005466012 nova_compute[192063]: 2025-10-02 12:29:30.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.573 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1e988aee-a2b2-4397-94b7-24abbf801ed0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1e988aee-a2b2-4397-94b7-24abbf801ed0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.574 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9fbacfd0-168d-45cc-80f9-df86ce73b2d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.575 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-1e988aee-a2b2-4397-94b7-24abbf801ed0
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/1e988aee-a2b2-4397-94b7-24abbf801ed0.pid.haproxy
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 1e988aee-a2b2-4397-94b7-24abbf801ed0
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:29:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:30.576 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0', 'env', 'PROCESS_TAG=haproxy-1e988aee-a2b2-4397-94b7-24abbf801ed0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1e988aee-a2b2-4397-94b7-24abbf801ed0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:29:30 np0005466012 nova_compute[192063]: 2025-10-02 12:29:30.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:30 np0005466012 nova_compute[192063]: 2025-10-02 12:29:30.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:29:30 np0005466012 nova_compute[192063]: 2025-10-02 12:29:30.912 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408170.9114594, 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:30 np0005466012 nova_compute[192063]: 2025-10-02 12:29:30.913 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] VM Started (Lifecycle Event)#033[00m
Oct  2 08:29:31 np0005466012 podman[241209]: 2025-10-02 12:29:30.926498663 +0000 UTC m=+0.025086362 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:29:31 np0005466012 podman[241209]: 2025-10-02 12:29:31.153985104 +0000 UTC m=+0.252572783 container create 974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.174 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.178 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408170.911742, 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.178 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.209 2 DEBUG nova.compute.manager [req-9f8de2c0-66c7-4827-b896-17a4a04bff86 req-94877d25-7386-47b5-ac03-212e68da4a27 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Received event network-vif-plugged-8763ebde-baaa-4e83-93c4-797a04c46f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.210 2 DEBUG oslo_concurrency.lockutils [req-9f8de2c0-66c7-4827-b896-17a4a04bff86 req-94877d25-7386-47b5-ac03-212e68da4a27 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.210 2 DEBUG oslo_concurrency.lockutils [req-9f8de2c0-66c7-4827-b896-17a4a04bff86 req-94877d25-7386-47b5-ac03-212e68da4a27 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.210 2 DEBUG oslo_concurrency.lockutils [req-9f8de2c0-66c7-4827-b896-17a4a04bff86 req-94877d25-7386-47b5-ac03-212e68da4a27 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.210 2 DEBUG nova.compute.manager [req-9f8de2c0-66c7-4827-b896-17a4a04bff86 req-94877d25-7386-47b5-ac03-212e68da4a27 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Processing event network-vif-plugged-8763ebde-baaa-4e83-93c4-797a04c46f19 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.211 2 DEBUG nova.compute.manager [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.215 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.219 2 INFO nova.virt.libvirt.driver [-] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Instance spawned successfully.#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.220 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.310 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.313 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408171.2149537, 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.314 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.573 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.573 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.574 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.574 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.574 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.575 2 DEBUG nova.virt.libvirt.driver [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:29:31 np0005466012 systemd[1]: Started libpod-conmon-974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29.scope.
Oct  2 08:29:31 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:29:31 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dec1105bb6585aa52d11ea2b10d567a33cbd35cce3b6ef4fca879282fcb8eb4f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.714 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.720 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:29:31 np0005466012 podman[241209]: 2025-10-02 12:29:31.823708168 +0000 UTC m=+0.922295867 container init 974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  2 08:29:31 np0005466012 podman[241209]: 2025-10-02 12:29:31.830251048 +0000 UTC m=+0.928838727 container start 974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  2 08:29:31 np0005466012 neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0[241222]: [NOTICE]   (241226) : New worker (241228) forked
Oct  2 08:29:31 np0005466012 neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0[241222]: [NOTICE]   (241226) : Loading success.
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.863 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.891 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:31 np0005466012 nova_compute[192063]: 2025-10-02 12:29:31.892 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:32 np0005466012 nova_compute[192063]: 2025-10-02 12:29:32.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:32 np0005466012 nova_compute[192063]: 2025-10-02 12:29:32.370 2 INFO nova.compute.manager [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Took 9.50 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:29:32 np0005466012 nova_compute[192063]: 2025-10-02 12:29:32.371 2 DEBUG nova.compute.manager [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:33 np0005466012 nova_compute[192063]: 2025-10-02 12:29:33.616 2 INFO nova.compute.manager [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Took 13.47 seconds to build instance.#033[00m
Oct  2 08:29:34 np0005466012 nova_compute[192063]: 2025-10-02 12:29:34.051 2 DEBUG oslo_concurrency.lockutils [None req-82873a9f-f7db-476c-a768-f6a50f8fc211 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:34 np0005466012 nova_compute[192063]: 2025-10-02 12:29:34.070 2 DEBUG nova.compute.manager [req-63737b20-0cd3-4dc0-a5f0-272d8615ade1 req-bf87f95d-abce-46ef-a02f-aa8842fdde39 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Received event network-vif-plugged-8763ebde-baaa-4e83-93c4-797a04c46f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:34 np0005466012 nova_compute[192063]: 2025-10-02 12:29:34.071 2 DEBUG oslo_concurrency.lockutils [req-63737b20-0cd3-4dc0-a5f0-272d8615ade1 req-bf87f95d-abce-46ef-a02f-aa8842fdde39 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:34 np0005466012 nova_compute[192063]: 2025-10-02 12:29:34.071 2 DEBUG oslo_concurrency.lockutils [req-63737b20-0cd3-4dc0-a5f0-272d8615ade1 req-bf87f95d-abce-46ef-a02f-aa8842fdde39 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:34 np0005466012 nova_compute[192063]: 2025-10-02 12:29:34.071 2 DEBUG oslo_concurrency.lockutils [req-63737b20-0cd3-4dc0-a5f0-272d8615ade1 req-bf87f95d-abce-46ef-a02f-aa8842fdde39 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:34 np0005466012 nova_compute[192063]: 2025-10-02 12:29:34.072 2 DEBUG nova.compute.manager [req-63737b20-0cd3-4dc0-a5f0-272d8615ade1 req-bf87f95d-abce-46ef-a02f-aa8842fdde39 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] No waiting events found dispatching network-vif-plugged-8763ebde-baaa-4e83-93c4-797a04c46f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:34 np0005466012 nova_compute[192063]: 2025-10-02 12:29:34.072 2 WARNING nova.compute.manager [req-63737b20-0cd3-4dc0-a5f0-272d8615ade1 req-bf87f95d-abce-46ef-a02f-aa8842fdde39 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Received unexpected event network-vif-plugged-8763ebde-baaa-4e83-93c4-797a04c46f19 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:29:34 np0005466012 nova_compute[192063]: 2025-10-02 12:29:34.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:34 np0005466012 nova_compute[192063]: 2025-10-02 12:29:34.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.080 2 DEBUG oslo_concurrency.lockutils [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Acquiring lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.081 2 DEBUG oslo_concurrency.lockutils [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.081 2 DEBUG oslo_concurrency.lockutils [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Acquiring lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.081 2 DEBUG oslo_concurrency.lockutils [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.081 2 DEBUG oslo_concurrency.lockutils [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.106 2 INFO nova.compute.manager [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Terminating instance#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.125 2 DEBUG nova.compute.manager [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:29:36 np0005466012 kernel: tap8763ebde-ba (unregistering): left promiscuous mode
Oct  2 08:29:36 np0005466012 NetworkManager[51207]: <info>  [1759408176.1444] device (tap8763ebde-ba): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:29:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:36Z|00508|binding|INFO|Releasing lport 8763ebde-baaa-4e83-93c4-797a04c46f19 from this chassis (sb_readonly=0)
Oct  2 08:29:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:36Z|00509|binding|INFO|Setting lport 8763ebde-baaa-4e83-93c4-797a04c46f19 down in Southbound
Oct  2 08:29:36 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:36Z|00510|binding|INFO|Removing iface tap8763ebde-ba ovn-installed in OVS
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.167 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:1c:50 10.100.0.13'], port_security=['fa:16:3e:18:1c:50 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e988aee-a2b2-4397-94b7-24abbf801ed0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '77d1e2844ab74aec8a73a999db6939a9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd224aacb-8a54-4684-ab70-a87b79ba9529', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d24a4df6-9f71-4eba-adf5-cef03523ae79, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=8763ebde-baaa-4e83-93c4-797a04c46f19) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.168 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 8763ebde-baaa-4e83-93c4-797a04c46f19 in datapath 1e988aee-a2b2-4397-94b7-24abbf801ed0 unbound from our chassis#033[00m
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.170 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1e988aee-a2b2-4397-94b7-24abbf801ed0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.171 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f493e3-1e33-49c9-9db1-afd1110531df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.172 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0 namespace which is not needed anymore#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:36 np0005466012 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000084.scope: Deactivated successfully.
Oct  2 08:29:36 np0005466012 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000084.scope: Consumed 5.584s CPU time.
Oct  2 08:29:36 np0005466012 systemd-machined[152114]: Machine qemu-61-instance-00000084 terminated.
Oct  2 08:29:36 np0005466012 podman[241239]: 2025-10-02 12:29:36.278007361 +0000 UTC m=+0.092022558 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:29:36 np0005466012 neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0[241222]: [NOTICE]   (241226) : haproxy version is 2.8.14-c23fe91
Oct  2 08:29:36 np0005466012 neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0[241222]: [NOTICE]   (241226) : path to executable is /usr/sbin/haproxy
Oct  2 08:29:36 np0005466012 neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0[241222]: [WARNING]  (241226) : Exiting Master process...
Oct  2 08:29:36 np0005466012 neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0[241222]: [ALERT]    (241226) : Current worker (241228) exited with code 143 (Terminated)
Oct  2 08:29:36 np0005466012 neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0[241222]: [WARNING]  (241226) : All workers exited. Exiting... (0)
Oct  2 08:29:36 np0005466012 systemd[1]: libpod-974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29.scope: Deactivated successfully.
Oct  2 08:29:36 np0005466012 podman[241280]: 2025-10-02 12:29:36.363935389 +0000 UTC m=+0.077742133 container died 974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.393 2 INFO nova.virt.libvirt.driver [-] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Instance destroyed successfully.#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.393 2 DEBUG nova.objects.instance [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lazy-loading 'resources' on Instance uuid 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:36 np0005466012 systemd[1]: var-lib-containers-storage-overlay-dec1105bb6585aa52d11ea2b10d567a33cbd35cce3b6ef4fca879282fcb8eb4f-merged.mount: Deactivated successfully.
Oct  2 08:29:36 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29-userdata-shm.mount: Deactivated successfully.
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.423 2 DEBUG nova.virt.libvirt.vif [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:29:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1549769307',display_name='tempest-ServerAddressesTestJSON-server-1549769307',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1549769307',id=132,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:29:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='77d1e2844ab74aec8a73a999db6939a9',ramdisk_id='',reservation_id='r-ubdhj4cw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesTestJSON-2127548414',owner_user_name='tempest-ServerAddressesTestJSON-2127548414-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:29:32Z,user_data=None,user_id='9b8eff28d6954bb5b8e944809b021cc3',uuid=4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8763ebde-baaa-4e83-93c4-797a04c46f19", "address": "fa:16:3e:18:1c:50", "network": {"id": "1e988aee-a2b2-4397-94b7-24abbf801ed0", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2032824850-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77d1e2844ab74aec8a73a999db6939a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8763ebde-ba", "ovs_interfaceid": "8763ebde-baaa-4e83-93c4-797a04c46f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.424 2 DEBUG nova.network.os_vif_util [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Converting VIF {"id": "8763ebde-baaa-4e83-93c4-797a04c46f19", "address": "fa:16:3e:18:1c:50", "network": {"id": "1e988aee-a2b2-4397-94b7-24abbf801ed0", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2032824850-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77d1e2844ab74aec8a73a999db6939a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8763ebde-ba", "ovs_interfaceid": "8763ebde-baaa-4e83-93c4-797a04c46f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.424 2 DEBUG nova.network.os_vif_util [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:1c:50,bridge_name='br-int',has_traffic_filtering=True,id=8763ebde-baaa-4e83-93c4-797a04c46f19,network=Network(1e988aee-a2b2-4397-94b7-24abbf801ed0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8763ebde-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.425 2 DEBUG os_vif [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:1c:50,bridge_name='br-int',has_traffic_filtering=True,id=8763ebde-baaa-4e83-93c4-797a04c46f19,network=Network(1e988aee-a2b2-4397-94b7-24abbf801ed0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8763ebde-ba') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.426 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8763ebde-ba, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.430 2 INFO os_vif [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:1c:50,bridge_name='br-int',has_traffic_filtering=True,id=8763ebde-baaa-4e83-93c4-797a04c46f19,network=Network(1e988aee-a2b2-4397-94b7-24abbf801ed0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8763ebde-ba')#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.431 2 INFO nova.virt.libvirt.driver [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Deleting instance files /var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7_del#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.432 2 INFO nova.virt.libvirt.driver [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Deletion of /var/lib/nova/instances/4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7_del complete#033[00m
Oct  2 08:29:36 np0005466012 podman[241280]: 2025-10-02 12:29:36.446504006 +0000 UTC m=+0.160310750 container cleanup 974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:29:36 np0005466012 systemd[1]: libpod-conmon-974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29.scope: Deactivated successfully.
Oct  2 08:29:36 np0005466012 podman[241281]: 2025-10-02 12:29:36.472878333 +0000 UTC m=+0.165055221 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:29:36 np0005466012 podman[241349]: 2025-10-02 12:29:36.538746469 +0000 UTC m=+0.072985103 container remove 974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.546 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b91b2c-9b67-479c-acaa-c15f595c0d12]: (4, ('Thu Oct  2 12:29:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0 (974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29)\n974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29\nThu Oct  2 12:29:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0 (974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29)\n974c14da9988073f5eca70299bbfcdc387c468f173a2fbd28bab4e887257ba29\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.548 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5925c5ca-bad3-4eee-8f5c-0bc1a4218865]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.549 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e988aee-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:36 np0005466012 kernel: tap1e988aee-a0: left promiscuous mode
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.565 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[824c6ee1-11d2-4e1a-8f1c-a73636a257e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.595 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[10a3d8ec-534d-42ee-9eac-927777c58d93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.596 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[04d53e98-50fe-42ef-af53-991f0d96a616]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.612 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[63928ed8-be58-4280-99d1-2fe9bdcff82b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616386, 'reachable_time': 21249, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241368, 'error': None, 'target': 'ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:36 np0005466012 systemd[1]: run-netns-ovnmeta\x2d1e988aee\x2da2b2\x2d4397\x2d94b7\x2d24abbf801ed0.mount: Deactivated successfully.
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.616 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1e988aee-a2b2-4397-94b7-24abbf801ed0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:29:36 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:36.616 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[43c7c58c-c4f1-407b-8c47-23ad993043a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.623 2 INFO nova.compute.manager [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Took 0.50 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.624 2 DEBUG oslo.service.loopingcall [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.624 2 DEBUG nova.compute.manager [-] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.625 2 DEBUG nova.network.neutron [-] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.761 2 DEBUG nova.compute.manager [req-803e0dc7-9c6e-4198-8209-b128f2ba480c req-cc016e11-5b5d-425a-82cc-deea41fb4447 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Received event network-vif-unplugged-8763ebde-baaa-4e83-93c4-797a04c46f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.761 2 DEBUG oslo_concurrency.lockutils [req-803e0dc7-9c6e-4198-8209-b128f2ba480c req-cc016e11-5b5d-425a-82cc-deea41fb4447 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.761 2 DEBUG oslo_concurrency.lockutils [req-803e0dc7-9c6e-4198-8209-b128f2ba480c req-cc016e11-5b5d-425a-82cc-deea41fb4447 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.762 2 DEBUG oslo_concurrency.lockutils [req-803e0dc7-9c6e-4198-8209-b128f2ba480c req-cc016e11-5b5d-425a-82cc-deea41fb4447 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.762 2 DEBUG nova.compute.manager [req-803e0dc7-9c6e-4198-8209-b128f2ba480c req-cc016e11-5b5d-425a-82cc-deea41fb4447 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] No waiting events found dispatching network-vif-unplugged-8763ebde-baaa-4e83-93c4-797a04c46f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.762 2 DEBUG nova.compute.manager [req-803e0dc7-9c6e-4198-8209-b128f2ba480c req-cc016e11-5b5d-425a-82cc-deea41fb4447 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Received event network-vif-unplugged-8763ebde-baaa-4e83-93c4-797a04c46f19 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:29:36 np0005466012 nova_compute[192063]: 2025-10-02 12:29:36.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:37 np0005466012 nova_compute[192063]: 2025-10-02 12:29:37.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.203 2 DEBUG nova.network.neutron [-] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.220 2 DEBUG nova.compute.manager [req-f68a5227-e465-40d6-8aab-08d74b1ca789 req-748d34ad-88fa-4e7d-b1d8-4399fd3b3612 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-changed-ff0fff12-a049-4dbc-b470-6f1a5654244c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.220 2 DEBUG nova.compute.manager [req-f68a5227-e465-40d6-8aab-08d74b1ca789 req-748d34ad-88fa-4e7d-b1d8-4399fd3b3612 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Refreshing instance network info cache due to event network-changed-ff0fff12-a049-4dbc-b470-6f1a5654244c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.220 2 DEBUG oslo_concurrency.lockutils [req-f68a5227-e465-40d6-8aab-08d74b1ca789 req-748d34ad-88fa-4e7d-b1d8-4399fd3b3612 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.221 2 DEBUG oslo_concurrency.lockutils [req-f68a5227-e465-40d6-8aab-08d74b1ca789 req-748d34ad-88fa-4e7d-b1d8-4399fd3b3612 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.221 2 DEBUG nova.network.neutron [req-f68a5227-e465-40d6-8aab-08d74b1ca789 req-748d34ad-88fa-4e7d-b1d8-4399fd3b3612 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Refreshing network info cache for port ff0fff12-a049-4dbc-b470-6f1a5654244c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.249 2 INFO nova.compute.manager [-] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Took 1.62 seconds to deallocate network for instance.#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.362 2 DEBUG oslo_concurrency.lockutils [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.362 2 DEBUG oslo_concurrency.lockutils [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.461 2 DEBUG oslo_concurrency.lockutils [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.461 2 DEBUG oslo_concurrency.lockutils [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.461 2 DEBUG oslo_concurrency.lockutils [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.462 2 DEBUG oslo_concurrency.lockutils [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.462 2 DEBUG oslo_concurrency.lockutils [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.480 2 INFO nova.compute.manager [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Terminating instance#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.496 2 DEBUG nova.compute.provider_tree [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.499 2 DEBUG nova.compute.manager [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.525 2 DEBUG nova.scheduler.client.report [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:29:38 np0005466012 kernel: tapff0fff12-a0 (unregistering): left promiscuous mode
Oct  2 08:29:38 np0005466012 NetworkManager[51207]: <info>  [1759408178.5395] device (tapff0fff12-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:29:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:38Z|00511|binding|INFO|Releasing lport ff0fff12-a049-4dbc-b470-6f1a5654244c from this chassis (sb_readonly=0)
Oct  2 08:29:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:38Z|00512|binding|INFO|Setting lport ff0fff12-a049-4dbc-b470-6f1a5654244c down in Southbound
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:38Z|00513|binding|INFO|Removing iface tapff0fff12-a0 ovn-installed in OVS
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.557 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:b6:e1 10.100.0.4'], port_security=['fa:16:3e:9f:b6:e1 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1049901a-232c-40d0-9fe6-646c9d087089', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a731a9f5-9e55-440a-a95e-a9a819598de7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=ff0fff12-a049-4dbc-b470-6f1a5654244c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.559 103246 INFO neutron.agent.ovn.metadata.agent [-] Port ff0fff12-a049-4dbc-b470-6f1a5654244c in datapath d68eafa7-b35f-4bd9-ba11-e28a73bc7849 unbound from our chassis#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.560 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d68eafa7-b35f-4bd9-ba11-e28a73bc7849, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.561 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[eaed42f9-7165-4709-8bab-df6156aa193a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.561 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849 namespace which is not needed anymore#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.573 2 DEBUG oslo_concurrency.lockutils [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:38 np0005466012 kernel: tap7a540c18-fe (unregistering): left promiscuous mode
Oct  2 08:29:38 np0005466012 NetworkManager[51207]: <info>  [1759408178.5916] device (tap7a540c18-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:29:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:38Z|00514|binding|INFO|Releasing lport 7a540c18-fe1e-41b7-b421-d10a6e2d3f73 from this chassis (sb_readonly=0)
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:38Z|00515|binding|INFO|Setting lport 7a540c18-fe1e-41b7-b421-d10a6e2d3f73 down in Southbound
Oct  2 08:29:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:29:38Z|00516|binding|INFO|Removing iface tap7a540c18-fe ovn-installed in OVS
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.607 2 INFO nova.scheduler.client.report [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Deleted allocations for instance 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.615 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:6c:ea 2001:db8::f816:3eff:fea5:6cea'], port_security=['fa:16:3e:a5:6c:ea 2001:db8::f816:3eff:fea5:6cea'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fea5:6cea/64', 'neutron:device_id': '619c3e72-5be3-417a-8ae9-25ef4b63a50d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85092873-751b-414a-a9a1-112c2e61cb13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1049901a-232c-40d0-9fe6-646c9d087089', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b1cb8f94-a0b5-458e-a15a-45916ae4369f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=7a540c18-fe1e-41b7-b421-d10a6e2d3f73) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000081.scope: Deactivated successfully.
Oct  2 08:29:38 np0005466012 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000081.scope: Consumed 15.283s CPU time.
Oct  2 08:29:38 np0005466012 systemd-machined[152114]: Machine qemu-60-instance-00000081 terminated.
Oct  2 08:29:38 np0005466012 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[240908]: [NOTICE]   (240918) : haproxy version is 2.8.14-c23fe91
Oct  2 08:29:38 np0005466012 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[240908]: [NOTICE]   (240918) : path to executable is /usr/sbin/haproxy
Oct  2 08:29:38 np0005466012 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[240908]: [WARNING]  (240918) : Exiting Master process...
Oct  2 08:29:38 np0005466012 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[240908]: [ALERT]    (240918) : Current worker (240920) exited with code 143 (Terminated)
Oct  2 08:29:38 np0005466012 neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849[240908]: [WARNING]  (240918) : All workers exited. Exiting... (0)
Oct  2 08:29:38 np0005466012 systemd[1]: libpod-4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86.scope: Deactivated successfully.
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.714 2 DEBUG oslo_concurrency.lockutils [None req-feb5b5ae-6327-4698-bffc-fef3d3157cfc 9b8eff28d6954bb5b8e944809b021cc3 77d1e2844ab74aec8a73a999db6939a9 - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:38 np0005466012 podman[241398]: 2025-10-02 12:29:38.720579476 +0000 UTC m=+0.047920533 container died 4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:29:38 np0005466012 NetworkManager[51207]: <info>  [1759408178.7305] manager: (tap7a540c18-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/233)
Oct  2 08:29:38 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86-userdata-shm.mount: Deactivated successfully.
Oct  2 08:29:38 np0005466012 systemd[1]: var-lib-containers-storage-overlay-e60832e9edb6745632176569bbc436f8e8dd0cc9790dc7596f2fddd6acc50ca1-merged.mount: Deactivated successfully.
Oct  2 08:29:38 np0005466012 podman[241398]: 2025-10-02 12:29:38.753627207 +0000 UTC m=+0.080968284 container cleanup 4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:29:38 np0005466012 systemd[1]: libpod-conmon-4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86.scope: Deactivated successfully.
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.784 2 INFO nova.virt.libvirt.driver [-] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Instance destroyed successfully.#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.785 2 DEBUG nova.objects.instance [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'resources' on Instance uuid 619c3e72-5be3-417a-8ae9-25ef4b63a50d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.799 2 DEBUG nova.virt.libvirt.vif [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:28:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-743984661',display_name='tempest-TestGettingAddress-server-743984661',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-743984661',id=129,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNuPkj9Dxk75qecKTcAv5uyglWO7VJAWxaaepcSp1a1k0dudaz78GBONfraj5VI3+kTW1O5IZNXJG+3u7RZIiYOa8MpOz6jMbMxt8apN5oZQHFIm8Zmccy1CmZp9PKlQBg==',key_name='tempest-TestGettingAddress-1196931169',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:29:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-bfop6dze',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:29:11Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=619c3e72-5be3-417a-8ae9-25ef4b63a50d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "address": "fa:16:3e:9f:b6:e1", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff0fff12-a0", "ovs_interfaceid": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.799 2 DEBUG nova.network.os_vif_util [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "address": "fa:16:3e:9f:b6:e1", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff0fff12-a0", "ovs_interfaceid": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.800 2 DEBUG nova.network.os_vif_util [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:b6:e1,bridge_name='br-int',has_traffic_filtering=True,id=ff0fff12-a049-4dbc-b470-6f1a5654244c,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff0fff12-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.801 2 DEBUG os_vif [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:b6:e1,bridge_name='br-int',has_traffic_filtering=True,id=ff0fff12-a049-4dbc-b470-6f1a5654244c,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff0fff12-a0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.803 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff0fff12-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.811 2 INFO os_vif [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:b6:e1,bridge_name='br-int',has_traffic_filtering=True,id=ff0fff12-a049-4dbc-b470-6f1a5654244c,network=Network(d68eafa7-b35f-4bd9-ba11-e28a73bc7849),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff0fff12-a0')#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.812 2 DEBUG nova.virt.libvirt.vif [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:28:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-743984661',display_name='tempest-TestGettingAddress-server-743984661',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-743984661',id=129,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNuPkj9Dxk75qecKTcAv5uyglWO7VJAWxaaepcSp1a1k0dudaz78GBONfraj5VI3+kTW1O5IZNXJG+3u7RZIiYOa8MpOz6jMbMxt8apN5oZQHFIm8Zmccy1CmZp9PKlQBg==',key_name='tempest-TestGettingAddress-1196931169',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:29:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-bfop6dze',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:29:11Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=619c3e72-5be3-417a-8ae9-25ef4b63a50d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "address": "fa:16:3e:a5:6c:ea", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea5:6cea", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a540c18-fe", "ovs_interfaceid": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.812 2 DEBUG nova.network.os_vif_util [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "address": "fa:16:3e:a5:6c:ea", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea5:6cea", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a540c18-fe", "ovs_interfaceid": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.813 2 DEBUG nova.network.os_vif_util [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:6c:ea,bridge_name='br-int',has_traffic_filtering=True,id=7a540c18-fe1e-41b7-b421-d10a6e2d3f73,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a540c18-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.814 2 DEBUG os_vif [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:6c:ea,bridge_name='br-int',has_traffic_filtering=True,id=7a540c18-fe1e-41b7-b421-d10a6e2d3f73,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a540c18-fe') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.815 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a540c18-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.819 2 INFO os_vif [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:6c:ea,bridge_name='br-int',has_traffic_filtering=True,id=7a540c18-fe1e-41b7-b421-d10a6e2d3f73,network=Network(85092873-751b-414a-a9a1-112c2e61cb13),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a540c18-fe')#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.820 2 INFO nova.virt.libvirt.driver [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Deleting instance files /var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d_del#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.820 2 INFO nova.virt.libvirt.driver [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Deletion of /var/lib/nova/instances/619c3e72-5be3-417a-8ae9-25ef4b63a50d_del complete#033[00m
Oct  2 08:29:38 np0005466012 podman[241449]: 2025-10-02 12:29:38.824706417 +0000 UTC m=+0.043441899 container remove 4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.831 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b7e012-22c1-4003-b1e5-6e92e4604435]: (4, ('Thu Oct  2 12:29:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849 (4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86)\n4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86\nThu Oct  2 12:29:38 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849 (4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86)\n4d01b910026e9a912760301527689148b1cce9feaddbaa2ed9a61d26d52f1e86\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.832 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a11d4333-df2d-4511-ab50-571ca5352e42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.833 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd68eafa7-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 kernel: tapd68eafa7-b0: left promiscuous mode
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.849 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5c1a06c6-bb5a-4f26-b922-a23d7071b0a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.893 2 DEBUG nova.compute.manager [req-7aa52057-0e42-4f01-96ae-c69ccd853537 req-af57eebb-4965-45cc-a937-8a5dbd8b7c51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Received event network-vif-plugged-8763ebde-baaa-4e83-93c4-797a04c46f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.894 2 DEBUG oslo_concurrency.lockutils [req-7aa52057-0e42-4f01-96ae-c69ccd853537 req-af57eebb-4965-45cc-a937-8a5dbd8b7c51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.894 2 DEBUG oslo_concurrency.lockutils [req-7aa52057-0e42-4f01-96ae-c69ccd853537 req-af57eebb-4965-45cc-a937-8a5dbd8b7c51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.894 2 DEBUG oslo_concurrency.lockutils [req-7aa52057-0e42-4f01-96ae-c69ccd853537 req-af57eebb-4965-45cc-a937-8a5dbd8b7c51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.894 2 DEBUG nova.compute.manager [req-7aa52057-0e42-4f01-96ae-c69ccd853537 req-af57eebb-4965-45cc-a937-8a5dbd8b7c51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] No waiting events found dispatching network-vif-plugged-8763ebde-baaa-4e83-93c4-797a04c46f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.894 2 WARNING nova.compute.manager [req-7aa52057-0e42-4f01-96ae-c69ccd853537 req-af57eebb-4965-45cc-a937-8a5dbd8b7c51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Received unexpected event network-vif-plugged-8763ebde-baaa-4e83-93c4-797a04c46f19 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.895 2 DEBUG nova.compute.manager [req-7aa52057-0e42-4f01-96ae-c69ccd853537 req-af57eebb-4965-45cc-a937-8a5dbd8b7c51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-vif-unplugged-ff0fff12-a049-4dbc-b470-6f1a5654244c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.895 2 DEBUG oslo_concurrency.lockutils [req-7aa52057-0e42-4f01-96ae-c69ccd853537 req-af57eebb-4965-45cc-a937-8a5dbd8b7c51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.895 2 DEBUG oslo_concurrency.lockutils [req-7aa52057-0e42-4f01-96ae-c69ccd853537 req-af57eebb-4965-45cc-a937-8a5dbd8b7c51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.895 2 DEBUG oslo_concurrency.lockutils [req-7aa52057-0e42-4f01-96ae-c69ccd853537 req-af57eebb-4965-45cc-a937-8a5dbd8b7c51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.895 2 DEBUG nova.compute.manager [req-7aa52057-0e42-4f01-96ae-c69ccd853537 req-af57eebb-4965-45cc-a937-8a5dbd8b7c51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] No waiting events found dispatching network-vif-unplugged-ff0fff12-a049-4dbc-b470-6f1a5654244c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.895 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1bbb55dc-74d2-43ce-99e1-7970e1413cbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.895 2 DEBUG nova.compute.manager [req-7aa52057-0e42-4f01-96ae-c69ccd853537 req-af57eebb-4965-45cc-a937-8a5dbd8b7c51 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-vif-unplugged-ff0fff12-a049-4dbc-b470-6f1a5654244c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.897 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c8dd39c2-2409-4f3a-9d31-c2c72c2a1eb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.916 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9f876b41-e1d8-4bd8-8a20-0620da7c6248]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614325, 'reachable_time': 15555, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241467, 'error': None, 'target': 'ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.919 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d68eafa7-b35f-4bd9-ba11-e28a73bc7849 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.919 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[8b5fcd74-b7e6-4f4c-940f-a12b8652fbc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.919 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 7a540c18-fe1e-41b7-b421-d10a6e2d3f73 in datapath 85092873-751b-414a-a9a1-112c2e61cb13 unbound from our chassis#033[00m
Oct  2 08:29:38 np0005466012 systemd[1]: run-netns-ovnmeta\x2dd68eafa7\x2db35f\x2d4bd9\x2dba11\x2de28a73bc7849.mount: Deactivated successfully.
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.921 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 85092873-751b-414a-a9a1-112c2e61cb13, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.922 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad3d1e0-bff0-4427-b8fc-6612c0090fae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:38.922 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13 namespace which is not needed anymore#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.942 2 INFO nova.compute.manager [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.942 2 DEBUG oslo.service.loopingcall [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.942 2 DEBUG nova.compute.manager [-] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.943 2 DEBUG nova.network.neutron [-] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.954 2 DEBUG nova.compute.manager [req-0f94baaa-e0fc-48af-8413-062d8226aa87 req-5ef63c8d-cdfe-40ce-9386-1c1604ab96cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-vif-unplugged-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.954 2 DEBUG oslo_concurrency.lockutils [req-0f94baaa-e0fc-48af-8413-062d8226aa87 req-5ef63c8d-cdfe-40ce-9386-1c1604ab96cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.954 2 DEBUG oslo_concurrency.lockutils [req-0f94baaa-e0fc-48af-8413-062d8226aa87 req-5ef63c8d-cdfe-40ce-9386-1c1604ab96cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.954 2 DEBUG oslo_concurrency.lockutils [req-0f94baaa-e0fc-48af-8413-062d8226aa87 req-5ef63c8d-cdfe-40ce-9386-1c1604ab96cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.955 2 DEBUG nova.compute.manager [req-0f94baaa-e0fc-48af-8413-062d8226aa87 req-5ef63c8d-cdfe-40ce-9386-1c1604ab96cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] No waiting events found dispatching network-vif-unplugged-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:38 np0005466012 nova_compute[192063]: 2025-10-02 12:29:38.955 2 DEBUG nova.compute.manager [req-0f94baaa-e0fc-48af-8413-062d8226aa87 req-5ef63c8d-cdfe-40ce-9386-1c1604ab96cd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-vif-unplugged-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:29:39 np0005466012 neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13[240987]: [NOTICE]   (240991) : haproxy version is 2.8.14-c23fe91
Oct  2 08:29:39 np0005466012 neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13[240987]: [NOTICE]   (240991) : path to executable is /usr/sbin/haproxy
Oct  2 08:29:39 np0005466012 neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13[240987]: [WARNING]  (240991) : Exiting Master process...
Oct  2 08:29:39 np0005466012 neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13[240987]: [ALERT]    (240991) : Current worker (240993) exited with code 143 (Terminated)
Oct  2 08:29:39 np0005466012 neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13[240987]: [WARNING]  (240991) : All workers exited. Exiting... (0)
Oct  2 08:29:39 np0005466012 systemd[1]: libpod-71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963.scope: Deactivated successfully.
Oct  2 08:29:39 np0005466012 podman[241485]: 2025-10-02 12:29:39.048281129 +0000 UTC m=+0.046500722 container died 71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:29:39 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963-userdata-shm.mount: Deactivated successfully.
Oct  2 08:29:39 np0005466012 systemd[1]: var-lib-containers-storage-overlay-174aafcc9d5d50d6757543c8314645826d04dc686ea822120ce859f4516ce5d2-merged.mount: Deactivated successfully.
Oct  2 08:29:39 np0005466012 podman[241485]: 2025-10-02 12:29:39.084149568 +0000 UTC m=+0.082369171 container cleanup 71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:29:39 np0005466012 systemd[1]: libpod-conmon-71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963.scope: Deactivated successfully.
Oct  2 08:29:39 np0005466012 podman[241516]: 2025-10-02 12:29:39.144426481 +0000 UTC m=+0.040278682 container remove 71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:29:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:39.150 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c18e91f9-8d2a-4ad3-b476-862c98c86c52]: (4, ('Thu Oct  2 12:29:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13 (71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963)\n71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963\nThu Oct  2 12:29:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13 (71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963)\n71c52ac4dc217661024730838d6f1308a13cbd02700bba5a2b6e2d53c8667963\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:39.151 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0ca9efce-906a-421b-b763-3613825dc6ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:39.152 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85092873-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:29:39 np0005466012 nova_compute[192063]: 2025-10-02 12:29:39.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:39 np0005466012 kernel: tap85092873-70: left promiscuous mode
Oct  2 08:29:39 np0005466012 nova_compute[192063]: 2025-10-02 12:29:39.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:39 np0005466012 nova_compute[192063]: 2025-10-02 12:29:39.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:39.168 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[574cd26d-3ed5-4581-b953-4860abac7217]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:39.203 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[efe373fc-9ccc-493a-9636-b90054e7c306]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:39.205 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[013334c8-64ff-48b7-b027-88df2a60e356]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:39.223 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[19c6bea1-c951-4aa7-b446-67df904787be]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614427, 'reachable_time': 20058, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241535, 'error': None, 'target': 'ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:39.226 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-85092873-751b-414a-a9a1-112c2e61cb13 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:29:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:29:39.226 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[c461bc8b-ebac-429d-9997-2bcc589bb825]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:29:39 np0005466012 podman[241531]: 2025-10-02 12:29:39.268620394 +0000 UTC m=+0.055588374 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct  2 08:29:39 np0005466012 podman[241530]: 2025-10-02 12:29:39.270452495 +0000 UTC m=+0.060202271 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:29:39 np0005466012 systemd[1]: run-netns-ovnmeta\x2d85092873\x2d751b\x2d414a\x2da9a1\x2d112c2e61cb13.mount: Deactivated successfully.
Oct  2 08:29:39 np0005466012 nova_compute[192063]: 2025-10-02 12:29:39.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:39 np0005466012 nova_compute[192063]: 2025-10-02 12:29:39.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:29:40 np0005466012 nova_compute[192063]: 2025-10-02 12:29:40.325 2 DEBUG nova.network.neutron [req-f68a5227-e465-40d6-8aab-08d74b1ca789 req-748d34ad-88fa-4e7d-b1d8-4399fd3b3612 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Updated VIF entry in instance network info cache for port ff0fff12-a049-4dbc-b470-6f1a5654244c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:29:40 np0005466012 nova_compute[192063]: 2025-10-02 12:29:40.326 2 DEBUG nova.network.neutron [req-f68a5227-e465-40d6-8aab-08d74b1ca789 req-748d34ad-88fa-4e7d-b1d8-4399fd3b3612 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Updating instance_info_cache with network_info: [{"id": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "address": "fa:16:3e:9f:b6:e1", "network": {"id": "d68eafa7-b35f-4bd9-ba11-e28a73bc7849", "bridge": "br-int", "label": "tempest-network-smoke--995539246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff0fff12-a0", "ovs_interfaceid": "ff0fff12-a049-4dbc-b470-6f1a5654244c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "address": "fa:16:3e:a5:6c:ea", "network": {"id": "85092873-751b-414a-a9a1-112c2e61cb13", "bridge": "br-int", "label": "tempest-network-smoke--1405117352", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea5:6cea", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a540c18-fe", "ovs_interfaceid": "7a540c18-fe1e-41b7-b421-d10a6e2d3f73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:40 np0005466012 nova_compute[192063]: 2025-10-02 12:29:40.353 2 DEBUG nova.compute.manager [req-0daf96bd-a12e-4bc4-a494-aeb562964601 req-0a186c93-7d54-4611-9a5c-d19101853c0c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Received event network-vif-deleted-8763ebde-baaa-4e83-93c4-797a04c46f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:40 np0005466012 nova_compute[192063]: 2025-10-02 12:29:40.372 2 DEBUG oslo_concurrency.lockutils [req-f68a5227-e465-40d6-8aab-08d74b1ca789 req-748d34ad-88fa-4e7d-b1d8-4399fd3b3612 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-619c3e72-5be3-417a-8ae9-25ef4b63a50d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:29:40 np0005466012 nova_compute[192063]: 2025-10-02 12:29:40.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:40 np0005466012 nova_compute[192063]: 2025-10-02 12:29:40.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:40 np0005466012 nova_compute[192063]: 2025-10-02 12:29:40.870 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:40 np0005466012 nova_compute[192063]: 2025-10-02 12:29:40.870 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:40 np0005466012 nova_compute[192063]: 2025-10-02 12:29:40.871 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:40 np0005466012 nova_compute[192063]: 2025-10-02 12:29:40.871 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:29:40 np0005466012 nova_compute[192063]: 2025-10-02 12:29:40.956 2 DEBUG nova.network.neutron [-] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.028 2 INFO nova.compute.manager [-] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Took 2.09 seconds to deallocate network for instance.#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.038 2 DEBUG nova.compute.manager [req-a104cfd4-4d98-442e-a50c-61c87b795964 req-b4ba9e78-7007-402a-997c-01cec9511fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-vif-plugged-ff0fff12-a049-4dbc-b470-6f1a5654244c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.038 2 DEBUG oslo_concurrency.lockutils [req-a104cfd4-4d98-442e-a50c-61c87b795964 req-b4ba9e78-7007-402a-997c-01cec9511fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.039 2 DEBUG oslo_concurrency.lockutils [req-a104cfd4-4d98-442e-a50c-61c87b795964 req-b4ba9e78-7007-402a-997c-01cec9511fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.039 2 DEBUG oslo_concurrency.lockutils [req-a104cfd4-4d98-442e-a50c-61c87b795964 req-b4ba9e78-7007-402a-997c-01cec9511fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.039 2 DEBUG nova.compute.manager [req-a104cfd4-4d98-442e-a50c-61c87b795964 req-b4ba9e78-7007-402a-997c-01cec9511fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] No waiting events found dispatching network-vif-plugged-ff0fff12-a049-4dbc-b470-6f1a5654244c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.039 2 WARNING nova.compute.manager [req-a104cfd4-4d98-442e-a50c-61c87b795964 req-b4ba9e78-7007-402a-997c-01cec9511fae 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received unexpected event network-vif-plugged-ff0fff12-a049-4dbc-b470-6f1a5654244c for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.065 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.066 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5675MB free_disk=73.24602508544922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.066 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.066 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.111 2 DEBUG nova.compute.manager [req-12d70bb0-7559-4505-a1b1-71d5bcee72c2 req-f5ef89cd-cd61-404c-a510-616a539cdd9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-vif-plugged-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.111 2 DEBUG oslo_concurrency.lockutils [req-12d70bb0-7559-4505-a1b1-71d5bcee72c2 req-f5ef89cd-cd61-404c-a510-616a539cdd9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.111 2 DEBUG oslo_concurrency.lockutils [req-12d70bb0-7559-4505-a1b1-71d5bcee72c2 req-f5ef89cd-cd61-404c-a510-616a539cdd9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.111 2 DEBUG oslo_concurrency.lockutils [req-12d70bb0-7559-4505-a1b1-71d5bcee72c2 req-f5ef89cd-cd61-404c-a510-616a539cdd9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.112 2 DEBUG nova.compute.manager [req-12d70bb0-7559-4505-a1b1-71d5bcee72c2 req-f5ef89cd-cd61-404c-a510-616a539cdd9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] No waiting events found dispatching network-vif-plugged-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.112 2 WARNING nova.compute.manager [req-12d70bb0-7559-4505-a1b1-71d5bcee72c2 req-f5ef89cd-cd61-404c-a510-616a539cdd9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received unexpected event network-vif-plugged-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.112 2 DEBUG nova.compute.manager [req-12d70bb0-7559-4505-a1b1-71d5bcee72c2 req-f5ef89cd-cd61-404c-a510-616a539cdd9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-vif-deleted-7a540c18-fe1e-41b7-b421-d10a6e2d3f73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.112 2 DEBUG nova.compute.manager [req-12d70bb0-7559-4505-a1b1-71d5bcee72c2 req-f5ef89cd-cd61-404c-a510-616a539cdd9c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Received event network-vif-deleted-ff0fff12-a049-4dbc-b470-6f1a5654244c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.154 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 619c3e72-5be3-417a-8ae9-25ef4b63a50d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.155 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.155 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.171 2 DEBUG oslo_concurrency.lockutils [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.199 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.224 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.264 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.264 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.265 2 DEBUG oslo_concurrency.lockutils [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.350 2 DEBUG nova.compute.provider_tree [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.367 2 DEBUG nova.scheduler.client.report [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.447 2 DEBUG oslo_concurrency.lockutils [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.533 2 INFO nova.scheduler.client.report [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Deleted allocations for instance 619c3e72-5be3-417a-8ae9-25ef4b63a50d#033[00m
Oct  2 08:29:41 np0005466012 nova_compute[192063]: 2025-10-02 12:29:41.833 2 DEBUG oslo_concurrency.lockutils [None req-2ca84325-b955-4050-99fa-aa2ea25688c9 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619c3e72-5be3-417a-8ae9-25ef4b63a50d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:29:42 np0005466012 nova_compute[192063]: 2025-10-02 12:29:42.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:43 np0005466012 nova_compute[192063]: 2025-10-02 12:29:43.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:44 np0005466012 nova_compute[192063]: 2025-10-02 12:29:44.264 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:44 np0005466012 nova_compute[192063]: 2025-10-02 12:29:44.264 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:29:44 np0005466012 nova_compute[192063]: 2025-10-02 12:29:44.265 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:29:44 np0005466012 nova_compute[192063]: 2025-10-02 12:29:44.358 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:29:44 np0005466012 nova_compute[192063]: 2025-10-02 12:29:44.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:45 np0005466012 nova_compute[192063]: 2025-10-02 12:29:45.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:45 np0005466012 nova_compute[192063]: 2025-10-02 12:29:45.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:47 np0005466012 nova_compute[192063]: 2025-10-02 12:29:47.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:48 np0005466012 nova_compute[192063]: 2025-10-02 12:29:48.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:49 np0005466012 podman[241575]: 2025-10-02 12:29:49.139149818 +0000 UTC m=+0.058999847 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:29:49 np0005466012 podman[241576]: 2025-10-02 12:29:49.139706853 +0000 UTC m=+0.058110203 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_id=edpm, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  2 08:29:51 np0005466012 nova_compute[192063]: 2025-10-02 12:29:51.392 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408176.3911273, 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:51 np0005466012 nova_compute[192063]: 2025-10-02 12:29:51.393 2 INFO nova.compute.manager [-] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:29:51 np0005466012 nova_compute[192063]: 2025-10-02 12:29:51.414 2 DEBUG nova.compute.manager [None req-f668c97b-de21-40fa-8405-2bf5a038a6e2 - - - - - -] [instance: 4dbc6b41-20a9-42fc-9d29-fe60ebdbe0f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:52 np0005466012 podman[241613]: 2025-10-02 12:29:52.178665539 +0000 UTC m=+0.093150369 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:29:52 np0005466012 podman[241614]: 2025-10-02 12:29:52.17904417 +0000 UTC m=+0.079452442 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:29:52 np0005466012 nova_compute[192063]: 2025-10-02 12:29:52.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:53 np0005466012 nova_compute[192063]: 2025-10-02 12:29:53.783 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408178.7822602, 619c3e72-5be3-417a-8ae9-25ef4b63a50d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:29:53 np0005466012 nova_compute[192063]: 2025-10-02 12:29:53.784 2 INFO nova.compute.manager [-] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:29:53 np0005466012 nova_compute[192063]: 2025-10-02 12:29:53.811 2 DEBUG nova.compute.manager [None req-d15e0b03-4fbf-401c-b880-e02020a9c2c3 - - - - - -] [instance: 619c3e72-5be3-417a-8ae9-25ef4b63a50d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:29:53 np0005466012 nova_compute[192063]: 2025-10-02 12:29:53.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:57 np0005466012 nova_compute[192063]: 2025-10-02 12:29:57.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:58 np0005466012 nova_compute[192063]: 2025-10-02 12:29:58.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:29:59 np0005466012 nova_compute[192063]: 2025-10-02 12:29:59.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:29:59 np0005466012 nova_compute[192063]: 2025-10-02 12:29:59.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:29:59 np0005466012 nova_compute[192063]: 2025-10-02 12:29:59.845 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:30:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:02.144 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:02.145 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:02.145 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:02 np0005466012 nova_compute[192063]: 2025-10-02 12:30:02.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:02 np0005466012 nova_compute[192063]: 2025-10-02 12:30:02.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:03 np0005466012 nova_compute[192063]: 2025-10-02 12:30:03.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:07 np0005466012 podman[241658]: 2025-10-02 12:30:07.125839924 +0000 UTC m=+0.044624401 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:30:07 np0005466012 podman[241659]: 2025-10-02 12:30:07.193015866 +0000 UTC m=+0.108860302 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller)
Oct  2 08:30:07 np0005466012 nova_compute[192063]: 2025-10-02 12:30:07.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:08 np0005466012 nova_compute[192063]: 2025-10-02 12:30:08.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:10 np0005466012 podman[241705]: 2025-10-02 12:30:10.129196948 +0000 UTC m=+0.049565897 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, config_id=edpm, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:30:10 np0005466012 podman[241706]: 2025-10-02 12:30:10.139463981 +0000 UTC m=+0.056336164 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  2 08:30:12 np0005466012 nova_compute[192063]: 2025-10-02 12:30:12.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:13 np0005466012 nova_compute[192063]: 2025-10-02 12:30:13.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:17 np0005466012 nova_compute[192063]: 2025-10-02 12:30:17.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:18 np0005466012 nova_compute[192063]: 2025-10-02 12:30:18.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:20 np0005466012 podman[241743]: 2025-10-02 12:30:20.160575176 +0000 UTC m=+0.072611173 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the 
latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm, vcs-type=git, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 08:30:20 np0005466012 podman[241742]: 2025-10-02 12:30:20.165416299 +0000 UTC m=+0.079069140 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd)
Oct  2 08:30:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:20.392 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:20.392 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:30:20 np0005466012 nova_compute[192063]: 2025-10-02 12:30:20.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:20.975 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:31:73 2001:db8:0:1:f816:3eff:fe2e:3173 2001:db8::f816:3eff:fe2e:3173'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe2e:3173/64 2001:db8::f816:3eff:fe2e:3173/64', 'neutron:device_id': 'ovnmeta-e2520108-9d67-4d82-a7a0-ba429a88c3c9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e2520108-9d67-4d82-a7a0-ba429a88c3c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=876a7f58-2645-4e1a-8a60-dbbe16fdfb2e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3eb0ed9e-d99b-4ee6-af64-ada9c8369b17) old=Port_Binding(mac=['fa:16:3e:2e:31:73 2001:db8::f816:3eff:fe2e:3173'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe2e:3173/64', 'neutron:device_id': 'ovnmeta-e2520108-9d67-4d82-a7a0-ba429a88c3c9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e2520108-9d67-4d82-a7a0-ba429a88c3c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:20.977 103246 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3eb0ed9e-d99b-4ee6-af64-ada9c8369b17 in datapath e2520108-9d67-4d82-a7a0-ba429a88c3c9 updated#033[00m
Oct  2 08:30:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:20.979 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e2520108-9d67-4d82-a7a0-ba429a88c3c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:30:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:20.981 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[57faf2c3-f6cb-4088-80fe-1fdadae4cf31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:22 np0005466012 nova_compute[192063]: 2025-10-02 12:30:22.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:23 np0005466012 podman[241783]: 2025-10-02 12:30:23.179634944 +0000 UTC m=+0.086408193 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:30:23 np0005466012 podman[241784]: 2025-10-02 12:30:23.181294419 +0000 UTC m=+0.093865028 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:30:23 np0005466012 nova_compute[192063]: 2025-10-02 12:30:23.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:23 np0005466012 nova_compute[192063]: 2025-10-02 12:30:23.971 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Acquiring lock "507e9114-34cf-4091-851b-f85f4a8d9687" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:23 np0005466012 nova_compute[192063]: 2025-10-02 12:30:23.972 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:23 np0005466012 nova_compute[192063]: 2025-10-02 12:30:23.988 2 DEBUG nova.compute.manager [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.084 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.085 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.091 2 DEBUG nova.virt.hardware [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.091 2 INFO nova.compute.claims [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.218 2 DEBUG nova.compute.provider_tree [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.233 2 DEBUG nova.scheduler.client.report [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.254 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.266 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Acquiring lock "2c95f67e-86f1-424c-bbfa-55bae3cd8f48" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.267 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "2c95f67e-86f1-424c-bbfa-55bae3cd8f48" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.275 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "2c95f67e-86f1-424c-bbfa-55bae3cd8f48" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.276 2 DEBUG nova.compute.manager [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.332 2 DEBUG nova.compute.manager [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.333 2 DEBUG nova.network.neutron [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.349 2 INFO nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.368 2 DEBUG nova.compute.manager [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.482 2 DEBUG nova.compute.manager [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.483 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.484 2 INFO nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Creating image(s)#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.484 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Acquiring lock "/var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.485 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "/var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.485 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "/var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.500 2 DEBUG oslo_concurrency.processutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.519 2 DEBUG nova.policy [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0d6270103ff9452cb8caedb8f707fde1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '78c62dd51a744172bb1729f604397cc6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.558 2 DEBUG oslo_concurrency.processutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.559 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.559 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.577 2 DEBUG oslo_concurrency.processutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.637 2 DEBUG oslo_concurrency.processutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.638 2 DEBUG oslo_concurrency.processutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.682 2 DEBUG oslo_concurrency.processutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.683 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.683 2 DEBUG oslo_concurrency.processutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.736 2 DEBUG oslo_concurrency.processutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.737 2 DEBUG nova.virt.disk.api [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Checking if we can resize image /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.738 2 DEBUG oslo_concurrency.processutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.793 2 DEBUG oslo_concurrency.processutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.794 2 DEBUG nova.virt.disk.api [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Cannot resize image /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.795 2 DEBUG nova.objects.instance [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lazy-loading 'migration_context' on Instance uuid 507e9114-34cf-4091-851b-f85f4a8d9687 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.824 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.824 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Ensure instance console log exists: /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.825 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.825 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:24 np0005466012 nova_compute[192063]: 2025-10-02 12:30:24.825 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:25 np0005466012 nova_compute[192063]: 2025-10-02 12:30:25.451 2 DEBUG nova.network.neutron [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Successfully created port: 57894cd7-79bd-4e2b-bca9-1da420ad642b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:30:26 np0005466012 nova_compute[192063]: 2025-10-02 12:30:26.592 2 DEBUG nova.network.neutron [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Successfully updated port: 57894cd7-79bd-4e2b-bca9-1da420ad642b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:30:26 np0005466012 nova_compute[192063]: 2025-10-02 12:30:26.614 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Acquiring lock "refresh_cache-507e9114-34cf-4091-851b-f85f4a8d9687" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:26 np0005466012 nova_compute[192063]: 2025-10-02 12:30:26.614 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Acquired lock "refresh_cache-507e9114-34cf-4091-851b-f85f4a8d9687" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:26 np0005466012 nova_compute[192063]: 2025-10-02 12:30:26.615 2 DEBUG nova.network.neutron [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:30:26 np0005466012 nova_compute[192063]: 2025-10-02 12:30:26.707 2 DEBUG nova.compute.manager [req-03160b5b-bf84-43b8-921e-349d905526ef req-9ced43d2-9ba9-4012-aed7-4fed2076d9c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Received event network-changed-57894cd7-79bd-4e2b-bca9-1da420ad642b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:26 np0005466012 nova_compute[192063]: 2025-10-02 12:30:26.708 2 DEBUG nova.compute.manager [req-03160b5b-bf84-43b8-921e-349d905526ef req-9ced43d2-9ba9-4012-aed7-4fed2076d9c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Refreshing instance network info cache due to event network-changed-57894cd7-79bd-4e2b-bca9-1da420ad642b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:30:26 np0005466012 nova_compute[192063]: 2025-10-02 12:30:26.708 2 DEBUG oslo_concurrency.lockutils [req-03160b5b-bf84-43b8-921e-349d905526ef req-9ced43d2-9ba9-4012-aed7-4fed2076d9c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-507e9114-34cf-4091-851b-f85f4a8d9687" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:26 np0005466012 nova_compute[192063]: 2025-10-02 12:30:26.838 2 DEBUG nova.network.neutron [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:30:27 np0005466012 nova_compute[192063]: 2025-10-02 12:30:27.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.181 2 DEBUG nova.network.neutron [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Updating instance_info_cache with network_info: [{"id": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "address": "fa:16:3e:85:cb:bb", "network": {"id": "d75b3f9f-55df-4cd2-9c54-63280bbc840b", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1374294592-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78c62dd51a744172bb1729f604397cc6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57894cd7-79", "ovs_interfaceid": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.224 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Releasing lock "refresh_cache-507e9114-34cf-4091-851b-f85f4a8d9687" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.225 2 DEBUG nova.compute.manager [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Instance network_info: |[{"id": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "address": "fa:16:3e:85:cb:bb", "network": {"id": "d75b3f9f-55df-4cd2-9c54-63280bbc840b", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1374294592-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78c62dd51a744172bb1729f604397cc6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57894cd7-79", "ovs_interfaceid": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.226 2 DEBUG oslo_concurrency.lockutils [req-03160b5b-bf84-43b8-921e-349d905526ef req-9ced43d2-9ba9-4012-aed7-4fed2076d9c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-507e9114-34cf-4091-851b-f85f4a8d9687" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.226 2 DEBUG nova.network.neutron [req-03160b5b-bf84-43b8-921e-349d905526ef req-9ced43d2-9ba9-4012-aed7-4fed2076d9c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Refreshing network info cache for port 57894cd7-79bd-4e2b-bca9-1da420ad642b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.231 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Start _get_guest_xml network_info=[{"id": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "address": "fa:16:3e:85:cb:bb", "network": {"id": "d75b3f9f-55df-4cd2-9c54-63280bbc840b", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1374294592-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78c62dd51a744172bb1729f604397cc6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57894cd7-79", "ovs_interfaceid": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.240 2 WARNING nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.254 2 DEBUG nova.virt.libvirt.host [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.255 2 DEBUG nova.virt.libvirt.host [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.259 2 DEBUG nova.virt.libvirt.host [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.260 2 DEBUG nova.virt.libvirt.host [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.262 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.262 2 DEBUG nova.virt.hardware [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.263 2 DEBUG nova.virt.hardware [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.264 2 DEBUG nova.virt.hardware [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.264 2 DEBUG nova.virt.hardware [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.264 2 DEBUG nova.virt.hardware [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.265 2 DEBUG nova.virt.hardware [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.265 2 DEBUG nova.virt.hardware [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.266 2 DEBUG nova.virt.hardware [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.267 2 DEBUG nova.virt.hardware [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.267 2 DEBUG nova.virt.hardware [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.267 2 DEBUG nova.virt.hardware [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.276 2 DEBUG nova.virt.libvirt.vif [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:30:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-667443961',display_name='tempest-ServerGroupTestJSON-server-667443961',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-667443961',id=133,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78c62dd51a744172bb1729f604397cc6',ramdisk_id='',reservation_id='r-krr8luyb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-1947845073',owner_user_name='tempest-ServerGroupTestJSON-1947845073-pro
ject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:30:24Z,user_data=None,user_id='0d6270103ff9452cb8caedb8f707fde1',uuid=507e9114-34cf-4091-851b-f85f4a8d9687,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "address": "fa:16:3e:85:cb:bb", "network": {"id": "d75b3f9f-55df-4cd2-9c54-63280bbc840b", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1374294592-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78c62dd51a744172bb1729f604397cc6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57894cd7-79", "ovs_interfaceid": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.277 2 DEBUG nova.network.os_vif_util [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Converting VIF {"id": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "address": "fa:16:3e:85:cb:bb", "network": {"id": "d75b3f9f-55df-4cd2-9c54-63280bbc840b", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1374294592-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78c62dd51a744172bb1729f604397cc6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57894cd7-79", "ovs_interfaceid": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.278 2 DEBUG nova.network.os_vif_util [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:cb:bb,bridge_name='br-int',has_traffic_filtering=True,id=57894cd7-79bd-4e2b-bca9-1da420ad642b,network=Network(d75b3f9f-55df-4cd2-9c54-63280bbc840b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57894cd7-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.279 2 DEBUG nova.objects.instance [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 507e9114-34cf-4091-851b-f85f4a8d9687 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.295 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  <uuid>507e9114-34cf-4091-851b-f85f4a8d9687</uuid>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  <name>instance-00000085</name>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServerGroupTestJSON-server-667443961</nova:name>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:30:28</nova:creationTime>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:30:28 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:        <nova:user uuid="0d6270103ff9452cb8caedb8f707fde1">tempest-ServerGroupTestJSON-1947845073-project-member</nova:user>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:        <nova:project uuid="78c62dd51a744172bb1729f604397cc6">tempest-ServerGroupTestJSON-1947845073</nova:project>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:        <nova:port uuid="57894cd7-79bd-4e2b-bca9-1da420ad642b">
Oct  2 08:30:28 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <entry name="serial">507e9114-34cf-4091-851b-f85f4a8d9687</entry>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <entry name="uuid">507e9114-34cf-4091-851b-f85f4a8d9687</entry>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk.config"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:85:cb:bb"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <target dev="tap57894cd7-79"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/console.log" append="off"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:30:28 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:30:28 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:30:28 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:30:28 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.296 2 DEBUG nova.compute.manager [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Preparing to wait for external event network-vif-plugged-57894cd7-79bd-4e2b-bca9-1da420ad642b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.296 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Acquiring lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.297 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.297 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.298 2 DEBUG nova.virt.libvirt.vif [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:30:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-667443961',display_name='tempest-ServerGroupTestJSON-server-667443961',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-667443961',id=133,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78c62dd51a744172bb1729f604397cc6',ramdisk_id='',reservation_id='r-krr8luyb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-1947845073',owner_user_name='tempest-ServerGroupTestJSON-1947
845073-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:30:24Z,user_data=None,user_id='0d6270103ff9452cb8caedb8f707fde1',uuid=507e9114-34cf-4091-851b-f85f4a8d9687,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "address": "fa:16:3e:85:cb:bb", "network": {"id": "d75b3f9f-55df-4cd2-9c54-63280bbc840b", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1374294592-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78c62dd51a744172bb1729f604397cc6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57894cd7-79", "ovs_interfaceid": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.298 2 DEBUG nova.network.os_vif_util [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Converting VIF {"id": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "address": "fa:16:3e:85:cb:bb", "network": {"id": "d75b3f9f-55df-4cd2-9c54-63280bbc840b", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1374294592-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78c62dd51a744172bb1729f604397cc6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57894cd7-79", "ovs_interfaceid": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.298 2 DEBUG nova.network.os_vif_util [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:cb:bb,bridge_name='br-int',has_traffic_filtering=True,id=57894cd7-79bd-4e2b-bca9-1da420ad642b,network=Network(d75b3f9f-55df-4cd2-9c54-63280bbc840b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57894cd7-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.299 2 DEBUG os_vif [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:cb:bb,bridge_name='br-int',has_traffic_filtering=True,id=57894cd7-79bd-4e2b-bca9-1da420ad642b,network=Network(d75b3f9f-55df-4cd2-9c54-63280bbc840b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57894cd7-79') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.299 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.300 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.303 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57894cd7-79, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.303 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap57894cd7-79, col_values=(('external_ids', {'iface-id': '57894cd7-79bd-4e2b-bca9-1da420ad642b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:85:cb:bb', 'vm-uuid': '507e9114-34cf-4091-851b-f85f4a8d9687'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:28 np0005466012 NetworkManager[51207]: <info>  [1759408228.3054] manager: (tap57894cd7-79): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.312 2 INFO os_vif [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:cb:bb,bridge_name='br-int',has_traffic_filtering=True,id=57894cd7-79bd-4e2b-bca9-1da420ad642b,network=Network(d75b3f9f-55df-4cd2-9c54-63280bbc840b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57894cd7-79')#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.361 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.362 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.362 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] No VIF found with MAC fa:16:3e:85:cb:bb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:30:28 np0005466012 nova_compute[192063]: 2025-10-02 12:30:28.362 2 INFO nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Using config drive#033[00m
Oct  2 08:30:29 np0005466012 nova_compute[192063]: 2025-10-02 12:30:29.179 2 INFO nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Creating config drive at /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk.config#033[00m
Oct  2 08:30:29 np0005466012 nova_compute[192063]: 2025-10-02 12:30:29.183 2 DEBUG oslo_concurrency.processutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjzxhwrr6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:29 np0005466012 nova_compute[192063]: 2025-10-02 12:30:29.306 2 DEBUG oslo_concurrency.processutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjzxhwrr6" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:29 np0005466012 kernel: tap57894cd7-79: entered promiscuous mode
Oct  2 08:30:29 np0005466012 NetworkManager[51207]: <info>  [1759408229.3688] manager: (tap57894cd7-79): new Tun device (/org/freedesktop/NetworkManager/Devices/235)
Oct  2 08:30:29 np0005466012 nova_compute[192063]: 2025-10-02 12:30:29.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:29Z|00517|binding|INFO|Claiming lport 57894cd7-79bd-4e2b-bca9-1da420ad642b for this chassis.
Oct  2 08:30:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:29Z|00518|binding|INFO|57894cd7-79bd-4e2b-bca9-1da420ad642b: Claiming fa:16:3e:85:cb:bb 10.100.0.13
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.396 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:cb:bb 10.100.0.13'], port_security=['fa:16:3e:85:cb:bb 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '507e9114-34cf-4091-851b-f85f4a8d9687', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d75b3f9f-55df-4cd2-9c54-63280bbc840b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78c62dd51a744172bb1729f604397cc6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7d4b8bdb-44ab-4501-ac8b-0476efc460ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bfd38043-23cb-47f2-8769-74841907507c, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=57894cd7-79bd-4e2b-bca9-1da420ad642b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:29 np0005466012 systemd-udevd[241862]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.397 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.398 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 57894cd7-79bd-4e2b-bca9-1da420ad642b in datapath d75b3f9f-55df-4cd2-9c54-63280bbc840b bound to our chassis#033[00m
Oct  2 08:30:29 np0005466012 systemd-machined[152114]: New machine qemu-62-instance-00000085.
Oct  2 08:30:29 np0005466012 NetworkManager[51207]: <info>  [1759408229.4100] device (tap57894cd7-79): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:30:29 np0005466012 NetworkManager[51207]: <info>  [1759408229.4109] device (tap57894cd7-79): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.417 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d75b3f9f-55df-4cd2-9c54-63280bbc840b#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.429 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3bd1d8fb-a0d0-414a-a82b-2e7cb8c22c5d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.429 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd75b3f9f-51 in ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.431 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd75b3f9f-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.432 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e43cccf6-a014-471e-9cd0-e8ccee9a9455]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.432 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[92c125ac-c909-4aa0-bbcd-264b96f3d500]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 systemd[1]: Started Virtual Machine qemu-62-instance-00000085.
Oct  2 08:30:29 np0005466012 nova_compute[192063]: 2025-10-02 12:30:29.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:29Z|00519|binding|INFO|Setting lport 57894cd7-79bd-4e2b-bca9-1da420ad642b ovn-installed in OVS
Oct  2 08:30:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:29Z|00520|binding|INFO|Setting lport 57894cd7-79bd-4e2b-bca9-1da420ad642b up in Southbound
Oct  2 08:30:29 np0005466012 nova_compute[192063]: 2025-10-02 12:30:29.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.445 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[76ad5fda-d7f5-491e-b135-89100020c4f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.471 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b61485-7848-4a46-a4d8-b245b52841cc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.503 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[843c2f29-c0e0-48c9-8028-3dc4dc548a84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 systemd-udevd[241865]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.510 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[872416df-fdca-42dc-8583-cb1c5efe968d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 NetworkManager[51207]: <info>  [1759408229.5110] manager: (tapd75b3f9f-50): new Veth device (/org/freedesktop/NetworkManager/Devices/236)
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.541 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[4789e8a5-e750-4383-a8e6-3e684715a459]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.544 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[96895b46-cb5b-48cb-8db4-a443065667ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 NetworkManager[51207]: <info>  [1759408229.5692] device (tapd75b3f9f-50): carrier: link connected
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.573 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[37f31fc9-b242-47f5-8b6b-ca211ae1528d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.591 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fb59184e-76c1-4098-acef-2fed3f59db8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd75b3f9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:a9:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 622318, 'reachable_time': 26621, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241896, 'error': None, 'target': 'ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.607 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[807213a1-9185-4309-bb6d-25345f361698]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:a918'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 622318, 'tstamp': 622318}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241897, 'error': None, 'target': 'ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.625 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bd5c9b31-f5bc-4329-9068-3beac531dff7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd75b3f9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:a9:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 622318, 'reachable_time': 26621, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241900, 'error': None, 'target': 'ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.651 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5f63d9ef-c9b6-4075-afe8-d8c4d4a6f13d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.706 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1623fa32-be9e-40e7-b028-03acfe5518d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.707 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd75b3f9f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.707 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.708 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd75b3f9f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:29 np0005466012 nova_compute[192063]: 2025-10-02 12:30:29.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:29 np0005466012 NetworkManager[51207]: <info>  [1759408229.7101] manager: (tapd75b3f9f-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/237)
Oct  2 08:30:29 np0005466012 kernel: tapd75b3f9f-50: entered promiscuous mode
Oct  2 08:30:29 np0005466012 nova_compute[192063]: 2025-10-02 12:30:29.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.715 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd75b3f9f-50, col_values=(('external_ids', {'iface-id': '40827ad6-5bea-4439-8005-f355caa2ee1e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:29Z|00521|binding|INFO|Releasing lport 40827ad6-5bea-4439-8005-f355caa2ee1e from this chassis (sb_readonly=0)
Oct  2 08:30:29 np0005466012 nova_compute[192063]: 2025-10-02 12:30:29.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.718 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d75b3f9f-55df-4cd2-9c54-63280bbc840b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d75b3f9f-55df-4cd2-9c54-63280bbc840b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.719 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[131d5606-cb04-4f3b-b713-dd768da310a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.726 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-d75b3f9f-55df-4cd2-9c54-63280bbc840b
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/d75b3f9f-55df-4cd2-9c54-63280bbc840b.pid.haproxy
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID d75b3f9f-55df-4cd2-9c54-63280bbc840b
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:30:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:29.727 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b', 'env', 'PROCESS_TAG=haproxy-d75b3f9f-55df-4cd2-9c54-63280bbc840b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d75b3f9f-55df-4cd2-9c54-63280bbc840b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:30:29 np0005466012 nova_compute[192063]: 2025-10-02 12:30:29.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:30 np0005466012 nova_compute[192063]: 2025-10-02 12:30:30.101 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408230.1005478, 507e9114-34cf-4091-851b-f85f4a8d9687 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:30 np0005466012 nova_compute[192063]: 2025-10-02 12:30:30.101 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] VM Started (Lifecycle Event)#033[00m
Oct  2 08:30:30 np0005466012 podman[241936]: 2025-10-02 12:30:30.108894665 +0000 UTC m=+0.066856413 container create e0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:30:30 np0005466012 nova_compute[192063]: 2025-10-02 12:30:30.129 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:30 np0005466012 nova_compute[192063]: 2025-10-02 12:30:30.133 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408230.1015499, 507e9114-34cf-4091-851b-f85f4a8d9687 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:30 np0005466012 nova_compute[192063]: 2025-10-02 12:30:30.133 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:30:30 np0005466012 systemd[1]: Started libpod-conmon-e0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e.scope.
Oct  2 08:30:30 np0005466012 podman[241936]: 2025-10-02 12:30:30.067549556 +0000 UTC m=+0.025511334 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:30:30 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:30:30 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e56f04699fb97b74090a5c2aa1575c9c6776a649072d411a8d351b9d7e1c50f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:30:30 np0005466012 podman[241936]: 2025-10-02 12:30:30.188046338 +0000 UTC m=+0.146008096 container init e0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:30:30 np0005466012 nova_compute[192063]: 2025-10-02 12:30:30.191 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:30 np0005466012 nova_compute[192063]: 2025-10-02 12:30:30.195 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:30:30 np0005466012 podman[241936]: 2025-10-02 12:30:30.195307508 +0000 UTC m=+0.153269256 container start e0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:30:30 np0005466012 neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b[241950]: [NOTICE]   (241954) : New worker (241956) forked
Oct  2 08:30:30 np0005466012 neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b[241950]: [NOTICE]   (241954) : Loading success.
Oct  2 08:30:30 np0005466012 nova_compute[192063]: 2025-10-02 12:30:30.231 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:30:30 np0005466012 nova_compute[192063]: 2025-10-02 12:30:30.575 2 DEBUG nova.network.neutron [req-03160b5b-bf84-43b8-921e-349d905526ef req-9ced43d2-9ba9-4012-aed7-4fed2076d9c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Updated VIF entry in instance network info cache for port 57894cd7-79bd-4e2b-bca9-1da420ad642b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:30:30 np0005466012 nova_compute[192063]: 2025-10-02 12:30:30.576 2 DEBUG nova.network.neutron [req-03160b5b-bf84-43b8-921e-349d905526ef req-9ced43d2-9ba9-4012-aed7-4fed2076d9c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Updating instance_info_cache with network_info: [{"id": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "address": "fa:16:3e:85:cb:bb", "network": {"id": "d75b3f9f-55df-4cd2-9c54-63280bbc840b", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1374294592-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78c62dd51a744172bb1729f604397cc6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57894cd7-79", "ovs_interfaceid": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:30 np0005466012 nova_compute[192063]: 2025-10-02 12:30:30.670 2 DEBUG oslo_concurrency.lockutils [req-03160b5b-bf84-43b8-921e-349d905526ef req-9ced43d2-9ba9-4012-aed7-4fed2076d9c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-507e9114-34cf-4091-851b-f85f4a8d9687" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:31 np0005466012 nova_compute[192063]: 2025-10-02 12:30:31.845 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:31 np0005466012 nova_compute[192063]: 2025-10-02 12:30:31.847 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:31 np0005466012 nova_compute[192063]: 2025-10-02 12:30:31.847 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:32 np0005466012 nova_compute[192063]: 2025-10-02 12:30:32.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:33 np0005466012 nova_compute[192063]: 2025-10-02 12:30:33.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:36 np0005466012 nova_compute[192063]: 2025-10-02 12:30:36.820 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:36 np0005466012 nova_compute[192063]: 2025-10-02 12:30:36.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:37 np0005466012 nova_compute[192063]: 2025-10-02 12:30:37.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:38 np0005466012 podman[241965]: 2025-10-02 12:30:38.158450712 +0000 UTC m=+0.077384354 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:30:38 np0005466012 podman[241966]: 2025-10-02 12:30:38.158971977 +0000 UTC m=+0.074422953 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:30:38 np0005466012 nova_compute[192063]: 2025-10-02 12:30:38.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.277 2 DEBUG nova.compute.manager [req-a25b6e56-089c-439f-8867-d9bdbc0776a4 req-730fc485-8aff-4c8f-a69c-f95c9345c0c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Received event network-vif-plugged-57894cd7-79bd-4e2b-bca9-1da420ad642b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.278 2 DEBUG oslo_concurrency.lockutils [req-a25b6e56-089c-439f-8867-d9bdbc0776a4 req-730fc485-8aff-4c8f-a69c-f95c9345c0c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.279 2 DEBUG oslo_concurrency.lockutils [req-a25b6e56-089c-439f-8867-d9bdbc0776a4 req-730fc485-8aff-4c8f-a69c-f95c9345c0c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.279 2 DEBUG oslo_concurrency.lockutils [req-a25b6e56-089c-439f-8867-d9bdbc0776a4 req-730fc485-8aff-4c8f-a69c-f95c9345c0c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.280 2 DEBUG nova.compute.manager [req-a25b6e56-089c-439f-8867-d9bdbc0776a4 req-730fc485-8aff-4c8f-a69c-f95c9345c0c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Processing event network-vif-plugged-57894cd7-79bd-4e2b-bca9-1da420ad642b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.281 2 DEBUG nova.compute.manager [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Instance event wait completed in 10 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.285 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408240.2852397, 507e9114-34cf-4091-851b-f85f4a8d9687 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.286 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.289 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.293 2 INFO nova.virt.libvirt.driver [-] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Instance spawned successfully.#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.293 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.409 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.412 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.413 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.413 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.414 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.414 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.414 2 DEBUG nova.virt.libvirt.driver [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.417 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.536 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.807 2 INFO nova.compute.manager [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Took 16.32 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.807 2 DEBUG nova.compute.manager [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.917 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.917 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:30:40 np0005466012 nova_compute[192063]: 2025-10-02 12:30:40.986 2 INFO nova.compute.manager [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Took 16.94 seconds to build instance.#033[00m
Oct  2 08:30:41 np0005466012 nova_compute[192063]: 2025-10-02 12:30:41.009 2 DEBUG oslo_concurrency.lockutils [None req-33d97cd1-56c8-4473-b180-7fe204189b4b 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:41 np0005466012 podman[242017]: 2025-10-02 12:30:41.167102804 +0000 UTC m=+0.064611552 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  2 08:30:41 np0005466012 podman[242016]: 2025-10-02 12:30:41.203865137 +0000 UTC m=+0.111570817 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:30:41 np0005466012 nova_compute[192063]: 2025-10-02 12:30:41.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:41 np0005466012 nova_compute[192063]: 2025-10-02 12:30:41.869 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:41 np0005466012 nova_compute[192063]: 2025-10-02 12:30:41.870 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:41 np0005466012 nova_compute[192063]: 2025-10-02 12:30:41.871 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:41 np0005466012 nova_compute[192063]: 2025-10-02 12:30:41.871 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:30:41 np0005466012 nova_compute[192063]: 2025-10-02 12:30:41.971 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.066 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.068 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.130 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.292 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.293 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5600MB free_disk=73.2425537109375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.293 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.294 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.479 2 DEBUG nova.compute.manager [req-5d0c44a4-77f9-4251-b939-3787f197b5a8 req-3f8ffe3d-885f-4a6d-bc22-bb7a591b3041 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Received event network-vif-plugged-57894cd7-79bd-4e2b-bca9-1da420ad642b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.480 2 DEBUG oslo_concurrency.lockutils [req-5d0c44a4-77f9-4251-b939-3787f197b5a8 req-3f8ffe3d-885f-4a6d-bc22-bb7a591b3041 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.480 2 DEBUG oslo_concurrency.lockutils [req-5d0c44a4-77f9-4251-b939-3787f197b5a8 req-3f8ffe3d-885f-4a6d-bc22-bb7a591b3041 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.480 2 DEBUG oslo_concurrency.lockutils [req-5d0c44a4-77f9-4251-b939-3787f197b5a8 req-3f8ffe3d-885f-4a6d-bc22-bb7a591b3041 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.480 2 DEBUG nova.compute.manager [req-5d0c44a4-77f9-4251-b939-3787f197b5a8 req-3f8ffe3d-885f-4a6d-bc22-bb7a591b3041 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] No waiting events found dispatching network-vif-plugged-57894cd7-79bd-4e2b-bca9-1da420ad642b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.481 2 WARNING nova.compute.manager [req-5d0c44a4-77f9-4251-b939-3787f197b5a8 req-3f8ffe3d-885f-4a6d-bc22-bb7a591b3041 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Received unexpected event network-vif-plugged-57894cd7-79bd-4e2b-bca9-1da420ad642b for instance with vm_state active and task_state None.#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.636 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 507e9114-34cf-4091-851b-f85f4a8d9687 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.642 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.643 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.671 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 278a6b24-7950-4f1b-9c36-8a6030b17e6d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.672 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.672 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.684 2 DEBUG nova.compute.manager [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.826 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.869 2 DEBUG oslo_concurrency.lockutils [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Acquiring lock "507e9114-34cf-4091-851b-f85f4a8d9687" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.869 2 DEBUG oslo_concurrency.lockutils [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.870 2 DEBUG oslo_concurrency.lockutils [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Acquiring lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.870 2 DEBUG oslo_concurrency.lockutils [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.870 2 DEBUG oslo_concurrency.lockutils [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.892 2 INFO nova.compute.manager [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Terminating instance
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.920 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.922 2 DEBUG nova.compute.manager [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:30:42 np0005466012 kernel: tap57894cd7-79 (unregistering): left promiscuous mode
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.936 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:30:42 np0005466012 NetworkManager[51207]: <info>  [1759408242.9394] device (tap57894cd7-79): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:30:42 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:42Z|00522|binding|INFO|Releasing lport 57894cd7-79bd-4e2b-bca9-1da420ad642b from this chassis (sb_readonly=0)
Oct  2 08:30:42 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:42Z|00523|binding|INFO|Setting lport 57894cd7-79bd-4e2b-bca9-1da420ad642b down in Southbound
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:30:42 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:42Z|00524|binding|INFO|Removing iface tap57894cd7-79 ovn-installed in OVS
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.961 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.961 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.961 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:30:42 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:42.962 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:cb:bb 10.100.0.13'], port_security=['fa:16:3e:85:cb:bb 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '507e9114-34cf-4091-851b-f85f4a8d9687', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d75b3f9f-55df-4cd2-9c54-63280bbc840b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78c62dd51a744172bb1729f604397cc6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7d4b8bdb-44ab-4501-ac8b-0476efc460ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bfd38043-23cb-47f2-8769-74841907507c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=57894cd7-79bd-4e2b-bca9-1da420ad642b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:30:42 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:42.964 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 57894cd7-79bd-4e2b-bca9-1da420ad642b in datapath d75b3f9f-55df-4cd2-9c54-63280bbc840b unbound from our chassis
Oct  2 08:30:42 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:42.966 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d75b3f9f-55df-4cd2-9c54-63280bbc840b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  2 08:30:42 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:42.968 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dbb6b5fd-0cfc-4e9d-b25a-5ea7b63b3533]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:30:42 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:42.968 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b namespace which is not needed anymore
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.967 2 DEBUG nova.virt.hardware [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.967 2 INFO nova.compute.claims [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Claim successful on node compute-1.ctlplane.example.com
Oct  2 08:30:42 np0005466012 nova_compute[192063]: 2025-10-02 12:30:42.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:30:43 np0005466012 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000085.scope: Deactivated successfully.
Oct  2 08:30:43 np0005466012 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000085.scope: Consumed 3.288s CPU time.
Oct  2 08:30:43 np0005466012 systemd-machined[152114]: Machine qemu-62-instance-00000085 terminated.
Oct  2 08:30:43 np0005466012 neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b[241950]: [NOTICE]   (241954) : haproxy version is 2.8.14-c23fe91
Oct  2 08:30:43 np0005466012 neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b[241950]: [NOTICE]   (241954) : path to executable is /usr/sbin/haproxy
Oct  2 08:30:43 np0005466012 neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b[241950]: [WARNING]  (241954) : Exiting Master process...
Oct  2 08:30:43 np0005466012 neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b[241950]: [WARNING]  (241954) : Exiting Master process...
Oct  2 08:30:43 np0005466012 neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b[241950]: [ALERT]    (241954) : Current worker (241956) exited with code 143 (Terminated)
Oct  2 08:30:43 np0005466012 neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b[241950]: [WARNING]  (241954) : All workers exited. Exiting... (0)
Oct  2 08:30:43 np0005466012 systemd[1]: libpod-e0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e.scope: Deactivated successfully.
Oct  2 08:30:43 np0005466012 podman[242086]: 2025-10-02 12:30:43.103655299 +0000 UTC m=+0.053204187 container died e0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.181 2 INFO nova.virt.libvirt.driver [-] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Instance destroyed successfully.
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.182 2 DEBUG nova.objects.instance [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lazy-loading 'resources' on Instance uuid 507e9114-34cf-4091-851b-f85f4a8d9687 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.304 2 DEBUG nova.virt.libvirt.vif [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:30:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-667443961',display_name='tempest-ServerGroupTestJSON-server-667443961',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-667443961',id=133,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:30:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='78c62dd51a744172bb1729f604397cc6',ramdisk_id='',reservation_id='r-krr8luyb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerGroupTestJSON-1947845073',owner_user_name='tempest-ServerGroupTestJSON-1947845073-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:30:40Z,user_data=None,user_id='0d6270103ff9452cb8caedb8f707fde1',uuid=507e9114-34cf-4091-851b-f85f4a8d9687,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "address": "fa:16:3e:85:cb:bb", "network": {"id": "d75b3f9f-55df-4cd2-9c54-63280bbc840b", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1374294592-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78c62dd51a744172bb1729f604397cc6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57894cd7-79", "ovs_interfaceid": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.304 2 DEBUG nova.network.os_vif_util [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Converting VIF {"id": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "address": "fa:16:3e:85:cb:bb", "network": {"id": "d75b3f9f-55df-4cd2-9c54-63280bbc840b", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1374294592-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78c62dd51a744172bb1729f604397cc6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57894cd7-79", "ovs_interfaceid": "57894cd7-79bd-4e2b-bca9-1da420ad642b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.305 2 DEBUG nova.network.os_vif_util [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:cb:bb,bridge_name='br-int',has_traffic_filtering=True,id=57894cd7-79bd-4e2b-bca9-1da420ad642b,network=Network(d75b3f9f-55df-4cd2-9c54-63280bbc840b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57894cd7-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.305 2 DEBUG os_vif [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:cb:bb,bridge_name='br-int',has_traffic_filtering=True,id=57894cd7-79bd-4e2b-bca9-1da420ad642b,network=Network(d75b3f9f-55df-4cd2-9c54-63280bbc840b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57894cd7-79') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.308 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57894cd7-79, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.312 2 INFO os_vif [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:cb:bb,bridge_name='br-int',has_traffic_filtering=True,id=57894cd7-79bd-4e2b-bca9-1da420ad642b,network=Network(d75b3f9f-55df-4cd2-9c54-63280bbc840b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57894cd7-79')
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.313 2 INFO nova.virt.libvirt.driver [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Deleting instance files /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687_del
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.313 2 INFO nova.virt.libvirt.driver [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Deletion of /var/lib/nova/instances/507e9114-34cf-4091-851b-f85f4a8d9687_del complete
Oct  2 08:30:43 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e-userdata-shm.mount: Deactivated successfully.
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.376 2 DEBUG nova.compute.provider_tree [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:30:43 np0005466012 systemd[1]: var-lib-containers-storage-overlay-3e56f04699fb97b74090a5c2aa1575c9c6776a649072d411a8d351b9d7e1c50f-merged.mount: Deactivated successfully.
Oct  2 08:30:43 np0005466012 podman[242086]: 2025-10-02 12:30:43.386101366 +0000 UTC m=+0.335650284 container cleanup e0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.391 2 INFO nova.compute.manager [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Took 0.47 seconds to destroy the instance on the hypervisor.
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.392 2 DEBUG oslo.service.loopingcall [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.392 2 DEBUG nova.compute.manager [-] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.392 2 DEBUG nova.network.neutron [-] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.397 2 DEBUG nova.scheduler.client.report [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.422 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.423 2 DEBUG nova.compute.manager [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:30:43 np0005466012 podman[242130]: 2025-10-02 12:30:43.455871799 +0000 UTC m=+0.049337011 container remove e0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:30:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:43.461 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4f8a4d82-cf27-44e9-b9c7-abdf6e9f11ca]: (4, ('Thu Oct  2 12:30:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b (e0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e)\ne0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e\nThu Oct  2 12:30:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b (e0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e)\ne0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:30:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:43.463 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3c6a11df-e18d-4e82-98cd-f91f561d1918]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:30:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:43.464 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd75b3f9f-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:30:43 np0005466012 kernel: tapd75b3f9f-50: left promiscuous mode
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:30:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:43.481 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[631c87fb-d020-4e11-8f8e-d82af3c16745]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.485 2 DEBUG nova.compute.manager [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.486 2 DEBUG nova.network.neutron [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:30:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:43.511 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[326dfd62-f55e-4320-9c6b-5fb81c4824a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:30:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:43.512 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3bbc56-4879-4a4c-bad4-7571ad168ee2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.525 2 INFO nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:30:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:43.527 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d16fb85d-7c31-491b-b5e1-8922ef3a64b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 622311, 'reachable_time': 18643, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242145, 'error': None, 'target': 'ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:30:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:43.531 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d75b3f9f-55df-4cd2-9c54-63280bbc840b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct  2 08:30:43 np0005466012 systemd[1]: run-netns-ovnmeta\x2dd75b3f9f\x2d55df\x2d4cd2\x2d9c54\x2d63280bbc840b.mount: Deactivated successfully.
Oct  2 08:30:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:43.531 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[49e628b5-bc6c-45c0-96a9-d7da89d43dec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.541 2 DEBUG nova.compute.manager [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:30:43 np0005466012 systemd[1]: libpod-conmon-e0d5ab8b067230be592cc830fc35a61c02ea424d33ebb85ac3371dfe44628c3e.scope: Deactivated successfully.
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.819 2 DEBUG nova.policy [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.925 2 DEBUG nova.compute.manager [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.927 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.927 2 INFO nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Creating image(s)#033[00m
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.928 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "/var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.928 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "/var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.929 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "/var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.945 2 DEBUG oslo_concurrency.processutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.963 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.963 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:30:43 np0005466012 nova_compute[192063]: 2025-10-02 12:30:43.963 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.002 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.002 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.002 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.003 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.003 2 DEBUG oslo_concurrency.processutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.004 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.005 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.017 2 DEBUG oslo_concurrency.processutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.073 2 DEBUG oslo_concurrency.processutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.074 2 DEBUG oslo_concurrency.processutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.109 2 DEBUG oslo_concurrency.processutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.110 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.110 2 DEBUG oslo_concurrency.processutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.167 2 DEBUG oslo_concurrency.processutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.168 2 DEBUG nova.virt.disk.api [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Checking if we can resize image /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.168 2 DEBUG oslo_concurrency.processutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.223 2 DEBUG oslo_concurrency.processutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.225 2 DEBUG nova.virt.disk.api [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Cannot resize image /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.225 2 DEBUG nova.objects.instance [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'migration_context' on Instance uuid 278a6b24-7950-4f1b-9c36-8a6030b17e6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.240 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.241 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Ensure instance console log exists: /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.241 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.241 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.242 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.711 2 DEBUG nova.compute.manager [req-31f7c178-a58b-4941-b372-2c719f3c0164 req-609c376a-e2ba-450b-9307-df0eb1ccadf7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Received event network-vif-unplugged-57894cd7-79bd-4e2b-bca9-1da420ad642b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.712 2 DEBUG oslo_concurrency.lockutils [req-31f7c178-a58b-4941-b372-2c719f3c0164 req-609c376a-e2ba-450b-9307-df0eb1ccadf7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.712 2 DEBUG oslo_concurrency.lockutils [req-31f7c178-a58b-4941-b372-2c719f3c0164 req-609c376a-e2ba-450b-9307-df0eb1ccadf7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.712 2 DEBUG oslo_concurrency.lockutils [req-31f7c178-a58b-4941-b372-2c719f3c0164 req-609c376a-e2ba-450b-9307-df0eb1ccadf7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.712 2 DEBUG nova.compute.manager [req-31f7c178-a58b-4941-b372-2c719f3c0164 req-609c376a-e2ba-450b-9307-df0eb1ccadf7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] No waiting events found dispatching network-vif-unplugged-57894cd7-79bd-4e2b-bca9-1da420ad642b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.713 2 DEBUG nova.compute.manager [req-31f7c178-a58b-4941-b372-2c719f3c0164 req-609c376a-e2ba-450b-9307-df0eb1ccadf7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Received event network-vif-unplugged-57894cd7-79bd-4e2b-bca9-1da420ad642b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.964 2 DEBUG nova.network.neutron [-] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:44 np0005466012 nova_compute[192063]: 2025-10-02 12:30:44.994 2 INFO nova.compute.manager [-] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Took 1.60 seconds to deallocate network for instance.#033[00m
Oct  2 08:30:45 np0005466012 nova_compute[192063]: 2025-10-02 12:30:45.156 2 DEBUG nova.compute.manager [req-3380559d-130a-4bbc-b8ae-24e138e23f44 req-f3e30c5a-a7da-4681-bf00-c7685737eca3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Received event network-vif-deleted-57894cd7-79bd-4e2b-bca9-1da420ad642b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:45 np0005466012 nova_compute[192063]: 2025-10-02 12:30:45.667 2 DEBUG nova.network.neutron [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Successfully created port: c8d25dbc-b58e-4ab0-9b54-141b6fdd352c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:30:45 np0005466012 nova_compute[192063]: 2025-10-02 12:30:45.819 2 DEBUG oslo_concurrency.lockutils [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:45 np0005466012 nova_compute[192063]: 2025-10-02 12:30:45.820 2 DEBUG oslo_concurrency.lockutils [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:45 np0005466012 nova_compute[192063]: 2025-10-02 12:30:45.901 2 DEBUG nova.compute.provider_tree [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:30:45 np0005466012 nova_compute[192063]: 2025-10-02 12:30:45.990 2 DEBUG nova.scheduler.client.report [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:30:46 np0005466012 nova_compute[192063]: 2025-10-02 12:30:46.046 2 DEBUG oslo_concurrency.lockutils [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:46 np0005466012 nova_compute[192063]: 2025-10-02 12:30:46.108 2 INFO nova.scheduler.client.report [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Deleted allocations for instance 507e9114-34cf-4091-851b-f85f4a8d9687#033[00m
Oct  2 08:30:46 np0005466012 nova_compute[192063]: 2025-10-02 12:30:46.745 2 DEBUG oslo_concurrency.lockutils [None req-cc590af3-10b4-4614-8cdd-e2f408fd3fe6 0d6270103ff9452cb8caedb8f707fde1 78c62dd51a744172bb1729f604397cc6 - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.875s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:46 np0005466012 nova_compute[192063]: 2025-10-02 12:30:46.867 2 DEBUG nova.network.neutron [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Successfully updated port: c8d25dbc-b58e-4ab0-9b54-141b6fdd352c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:30:46 np0005466012 nova_compute[192063]: 2025-10-02 12:30:46.969 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:46 np0005466012 nova_compute[192063]: 2025-10-02 12:30:46.969 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquired lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:46 np0005466012 nova_compute[192063]: 2025-10-02 12:30:46.970 2 DEBUG nova.network.neutron [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:30:47 np0005466012 nova_compute[192063]: 2025-10-02 12:30:47.052 2 DEBUG nova.compute.manager [req-5275a9f5-bbde-4919-8c11-91c178cc42d4 req-41e35e42-58ea-4f33-8f76-79c86fc01656 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Received event network-changed-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:47 np0005466012 nova_compute[192063]: 2025-10-02 12:30:47.052 2 DEBUG nova.compute.manager [req-5275a9f5-bbde-4919-8c11-91c178cc42d4 req-41e35e42-58ea-4f33-8f76-79c86fc01656 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Refreshing instance network info cache due to event network-changed-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:30:47 np0005466012 nova_compute[192063]: 2025-10-02 12:30:47.052 2 DEBUG oslo_concurrency.lockutils [req-5275a9f5-bbde-4919-8c11-91c178cc42d4 req-41e35e42-58ea-4f33-8f76-79c86fc01656 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:47 np0005466012 nova_compute[192063]: 2025-10-02 12:30:47.053 2 DEBUG nova.compute.manager [req-d157a501-ed81-4d4c-912b-7e0514b6e5aa req-3a9f7426-3765-4c29-91c1-e3fe176159a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Received event network-vif-plugged-57894cd7-79bd-4e2b-bca9-1da420ad642b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:47 np0005466012 nova_compute[192063]: 2025-10-02 12:30:47.053 2 DEBUG oslo_concurrency.lockutils [req-d157a501-ed81-4d4c-912b-7e0514b6e5aa req-3a9f7426-3765-4c29-91c1-e3fe176159a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:47 np0005466012 nova_compute[192063]: 2025-10-02 12:30:47.054 2 DEBUG oslo_concurrency.lockutils [req-d157a501-ed81-4d4c-912b-7e0514b6e5aa req-3a9f7426-3765-4c29-91c1-e3fe176159a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:47 np0005466012 nova_compute[192063]: 2025-10-02 12:30:47.054 2 DEBUG oslo_concurrency.lockutils [req-d157a501-ed81-4d4c-912b-7e0514b6e5aa req-3a9f7426-3765-4c29-91c1-e3fe176159a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "507e9114-34cf-4091-851b-f85f4a8d9687-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:47 np0005466012 nova_compute[192063]: 2025-10-02 12:30:47.054 2 DEBUG nova.compute.manager [req-d157a501-ed81-4d4c-912b-7e0514b6e5aa req-3a9f7426-3765-4c29-91c1-e3fe176159a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] No waiting events found dispatching network-vif-plugged-57894cd7-79bd-4e2b-bca9-1da420ad642b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:47 np0005466012 nova_compute[192063]: 2025-10-02 12:30:47.055 2 WARNING nova.compute.manager [req-d157a501-ed81-4d4c-912b-7e0514b6e5aa req-3a9f7426-3765-4c29-91c1-e3fe176159a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Received unexpected event network-vif-plugged-57894cd7-79bd-4e2b-bca9-1da420ad642b for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:30:47 np0005466012 nova_compute[192063]: 2025-10-02 12:30:47.393 2 DEBUG nova.network.neutron [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:30:47 np0005466012 nova_compute[192063]: 2025-10-02 12:30:47.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:48 np0005466012 nova_compute[192063]: 2025-10-02 12:30:48.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.834 2 DEBUG nova.network.neutron [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Updating instance_info_cache with network_info: [{"id": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "address": "fa:16:3e:28:1d:9e", "network": {"id": "59ce980f-674e-478f-8a88-32561beb276a", "bridge": "br-int", "label": "tempest-network-smoke--1065656187", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d25dbc-b5", "ovs_interfaceid": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.883 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Releasing lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.884 2 DEBUG nova.compute.manager [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Instance network_info: |[{"id": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "address": "fa:16:3e:28:1d:9e", "network": {"id": "59ce980f-674e-478f-8a88-32561beb276a", "bridge": "br-int", "label": "tempest-network-smoke--1065656187", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d25dbc-b5", "ovs_interfaceid": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.884 2 DEBUG oslo_concurrency.lockutils [req-5275a9f5-bbde-4919-8c11-91c178cc42d4 req-41e35e42-58ea-4f33-8f76-79c86fc01656 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.884 2 DEBUG nova.network.neutron [req-5275a9f5-bbde-4919-8c11-91c178cc42d4 req-41e35e42-58ea-4f33-8f76-79c86fc01656 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Refreshing network info cache for port c8d25dbc-b58e-4ab0-9b54-141b6fdd352c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.887 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Start _get_guest_xml network_info=[{"id": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "address": "fa:16:3e:28:1d:9e", "network": {"id": "59ce980f-674e-478f-8a88-32561beb276a", "bridge": "br-int", "label": "tempest-network-smoke--1065656187", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d25dbc-b5", "ovs_interfaceid": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.892 2 WARNING nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.955 2 DEBUG nova.virt.libvirt.host [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.956 2 DEBUG nova.virt.libvirt.host [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.960 2 DEBUG nova.virt.libvirt.host [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.961 2 DEBUG nova.virt.libvirt.host [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.962 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.962 2 DEBUG nova.virt.hardware [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.963 2 DEBUG nova.virt.hardware [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.963 2 DEBUG nova.virt.hardware [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.963 2 DEBUG nova.virt.hardware [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.963 2 DEBUG nova.virt.hardware [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.964 2 DEBUG nova.virt.hardware [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.964 2 DEBUG nova.virt.hardware [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.964 2 DEBUG nova.virt.hardware [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.964 2 DEBUG nova.virt.hardware [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.964 2 DEBUG nova.virt.hardware [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.965 2 DEBUG nova.virt.hardware [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.968 2 DEBUG nova.virt.libvirt.vif [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:30:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1991691227',display_name='tempest-TestNetworkBasicOps-server-1991691227',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1991691227',id=136,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNC/wrQNlb6KlN5Z6XL9qrRcQ/3+2UhlzIST7vbT+IX6q3g/aHALXbcl4WHv8B5x3DO8YryOAwWxfCs4B/hC3zlm+AKdkTIKvBvev1JrZYrxqMSTvoM5lzVVNSiunRoLZw==',key_name='tempest-TestNetworkBasicOps-862184460',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-gm2ceg0s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:30:43Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=278a6b24-7950-4f1b-9c36-8a6030b17e6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "address": "fa:16:3e:28:1d:9e", "network": {"id": "59ce980f-674e-478f-8a88-32561beb276a", "bridge": "br-int", "label": "tempest-network-smoke--1065656187", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d25dbc-b5", "ovs_interfaceid": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.968 2 DEBUG nova.network.os_vif_util [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "address": "fa:16:3e:28:1d:9e", "network": {"id": "59ce980f-674e-478f-8a88-32561beb276a", "bridge": "br-int", "label": "tempest-network-smoke--1065656187", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d25dbc-b5", "ovs_interfaceid": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.969 2 DEBUG nova.network.os_vif_util [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:1d:9e,bridge_name='br-int',has_traffic_filtering=True,id=c8d25dbc-b58e-4ab0-9b54-141b6fdd352c,network=Network(59ce980f-674e-478f-8a88-32561beb276a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d25dbc-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:49 np0005466012 nova_compute[192063]: 2025-10-02 12:30:49.969 2 DEBUG nova.objects.instance [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 278a6b24-7950-4f1b-9c36-8a6030b17e6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.001 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  <uuid>278a6b24-7950-4f1b-9c36-8a6030b17e6d</uuid>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  <name>instance-00000088</name>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestNetworkBasicOps-server-1991691227</nova:name>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:30:49</nova:creationTime>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:30:50 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:        <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:        <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:        <nova:port uuid="c8d25dbc-b58e-4ab0-9b54-141b6fdd352c">
Oct  2 08:30:50 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <entry name="serial">278a6b24-7950-4f1b-9c36-8a6030b17e6d</entry>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <entry name="uuid">278a6b24-7950-4f1b-9c36-8a6030b17e6d</entry>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.config"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:28:1d:9e"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <target dev="tapc8d25dbc-b5"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/console.log" append="off"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:30:50 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:30:50 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:30:50 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:30:50 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.003 2 DEBUG nova.compute.manager [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Preparing to wait for external event network-vif-plugged-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.004 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.004 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.005 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.006 2 DEBUG nova.virt.libvirt.vif [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:30:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1991691227',display_name='tempest-TestNetworkBasicOps-server-1991691227',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1991691227',id=136,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNC/wrQNlb6KlN5Z6XL9qrRcQ/3+2UhlzIST7vbT+IX6q3g/aHALXbcl4WHv8B5x3DO8YryOAwWxfCs4B/hC3zlm+AKdkTIKvBvev1JrZYrxqMSTvoM5lzVVNSiunRoLZw==',key_name='tempest-TestNetworkBasicOps-862184460',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-gm2ceg0s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:30:43Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=278a6b24-7950-4f1b-9c36-8a6030b17e6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "address": "fa:16:3e:28:1d:9e", "network": {"id": "59ce980f-674e-478f-8a88-32561beb276a", "bridge": "br-int", "label": "tempest-network-smoke--1065656187", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d25dbc-b5", "ovs_interfaceid": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.006 2 DEBUG nova.network.os_vif_util [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "address": "fa:16:3e:28:1d:9e", "network": {"id": "59ce980f-674e-478f-8a88-32561beb276a", "bridge": "br-int", "label": "tempest-network-smoke--1065656187", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d25dbc-b5", "ovs_interfaceid": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.007 2 DEBUG nova.network.os_vif_util [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:1d:9e,bridge_name='br-int',has_traffic_filtering=True,id=c8d25dbc-b58e-4ab0-9b54-141b6fdd352c,network=Network(59ce980f-674e-478f-8a88-32561beb276a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d25dbc-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.007 2 DEBUG os_vif [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:1d:9e,bridge_name='br-int',has_traffic_filtering=True,id=c8d25dbc-b58e-4ab0-9b54-141b6fdd352c,network=Network(59ce980f-674e-478f-8a88-32561beb276a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d25dbc-b5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.009 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.009 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.013 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8d25dbc-b5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.013 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc8d25dbc-b5, col_values=(('external_ids', {'iface-id': 'c8d25dbc-b58e-4ab0-9b54-141b6fdd352c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:28:1d:9e', 'vm-uuid': '278a6b24-7950-4f1b-9c36-8a6030b17e6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:50 np0005466012 NetworkManager[51207]: <info>  [1759408250.0167] manager: (tapc8d25dbc-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/238)
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.024 2 INFO os_vif [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:1d:9e,bridge_name='br-int',has_traffic_filtering=True,id=c8d25dbc-b58e-4ab0-9b54-141b6fdd352c,network=Network(59ce980f-674e-478f-8a88-32561beb276a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d25dbc-b5')#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.090 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.092 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.092 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No VIF found with MAC fa:16:3e:28:1d:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.093 2 INFO nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Using config drive#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.965 2 INFO nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Creating config drive at /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.config#033[00m
Oct  2 08:30:50 np0005466012 nova_compute[192063]: 2025-10-02 12:30:50.974 2 DEBUG oslo_concurrency.processutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqeg9hl6w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:30:51 np0005466012 nova_compute[192063]: 2025-10-02 12:30:51.106 2 DEBUG oslo_concurrency.processutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqeg9hl6w" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:30:51 np0005466012 podman[242171]: 2025-10-02 12:30:51.149975074 +0000 UTC m=+0.069024273 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  2 08:30:51 np0005466012 podman[242170]: 2025-10-02 12:30:51.150717175 +0000 UTC m=+0.069814495 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd)
Oct  2 08:30:51 np0005466012 kernel: tapc8d25dbc-b5: entered promiscuous mode
Oct  2 08:30:51 np0005466012 NetworkManager[51207]: <info>  [1759408251.1694] manager: (tapc8d25dbc-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/239)
Oct  2 08:30:51 np0005466012 nova_compute[192063]: 2025-10-02 12:30:51.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:51Z|00525|binding|INFO|Claiming lport c8d25dbc-b58e-4ab0-9b54-141b6fdd352c for this chassis.
Oct  2 08:30:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:51Z|00526|binding|INFO|c8d25dbc-b58e-4ab0-9b54-141b6fdd352c: Claiming fa:16:3e:28:1d:9e 10.100.0.7
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.186 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:1d:9e 10.100.0.7'], port_security=['fa:16:3e:28:1d:9e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59ce980f-674e-478f-8a88-32561beb276a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c9faaa8-b796-47ef-a5cc-9fcd05d197d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30835f85-fac7-4f1d-86c6-e93cfff628ff, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=c8d25dbc-b58e-4ab0-9b54-141b6fdd352c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.188 103246 INFO neutron.agent.ovn.metadata.agent [-] Port c8d25dbc-b58e-4ab0-9b54-141b6fdd352c in datapath 59ce980f-674e-478f-8a88-32561beb276a bound to our chassis#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.190 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 59ce980f-674e-478f-8a88-32561beb276a#033[00m
Oct  2 08:30:51 np0005466012 systemd-udevd[242219]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.203 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[08345cdc-4418-4af5-b268-150cd8da2d87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.204 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap59ce980f-61 in ovnmeta-59ce980f-674e-478f-8a88-32561beb276a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.206 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap59ce980f-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.206 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[35c6856d-c08c-4051-a397-563a68e64c63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.207 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3822c994-b0ea-4c88-bd31-8af88ae96c4c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 NetworkManager[51207]: <info>  [1759408251.2148] device (tapc8d25dbc-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:30:51 np0005466012 NetworkManager[51207]: <info>  [1759408251.2156] device (tapc8d25dbc-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.220 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[102658b1-e294-4f01-ad2f-50378b22b2ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 systemd-machined[152114]: New machine qemu-63-instance-00000088.
Oct  2 08:30:51 np0005466012 nova_compute[192063]: 2025-10-02 12:30:51.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.239 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[abe3c6f8-162a-4059-961b-5633f27e6a44]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:51Z|00527|binding|INFO|Setting lport c8d25dbc-b58e-4ab0-9b54-141b6fdd352c ovn-installed in OVS
Oct  2 08:30:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:51Z|00528|binding|INFO|Setting lport c8d25dbc-b58e-4ab0-9b54-141b6fdd352c up in Southbound
Oct  2 08:30:51 np0005466012 systemd[1]: Started Virtual Machine qemu-63-instance-00000088.
Oct  2 08:30:51 np0005466012 nova_compute[192063]: 2025-10-02 12:30:51.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.276 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[4fadc475-4299-4049-abd6-ab4da9a92d99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 NetworkManager[51207]: <info>  [1759408251.2822] manager: (tap59ce980f-60): new Veth device (/org/freedesktop/NetworkManager/Devices/240)
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.282 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[60d89d2d-16db-47b5-9a12-4511b4a6a2a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.314 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[78154cf9-27a8-4cde-9d67-a92f6b4d84d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.317 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f41693-dadb-4233-9b0a-4618068fdf5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 NetworkManager[51207]: <info>  [1759408251.3407] device (tap59ce980f-60): carrier: link connected
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.346 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[96d32d42-1b6a-43f8-b1a7-5ed12de99d32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.362 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[917bf63d-26d8-4fd0-a4d1-0e85444417e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap59ce980f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:95:33:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624495, 'reachable_time': 30771, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242255, 'error': None, 'target': 'ovnmeta-59ce980f-674e-478f-8a88-32561beb276a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.376 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8e9b09a4-70a4-41cb-b615-9994ebc006a5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe95:33d9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624495, 'tstamp': 624495}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242256, 'error': None, 'target': 'ovnmeta-59ce980f-674e-478f-8a88-32561beb276a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.391 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[203428b9-a2f7-40f1-b400-92105c89de93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap59ce980f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:95:33:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624495, 'reachable_time': 30771, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242257, 'error': None, 'target': 'ovnmeta-59ce980f-674e-478f-8a88-32561beb276a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.421 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[514f5a18-997c-49ec-83e1-0e1e8a20d7d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.484 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a0df1207-8961-4444-9144-3719c20c1e9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.487 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap59ce980f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.488 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.488 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap59ce980f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:51 np0005466012 kernel: tap59ce980f-60: entered promiscuous mode
Oct  2 08:30:51 np0005466012 nova_compute[192063]: 2025-10-02 12:30:51.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.495 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap59ce980f-60, col_values=(('external_ids', {'iface-id': '42ed37b7-bd98-49bd-999b-805cbf42cd4d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:30:51 np0005466012 NetworkManager[51207]: <info>  [1759408251.4967] manager: (tap59ce980f-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/241)
Oct  2 08:30:51 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:51Z|00529|binding|INFO|Releasing lport 42ed37b7-bd98-49bd-999b-805cbf42cd4d from this chassis (sb_readonly=0)
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.499 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/59ce980f-674e-478f-8a88-32561beb276a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/59ce980f-674e-478f-8a88-32561beb276a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:30:51 np0005466012 nova_compute[192063]: 2025-10-02 12:30:51.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.502 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[89482479-7459-4f2d-8081-ffb2611ab056]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.503 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-59ce980f-674e-478f-8a88-32561beb276a
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/59ce980f-674e-478f-8a88-32561beb276a.pid.haproxy
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 59ce980f-674e-478f-8a88-32561beb276a
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:30:51 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:30:51.507 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-59ce980f-674e-478f-8a88-32561beb276a', 'env', 'PROCESS_TAG=haproxy-59ce980f-674e-478f-8a88-32561beb276a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/59ce980f-674e-478f-8a88-32561beb276a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:30:51 np0005466012 nova_compute[192063]: 2025-10-02 12:30:51.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:51 np0005466012 podman[242295]: 2025-10-02 12:30:51.942577674 +0000 UTC m=+0.104551103 container create 36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:30:51 np0005466012 podman[242295]: 2025-10-02 12:30:51.861454668 +0000 UTC m=+0.023428117 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:30:52 np0005466012 systemd[1]: Started libpod-conmon-36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370.scope.
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.016 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408252.0162861, 278a6b24-7950-4f1b-9c36-8a6030b17e6d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.018 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] VM Started (Lifecycle Event)#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.021 2 DEBUG nova.compute.manager [req-c6000ad9-bf67-4b58-a975-3c81e98289f4 req-fc9f820e-e994-4369-9ea5-de7f7671337e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Received event network-vif-plugged-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.022 2 DEBUG oslo_concurrency.lockutils [req-c6000ad9-bf67-4b58-a975-3c81e98289f4 req-fc9f820e-e994-4369-9ea5-de7f7671337e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.022 2 DEBUG oslo_concurrency.lockutils [req-c6000ad9-bf67-4b58-a975-3c81e98289f4 req-fc9f820e-e994-4369-9ea5-de7f7671337e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.023 2 DEBUG oslo_concurrency.lockutils [req-c6000ad9-bf67-4b58-a975-3c81e98289f4 req-fc9f820e-e994-4369-9ea5-de7f7671337e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.023 2 DEBUG nova.compute.manager [req-c6000ad9-bf67-4b58-a975-3c81e98289f4 req-fc9f820e-e994-4369-9ea5-de7f7671337e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Processing event network-vif-plugged-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.023 2 DEBUG nova.compute.manager [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.029 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.032 2 INFO nova.virt.libvirt.driver [-] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Instance spawned successfully.#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.033 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:30:52 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.040 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.042 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:30:52 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09752d3289c5233dc5391dbfab852872fe342ce61e842169037da274f5c78ed2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:30:52 np0005466012 podman[242295]: 2025-10-02 12:30:52.055007694 +0000 UTC m=+0.216981143 container init 36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.060 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:52 np0005466012 podman[242295]: 2025-10-02 12:30:52.061152094 +0000 UTC m=+0.223125523 container start 36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.061 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.062 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.063 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.064 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.064 2 DEBUG nova.virt.libvirt.driver [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.076 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.077 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408252.0164044, 278a6b24-7950-4f1b-9c36-8a6030b17e6d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.077 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:30:52 np0005466012 neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a[242310]: [NOTICE]   (242314) : New worker (242316) forked
Oct  2 08:30:52 np0005466012 neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a[242310]: [NOTICE]   (242314) : Loading success.
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.123 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.127 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408252.026543, 278a6b24-7950-4f1b-9c36-8a6030b17e6d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.128 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.151 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.155 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.162 2 INFO nova.compute.manager [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Took 8.24 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.163 2 DEBUG nova.compute.manager [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.181 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.283 2 INFO nova.compute.manager [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Took 9.51 seconds to build instance.#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.309 2 DEBUG oslo_concurrency.lockutils [None req-fbb1f405-810b-4f28-a602-4fd6850fafcf a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:52 np0005466012 nova_compute[192063]: 2025-10-02 12:30:52.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:53 np0005466012 nova_compute[192063]: 2025-10-02 12:30:53.054 2 DEBUG nova.network.neutron [req-5275a9f5-bbde-4919-8c11-91c178cc42d4 req-41e35e42-58ea-4f33-8f76-79c86fc01656 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Updated VIF entry in instance network info cache for port c8d25dbc-b58e-4ab0-9b54-141b6fdd352c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:30:53 np0005466012 nova_compute[192063]: 2025-10-02 12:30:53.055 2 DEBUG nova.network.neutron [req-5275a9f5-bbde-4919-8c11-91c178cc42d4 req-41e35e42-58ea-4f33-8f76-79c86fc01656 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Updating instance_info_cache with network_info: [{"id": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "address": "fa:16:3e:28:1d:9e", "network": {"id": "59ce980f-674e-478f-8a88-32561beb276a", "bridge": "br-int", "label": "tempest-network-smoke--1065656187", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d25dbc-b5", "ovs_interfaceid": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:53 np0005466012 nova_compute[192063]: 2025-10-02 12:30:53.072 2 DEBUG oslo_concurrency.lockutils [req-5275a9f5-bbde-4919-8c11-91c178cc42d4 req-41e35e42-58ea-4f33-8f76-79c86fc01656 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:30:54 np0005466012 podman[242326]: 2025-10-02 12:30:54.144682081 +0000 UTC m=+0.055436579 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:30:54 np0005466012 nova_compute[192063]: 2025-10-02 12:30:54.152 2 DEBUG nova.compute.manager [req-bd069321-7f4a-4508-9264-0d0889c32198 req-9e107d14-8164-498a-86c8-8c0016ee5c68 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Received event network-vif-plugged-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:54 np0005466012 nova_compute[192063]: 2025-10-02 12:30:54.153 2 DEBUG oslo_concurrency.lockutils [req-bd069321-7f4a-4508-9264-0d0889c32198 req-9e107d14-8164-498a-86c8-8c0016ee5c68 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:30:54 np0005466012 nova_compute[192063]: 2025-10-02 12:30:54.153 2 DEBUG oslo_concurrency.lockutils [req-bd069321-7f4a-4508-9264-0d0889c32198 req-9e107d14-8164-498a-86c8-8c0016ee5c68 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:30:54 np0005466012 nova_compute[192063]: 2025-10-02 12:30:54.153 2 DEBUG oslo_concurrency.lockutils [req-bd069321-7f4a-4508-9264-0d0889c32198 req-9e107d14-8164-498a-86c8-8c0016ee5c68 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:30:54 np0005466012 nova_compute[192063]: 2025-10-02 12:30:54.153 2 DEBUG nova.compute.manager [req-bd069321-7f4a-4508-9264-0d0889c32198 req-9e107d14-8164-498a-86c8-8c0016ee5c68 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] No waiting events found dispatching network-vif-plugged-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:30:54 np0005466012 nova_compute[192063]: 2025-10-02 12:30:54.153 2 WARNING nova.compute.manager [req-bd069321-7f4a-4508-9264-0d0889c32198 req-9e107d14-8164-498a-86c8-8c0016ee5c68 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Received unexpected event network-vif-plugged-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c for instance with vm_state active and task_state None.#033[00m
Oct  2 08:30:54 np0005466012 podman[242325]: 2025-10-02 12:30:54.165586747 +0000 UTC m=+0.083012029 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:30:55 np0005466012 nova_compute[192063]: 2025-10-02 12:30:55.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:55Z|00530|binding|INFO|Releasing lport 42ed37b7-bd98-49bd-999b-805cbf42cd4d from this chassis (sb_readonly=0)
Oct  2 08:30:55 np0005466012 nova_compute[192063]: 2025-10-02 12:30:55.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:57 np0005466012 nova_compute[192063]: 2025-10-02 12:30:57.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:57 np0005466012 NetworkManager[51207]: <info>  [1759408257.0432] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/242)
Oct  2 08:30:57 np0005466012 NetworkManager[51207]: <info>  [1759408257.0441] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/243)
Oct  2 08:30:57 np0005466012 nova_compute[192063]: 2025-10-02 12:30:57.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:57 np0005466012 ovn_controller[94284]: 2025-10-02T12:30:57Z|00531|binding|INFO|Releasing lport 42ed37b7-bd98-49bd-999b-805cbf42cd4d from this chassis (sb_readonly=0)
Oct  2 08:30:57 np0005466012 nova_compute[192063]: 2025-10-02 12:30:57.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:57 np0005466012 nova_compute[192063]: 2025-10-02 12:30:57.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:30:57 np0005466012 nova_compute[192063]: 2025-10-02 12:30:57.447 2 DEBUG nova.compute.manager [req-d769aaf1-269a-4e1c-bf9a-a1646687b45b req-45f0176e-12ac-47ef-ae49-7d93c00df74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Received event network-changed-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:30:57 np0005466012 nova_compute[192063]: 2025-10-02 12:30:57.448 2 DEBUG nova.compute.manager [req-d769aaf1-269a-4e1c-bf9a-a1646687b45b req-45f0176e-12ac-47ef-ae49-7d93c00df74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Refreshing instance network info cache due to event network-changed-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:30:57 np0005466012 nova_compute[192063]: 2025-10-02 12:30:57.448 2 DEBUG oslo_concurrency.lockutils [req-d769aaf1-269a-4e1c-bf9a-a1646687b45b req-45f0176e-12ac-47ef-ae49-7d93c00df74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:30:57 np0005466012 nova_compute[192063]: 2025-10-02 12:30:57.449 2 DEBUG oslo_concurrency.lockutils [req-d769aaf1-269a-4e1c-bf9a-a1646687b45b req-45f0176e-12ac-47ef-ae49-7d93c00df74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:30:57 np0005466012 nova_compute[192063]: 2025-10-02 12:30:57.449 2 DEBUG nova.network.neutron [req-d769aaf1-269a-4e1c-bf9a-a1646687b45b req-45f0176e-12ac-47ef-ae49-7d93c00df74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Refreshing network info cache for port c8d25dbc-b58e-4ab0-9b54-141b6fdd352c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:30:58 np0005466012 nova_compute[192063]: 2025-10-02 12:30:58.180 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408243.1788766, 507e9114-34cf-4091-851b-f85f4a8d9687 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:30:58 np0005466012 nova_compute[192063]: 2025-10-02 12:30:58.180 2 INFO nova.compute.manager [-] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:30:58 np0005466012 nova_compute[192063]: 2025-10-02 12:30:58.293 2 DEBUG nova.compute.manager [None req-2be8a540-a676-4b3f-a932-a810ac226018 - - - - - -] [instance: 507e9114-34cf-4091-851b-f85f4a8d9687] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:30:58 np0005466012 nova_compute[192063]: 2025-10-02 12:30:58.918 2 DEBUG nova.network.neutron [req-d769aaf1-269a-4e1c-bf9a-a1646687b45b req-45f0176e-12ac-47ef-ae49-7d93c00df74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Updated VIF entry in instance network info cache for port c8d25dbc-b58e-4ab0-9b54-141b6fdd352c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:30:58 np0005466012 nova_compute[192063]: 2025-10-02 12:30:58.919 2 DEBUG nova.network.neutron [req-d769aaf1-269a-4e1c-bf9a-a1646687b45b req-45f0176e-12ac-47ef-ae49-7d93c00df74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Updating instance_info_cache with network_info: [{"id": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "address": "fa:16:3e:28:1d:9e", "network": {"id": "59ce980f-674e-478f-8a88-32561beb276a", "bridge": "br-int", "label": "tempest-network-smoke--1065656187", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d25dbc-b5", "ovs_interfaceid": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:30:59 np0005466012 nova_compute[192063]: 2025-10-02 12:30:59.025 2 DEBUG oslo_concurrency.lockutils [req-d769aaf1-269a-4e1c-bf9a-a1646687b45b req-45f0176e-12ac-47ef-ae49-7d93c00df74d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:00 np0005466012 nova_compute[192063]: 2025-10-02 12:31:00.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:02.145 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:02.146 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:02.147 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:02 np0005466012 nova_compute[192063]: 2025-10-02 12:31:02.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:03 np0005466012 nova_compute[192063]: 2025-10-02 12:31:03.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:04.072 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:31:04 np0005466012 nova_compute[192063]: 2025-10-02 12:31:04.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:04.073 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:31:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:04.074 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:04Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:28:1d:9e 10.100.0.7
Oct  2 08:31:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:04Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:28:1d:9e 10.100.0.7
Oct  2 08:31:05 np0005466012 nova_compute[192063]: 2025-10-02 12:31:05.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:06 np0005466012 nova_compute[192063]: 2025-10-02 12:31:06.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:07 np0005466012 nova_compute[192063]: 2025-10-02 12:31:07.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:09 np0005466012 podman[242381]: 2025-10-02 12:31:09.158722265 +0000 UTC m=+0.054834164 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:31:09 np0005466012 podman[242382]: 2025-10-02 12:31:09.188812861 +0000 UTC m=+0.085449024 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Oct  2 08:31:10 np0005466012 nova_compute[192063]: 2025-10-02 12:31:10.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:12 np0005466012 nova_compute[192063]: 2025-10-02 12:31:12.000 2 INFO nova.compute.manager [None req-b0f784ee-ba44-4b1c-acad-edd3efb1f5e0 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Get console output#033[00m
Oct  2 08:31:12 np0005466012 nova_compute[192063]: 2025-10-02 12:31:12.008 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:31:12 np0005466012 podman[242429]: 2025-10-02 12:31:12.174474965 +0000 UTC m=+0.075208578 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm)
Oct  2 08:31:12 np0005466012 podman[242430]: 2025-10-02 12:31:12.200375255 +0000 UTC m=+0.086160484 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:31:12 np0005466012 nova_compute[192063]: 2025-10-02 12:31:12.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:15 np0005466012 nova_compute[192063]: 2025-10-02 12:31:15.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.926 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'name': 'tempest-TestNetworkBasicOps-server-1991691227', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000088', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6e2a4899168a47618e377cb3ac85ddd2', 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'hostId': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.931 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 278a6b24-7950-4f1b-9c36-8a6030b17e6d / tapc8d25dbc-b5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.931 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '969efde3-869a-46f8-9568-7c91fcb37c6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-00000088-278a6b24-7950-4f1b-9c36-8a6030b17e6d-tapc8d25dbc-b5', 'timestamp': '2025-10-02T12:31:16.927111', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'tapc8d25dbc-b5', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:1d:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8d25dbc-b5'}, 'message_id': 'b11e6db4-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.603465993, 'message_signature': '5aff0f37c07403d9018c6e50db1da6a5fa3a509556944d6f9f63e21bcc37a8b8'}]}, 'timestamp': '2025-10-02 12:31:16.931902', '_unique_id': '088f888c36f9485ca8d88c7e55cec4a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.932 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.933 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.949 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.write.bytes volume: 72921088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.950 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd01677c3-e841-416d-987f-bac3e233d268', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72921088, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-vda', 'timestamp': '2025-10-02T12:31:16.933857', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1213b98-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.610223421, 'message_signature': '3cd3f170c91509f4a8782ceb4d2d2ca8ed645c30f7a79eb10ab6a6b6e7de74cd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-sda', 'timestamp': '2025-10-02T12:31:16.933857', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1214e8a-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.610223421, 'message_signature': '06f8aebffd9d91d0b2bac3a20e91534b1574a85d2bdd000612cc517d18310c73'}]}, 'timestamp': '2025-10-02 12:31:16.950872', '_unique_id': '740faf07c6224b7f9f6e7dfcf04c9c8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.952 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.953 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.953 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46183228-3957-4d1a-b3b5-532b95d81175', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-00000088-278a6b24-7950-4f1b-9c36-8a6030b17e6d-tapc8d25dbc-b5', 'timestamp': '2025-10-02T12:31:16.953451', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'tapc8d25dbc-b5', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:1d:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8d25dbc-b5'}, 'message_id': 'b121c98c-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.603465993, 'message_signature': 'a39931185db97dcd5e1f52e95c10f0bdc8df4ce01d282532ac1c899c7d93c2a7'}]}, 'timestamp': '2025-10-02 12:31:16.953980', '_unique_id': 'b5fad8cfd48e473aada5a256e93e7844'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.955 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.956 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.956 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95b40f53-8384-446f-9e86-764eef9d9b21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-00000088-278a6b24-7950-4f1b-9c36-8a6030b17e6d-tapc8d25dbc-b5', 'timestamp': '2025-10-02T12:31:16.956342', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'tapc8d25dbc-b5', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:1d:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8d25dbc-b5'}, 'message_id': 'b1223868-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.603465993, 'message_signature': 'a8bc7c361eb7381e26765b9a5f4c12590caf467c33adea2c05476bfa68caf016'}]}, 'timestamp': '2025-10-02 12:31:16.956854', '_unique_id': 'ae5596d2be264e8a9496cc30e1daa596'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.957 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.959 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.968 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.968 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f053eb4-0c39-4ec6-9acb-9cd32ed0e427', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-vda', 'timestamp': '2025-10-02T12:31:16.959163', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b124135e-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.635555494, 'message_signature': '8f5d0320fca893e0189ccfcd19b6afe3287b3b1fe31ac4d5155edd2af3991e1a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-sda', 'timestamp': '2025-10-02T12:31:16.959163', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b124254c-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.635555494, 'message_signature': 'dd0b58bde611eac5fdb70b1de986d672c86e278747513172185c791384c21842'}]}, 'timestamp': '2025-10-02 12:31:16.969403', '_unique_id': '85173f9557c8410e9df1321481c45ed5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.970 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.971 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.971 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/network.outgoing.packets volume: 106 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54b6a5fc-453c-4a93-acbc-1a3c1f4b4d0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 106, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-00000088-278a6b24-7950-4f1b-9c36-8a6030b17e6d-tapc8d25dbc-b5', 'timestamp': '2025-10-02T12:31:16.971931', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'tapc8d25dbc-b5', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:1d:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8d25dbc-b5'}, 'message_id': 'b12499aa-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.603465993, 'message_signature': '33fe5c2570fb88c3810cb35942115691c8421bb6cb7dae40cd567888362d1a0b'}]}, 'timestamp': '2025-10-02 12:31:16.972412', '_unique_id': '716282f605ff4468b9da2b482346c9c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.973 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.974 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.974 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.975 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b29fa3a-5e32-4714-8ec7-6605ad728c27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-vda', 'timestamp': '2025-10-02T12:31:16.974792', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1250926-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.610223421, 'message_signature': '9096457985461a14d245b571af559823b637868e77f07ccb82932f1e23df1d00'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-sda', 'timestamp': '2025-10-02T12:31:16.974792', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b12519b6-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.610223421, 'message_signature': '06539749a405e065f37b0c6e0d02623bc7f1dbf838bbfb562b06d9de0aab668c'}]}, 'timestamp': '2025-10-02 12:31:16.975654', '_unique_id': '6816c375023b4152912dfdacc637ebc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.976 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.977 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.978 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.write.requests volume: 282 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.978 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2094855a-8e2b-4963-8ca7-55d461accbac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 282, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-vda', 'timestamp': '2025-10-02T12:31:16.978032', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1258784-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.610223421, 'message_signature': '251caf1383454e705d038122264334fa4cc8ad964ff5f52768eb495f1b80566a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-sda', 'timestamp': '2025-10-02T12:31:16.978032', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b12599ea-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.610223421, 'message_signature': 'cdd974c004920f2c045a9142f2d2b83657d970d2ecf02f2be7abd33cd3586b6d'}]}, 'timestamp': '2025-10-02 12:31:16.978941', '_unique_id': '47cc311ca81c49398798fcc47470d86d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.979 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.981 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.981 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.981 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1991691227>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1991691227>]
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.982 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.982 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.982 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1991691227>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1991691227>]
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.982 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.982 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.read.latency volume: 539788416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.983 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.read.latency volume: 38778550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a1a14b8-cef9-4cb8-a727-302a15ba0ad4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 539788416, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-vda', 'timestamp': '2025-10-02T12:31:16.982878', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1264520-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.610223421, 'message_signature': 'c733edb0d72ae53e661ea8200114b99d553bd684cdd705f128ea90b0ab2986cb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38778550, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-sda', 'timestamp': '2025-10-02T12:31:16.982878', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b12655b0-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.610223421, 'message_signature': '279aaef95abe9421d4206b21002c20610e9348d79cc2f0f7b27a0ee6fe9a1671'}]}, 'timestamp': '2025-10-02 12:31:16.983780', '_unique_id': 'f3981153fed743318df38e4cb6e962c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.984 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.985 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.986 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/network.incoming.packets volume: 105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '894add30-e8c0-49da-baa9-5e02fd0a2e36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 105, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-00000088-278a6b24-7950-4f1b-9c36-8a6030b17e6d-tapc8d25dbc-b5', 'timestamp': '2025-10-02T12:31:16.986131', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'tapc8d25dbc-b5', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:1d:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8d25dbc-b5'}, 'message_id': 'b126c46e-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.603465993, 'message_signature': '5558c5d000ed05a2096338c4a7a861134034dc8b149dd7857491653dd23383f2'}]}, 'timestamp': '2025-10-02 12:31:16.986651', '_unique_id': '9ec543faa3fd4769a9b463d403545187'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.987 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.989 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.989 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.989 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1991691227>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1991691227>]
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.989 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.990 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/network.incoming.bytes volume: 19390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65a3630b-67ff-43c7-b942-8f3739387447', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 19390, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-00000088-278a6b24-7950-4f1b-9c36-8a6030b17e6d-tapc8d25dbc-b5', 'timestamp': '2025-10-02T12:31:16.989996', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'tapc8d25dbc-b5', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:1d:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8d25dbc-b5'}, 'message_id': 'b1275c6c-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.603465993, 'message_signature': '34d9f345111f02009cec43bd3d6f7294f9bcf3fa1ac950ad1330bbced115c489'}]}, 'timestamp': '2025-10-02 12:31:16.990510', '_unique_id': '975e25dd457c4ff68d0267c28a724bf2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.993 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.993 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b78a2c4c-f12b-4db5-a218-b49366ce9967', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-vda', 'timestamp': '2025-10-02T12:31:16.993081', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b127d3ea-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.635555494, 'message_signature': 'b24d5950ce535ae17455c04160de7d0a0394c14eeea4ce9b63c957b504ffdf78'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-sda', 'timestamp': '2025-10-02T12:31:16.993081', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b127e9c0-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.635555494, 'message_signature': '8aa21072959eaaf3442702064de7717fbbb3ccbbbc8ab3ec150cf0ae2faae960'}]}, 'timestamp': '2025-10-02 12:31:16.994093', '_unique_id': '6680ea935e3b4c57b83ec20203d7fc35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.996 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.996 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.997 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58bf0532-931b-4c79-8b6d-e0ac2c42fc51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-vda', 'timestamp': '2025-10-02T12:31:16.996513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1285be4-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.635555494, 'message_signature': 'd40a96e0e269b6e39bc19b6a930a21fff281a04bc4599074f42bcaaa557ad9f9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 
'278a6b24-7950-4f1b-9c36-8a6030b17e6d-sda', 'timestamp': '2025-10-02T12:31:16.996513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1286d46-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.635555494, 'message_signature': '9eb09f547ff30a9e4808417733d8efeecc8f0eed1f51d253ffd68867903fa8f6'}]}, 'timestamp': '2025-10-02 12:31:16.997457', '_unique_id': 'b47e608816e14a2b9fae79d5ac699279'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.998 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:16.999 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.029 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/cpu volume: 12020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d7ccfd1-7e36-412d-8be5-152279cc4c0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12020000000, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'timestamp': '2025-10-02T12:31:17.000151', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b12d75f2-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.705789674, 'message_signature': 'f3144ba5e945041945c9d2a9eb7fedc08e7624618404508752c5a7e1c3a95c1f'}]}, 'timestamp': '2025-10-02 12:31:17.030547', '_unique_id': 'f237a5d460e34f04b7f1925d95b28fd9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.032 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.033 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.033 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.write.latency volume: 8471839953 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.034 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9b79e29-afd4-446e-bd0f-2d6d40026451', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8471839953, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-vda', 'timestamp': '2025-10-02T12:31:17.033572', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b12e0440-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.610223421, 'message_signature': 'dd6e4bdd966d2d6bb77925c8d89f8771a59867fb4abea3ea845eecfe3cd9e0e3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 
'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-sda', 'timestamp': '2025-10-02T12:31:17.033572', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b12e1728-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.610223421, 'message_signature': 'a53e91e7a7130ae5194924cd27ea40f8b724ef98803a118d2492d00a61ec230f'}]}, 'timestamp': '2025-10-02 12:31:17.034593', '_unique_id': '5b750f2b6d5a45e4a0da9d09b90e5c16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.035 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.036 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.036 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ea18c92-fd26-4530-b8ae-3e9b211a1bcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-00000088-278a6b24-7950-4f1b-9c36-8a6030b17e6d-tapc8d25dbc-b5', 'timestamp': '2025-10-02T12:31:17.036851', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'tapc8d25dbc-b5', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:1d:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8d25dbc-b5'}, 'message_id': 'b12e7e70-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.603465993, 'message_signature': '4a8f017ca7495a658a061e5047669e7342a9dbf93401bcb93046e414d064797b'}]}, 'timestamp': '2025-10-02 12:31:17.037163', '_unique_id': '5a52758c68b341cea7120a7e9268e339'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.038 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.038 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39a373e4-1ee9-4a4d-8422-2e7bb756e37f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-00000088-278a6b24-7950-4f1b-9c36-8a6030b17e6d-tapc8d25dbc-b5', 'timestamp': '2025-10-02T12:31:17.038606', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'tapc8d25dbc-b5', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:1d:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8d25dbc-b5'}, 'message_id': 'b12ec3bc-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.603465993, 'message_signature': '6d73ff877f8ea2fbf55273b1275fdf6df6a8be05b69814c801229794a955e692'}]}, 'timestamp': '2025-10-02 12:31:17.038936', '_unique_id': '4d9e2e025fd64788aab15b6ddeb04a59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.039 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.040 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.040 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/memory.usage volume: 42.88671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a7587a6-c0c6-4a4a-9740-cc0e76bc1087', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.88671875, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'timestamp': '2025-10-02T12:31:17.040387', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'b12f0822-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.705789674, 'message_signature': 'ed4d7940089c4226d4b4f946b6abdfc22520c8c4a38baec268893db8935bbbdd'}]}, 'timestamp': '2025-10-02 12:31:17.040677', '_unique_id': 'f728c7cd154a4868ba26cf5c2779e6c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.041 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.042 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.042 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2e12291-a776-4dc0-94c1-4cca527f111c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-00000088-278a6b24-7950-4f1b-9c36-8a6030b17e6d-tapc8d25dbc-b5', 'timestamp': '2025-10-02T12:31:17.042553', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'tapc8d25dbc-b5', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:1d:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8d25dbc-b5'}, 'message_id': 'b12f5dcc-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.603465993, 'message_signature': 'a70a557946572bf6217b8c39caee9c9d4d932d87bce0415822ba6e9faa71b819'}]}, 'timestamp': '2025-10-02 12:31:17.042882', '_unique_id': 'dcb874d52c744aacb0772173e83e2a2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.043 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.044 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.044 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/network.outgoing.bytes volume: 15816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c769f453-0f9e-45b8-b548-d5dba9db81a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 15816, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': 'instance-00000088-278a6b24-7950-4f1b-9c36-8a6030b17e6d-tapc8d25dbc-b5', 'timestamp': '2025-10-02T12:31:17.044357', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'tapc8d25dbc-b5', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:1d:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8d25dbc-b5'}, 'message_id': 'b12fa368-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.603465993, 'message_signature': '2319e86669bd85ab0aa95acf95c948b9e2dc8cbc561ee9e5997b66ca35dfddf6'}]}, 'timestamp': '2025-10-02 12:31:17.044664', '_unique_id': '4d42c49b59cf4643bddbbb4462786dff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.045 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.046 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.046 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.046 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1991691227>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1991691227>]
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.046 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.046 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.read.bytes volume: 30431744 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.046 12 DEBUG ceilometer.compute.pollsters [-] 278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c953663-e06d-4c3e-913d-30a8abe124d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30431744, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-vda', 'timestamp': '2025-10-02T12:31:17.046480', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b12ff610-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.610223421, 'message_signature': 'f69682339681c0e75b81db15578a2d77cf976cfde2b3b5076f60d4af3c1f9064'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_name': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_name': None, 'resource_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d-sda', 'timestamp': '2025-10-02T12:31:17.046480', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1991691227', 'name': 'instance-00000088', 'instance_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'instance_type': 'm1.nano', 'host': '92aa3f038cf51594e1907d94386077e3d0be8c0bede5a7061d01cefe', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13003b2-9f8b-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6270.610223421, 'message_signature': '3cd321c529edca967f6763035d817d699df0a7c38e3676a4e6f3b5c08457ae62'}]}, 'timestamp': '2025-10-02 12:31:17.047113', '_unique_id': 'f58609edee3e4b82a90d5badb3275fc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:31:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:31:17.047 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:31:17 np0005466012 nova_compute[192063]: 2025-10-02 12:31:17.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:20 np0005466012 nova_compute[192063]: 2025-10-02 12:31:20.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:22 np0005466012 podman[242468]: 2025-10-02 12:31:22.140428969 +0000 UTC m=+0.057237251 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7)
Oct  2 08:31:22 np0005466012 podman[242467]: 2025-10-02 12:31:22.148213065 +0000 UTC m=+0.064562923 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:31:22 np0005466012 nova_compute[192063]: 2025-10-02 12:31:22.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:25 np0005466012 nova_compute[192063]: 2025-10-02 12:31:25.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:25 np0005466012 podman[242508]: 2025-10-02 12:31:25.127366578 +0000 UTC m=+0.045492994 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:31:25 np0005466012 podman[242507]: 2025-10-02 12:31:25.129490557 +0000 UTC m=+0.050099072 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.384 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "1f5bca08-a985-4674-b189-69cb180bfcfa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.385 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.403 2 DEBUG nova.compute.manager [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.535 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.536 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.546 2 DEBUG nova.virt.hardware [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.547 2 INFO nova.compute.claims [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.696 2 DEBUG nova.compute.provider_tree [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.719 2 DEBUG nova.scheduler.client.report [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.745 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.746 2 DEBUG nova.compute.manager [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.808 2 DEBUG nova.compute.manager [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.809 2 DEBUG nova.network.neutron [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.839 2 INFO nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:31:26 np0005466012 nova_compute[192063]: 2025-10-02 12:31:26.858 2 DEBUG nova.compute.manager [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.015 2 DEBUG nova.compute.manager [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.016 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.016 2 INFO nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Creating image(s)#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.017 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "/var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.017 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "/var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.018 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "/var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.032 2 DEBUG oslo_concurrency.processutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.059 2 DEBUG nova.policy [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.126 2 DEBUG oslo_concurrency.processutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.127 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.128 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.141 2 DEBUG oslo_concurrency.processutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.201 2 DEBUG oslo_concurrency.processutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.202 2 DEBUG oslo_concurrency.processutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.257 2 DEBUG oslo_concurrency.processutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.259 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.260 2 DEBUG oslo_concurrency.processutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.343 2 DEBUG oslo_concurrency.processutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.344 2 DEBUG nova.virt.disk.api [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Checking if we can resize image /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.344 2 DEBUG oslo_concurrency.processutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.412 2 DEBUG oslo_concurrency.processutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.415 2 DEBUG nova.virt.disk.api [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Cannot resize image /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.415 2 DEBUG nova.objects.instance [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'migration_context' on Instance uuid 1f5bca08-a985-4674-b189-69cb180bfcfa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.438 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.438 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Ensure instance console log exists: /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.439 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.439 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.440 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:27 np0005466012 nova_compute[192063]: 2025-10-02 12:31:27.985 2 DEBUG nova.network.neutron [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Successfully created port: 835edda9-50d4-452c-83a7-3fd3911c42d1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:31:28 np0005466012 nova_compute[192063]: 2025-10-02 12:31:28.853 2 DEBUG nova.network.neutron [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Successfully updated port: 835edda9-50d4-452c-83a7-3fd3911c42d1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:31:28 np0005466012 nova_compute[192063]: 2025-10-02 12:31:28.869 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "refresh_cache-1f5bca08-a985-4674-b189-69cb180bfcfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:31:28 np0005466012 nova_compute[192063]: 2025-10-02 12:31:28.869 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquired lock "refresh_cache-1f5bca08-a985-4674-b189-69cb180bfcfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:28 np0005466012 nova_compute[192063]: 2025-10-02 12:31:28.869 2 DEBUG nova.network.neutron [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:31:28 np0005466012 nova_compute[192063]: 2025-10-02 12:31:28.966 2 DEBUG nova.compute.manager [req-e87e07c8-bfe4-4bce-93c5-c598ae4467df req-ba6dc40c-2b63-42ca-9bbe-418b37c56272 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Received event network-changed-835edda9-50d4-452c-83a7-3fd3911c42d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:28 np0005466012 nova_compute[192063]: 2025-10-02 12:31:28.966 2 DEBUG nova.compute.manager [req-e87e07c8-bfe4-4bce-93c5-c598ae4467df req-ba6dc40c-2b63-42ca-9bbe-418b37c56272 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Refreshing instance network info cache due to event network-changed-835edda9-50d4-452c-83a7-3fd3911c42d1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:31:28 np0005466012 nova_compute[192063]: 2025-10-02 12:31:28.967 2 DEBUG oslo_concurrency.lockutils [req-e87e07c8-bfe4-4bce-93c5-c598ae4467df req-ba6dc40c-2b63-42ca-9bbe-418b37c56272 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-1f5bca08-a985-4674-b189-69cb180bfcfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.004 2 DEBUG nova.network.neutron [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.791 2 DEBUG nova.network.neutron [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Updating instance_info_cache with network_info: [{"id": "835edda9-50d4-452c-83a7-3fd3911c42d1", "address": "fa:16:3e:ce:0a:e8", "network": {"id": "65e5c40f-3361-413f-bb13-a3c691890c30", "bridge": "br-int", "label": "tempest-network-smoke--749433172", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap835edda9-50", "ovs_interfaceid": "835edda9-50d4-452c-83a7-3fd3911c42d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.815 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Releasing lock "refresh_cache-1f5bca08-a985-4674-b189-69cb180bfcfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.816 2 DEBUG nova.compute.manager [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Instance network_info: |[{"id": "835edda9-50d4-452c-83a7-3fd3911c42d1", "address": "fa:16:3e:ce:0a:e8", "network": {"id": "65e5c40f-3361-413f-bb13-a3c691890c30", "bridge": "br-int", "label": "tempest-network-smoke--749433172", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap835edda9-50", "ovs_interfaceid": "835edda9-50d4-452c-83a7-3fd3911c42d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.817 2 DEBUG oslo_concurrency.lockutils [req-e87e07c8-bfe4-4bce-93c5-c598ae4467df req-ba6dc40c-2b63-42ca-9bbe-418b37c56272 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-1f5bca08-a985-4674-b189-69cb180bfcfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.817 2 DEBUG nova.network.neutron [req-e87e07c8-bfe4-4bce-93c5-c598ae4467df req-ba6dc40c-2b63-42ca-9bbe-418b37c56272 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Refreshing network info cache for port 835edda9-50d4-452c-83a7-3fd3911c42d1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.820 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Start _get_guest_xml network_info=[{"id": "835edda9-50d4-452c-83a7-3fd3911c42d1", "address": "fa:16:3e:ce:0a:e8", "network": {"id": "65e5c40f-3361-413f-bb13-a3c691890c30", "bridge": "br-int", "label": "tempest-network-smoke--749433172", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap835edda9-50", "ovs_interfaceid": "835edda9-50d4-452c-83a7-3fd3911c42d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.827 2 WARNING nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.833 2 DEBUG nova.virt.libvirt.host [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.833 2 DEBUG nova.virt.libvirt.host [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.838 2 DEBUG nova.virt.libvirt.host [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.839 2 DEBUG nova.virt.libvirt.host [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.840 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.840 2 DEBUG nova.virt.hardware [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.841 2 DEBUG nova.virt.hardware [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.841 2 DEBUG nova.virt.hardware [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.841 2 DEBUG nova.virt.hardware [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.841 2 DEBUG nova.virt.hardware [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.842 2 DEBUG nova.virt.hardware [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.842 2 DEBUG nova.virt.hardware [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.842 2 DEBUG nova.virt.hardware [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.842 2 DEBUG nova.virt.hardware [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.843 2 DEBUG nova.virt.hardware [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.843 2 DEBUG nova.virt.hardware [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.846 2 DEBUG nova.virt.libvirt.vif [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:31:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1299266099',display_name='tempest-TestNetworkBasicOps-server-1299266099',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1299266099',id=139,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKA0TaoJ74CwnSN103zW5+smi3KtvFUyNemkSlBn/5L7taF4/Cd9SKGUvtg1mQjzs1vN8/TomvHYveYcRmx6gom8/m2Uwx632ERipjwIC2QToMY5aLVMVaGpUwSe4nX1JA==',key_name='tempest-TestNetworkBasicOps-1677490737',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-xvp4q0bu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:31:26Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=1f5bca08-a985-4674-b189-69cb180bfcfa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "835edda9-50d4-452c-83a7-3fd3911c42d1", "address": "fa:16:3e:ce:0a:e8", "network": {"id": "65e5c40f-3361-413f-bb13-a3c691890c30", "bridge": "br-int", "label": "tempest-network-smoke--749433172", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap835edda9-50", "ovs_interfaceid": "835edda9-50d4-452c-83a7-3fd3911c42d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.847 2 DEBUG nova.network.os_vif_util [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "835edda9-50d4-452c-83a7-3fd3911c42d1", "address": "fa:16:3e:ce:0a:e8", "network": {"id": "65e5c40f-3361-413f-bb13-a3c691890c30", "bridge": "br-int", "label": "tempest-network-smoke--749433172", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap835edda9-50", "ovs_interfaceid": "835edda9-50d4-452c-83a7-3fd3911c42d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.847 2 DEBUG nova.network.os_vif_util [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:0a:e8,bridge_name='br-int',has_traffic_filtering=True,id=835edda9-50d4-452c-83a7-3fd3911c42d1,network=Network(65e5c40f-3361-413f-bb13-a3c691890c30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap835edda9-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.848 2 DEBUG nova.objects.instance [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1f5bca08-a985-4674-b189-69cb180bfcfa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.863 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  <uuid>1f5bca08-a985-4674-b189-69cb180bfcfa</uuid>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  <name>instance-0000008b</name>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestNetworkBasicOps-server-1299266099</nova:name>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:31:29</nova:creationTime>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:31:29 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:        <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:        <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:        <nova:port uuid="835edda9-50d4-452c-83a7-3fd3911c42d1">
Oct  2 08:31:29 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.19" ipVersion="4"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <entry name="serial">1f5bca08-a985-4674-b189-69cb180bfcfa</entry>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <entry name="uuid">1f5bca08-a985-4674-b189-69cb180bfcfa</entry>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk.config"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:ce:0a:e8"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <target dev="tap835edda9-50"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/console.log" append="off"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:31:29 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:31:29 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:31:29 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:31:29 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.864 2 DEBUG nova.compute.manager [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Preparing to wait for external event network-vif-plugged-835edda9-50d4-452c-83a7-3fd3911c42d1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.864 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.865 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.865 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.865 2 DEBUG nova.virt.libvirt.vif [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:31:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1299266099',display_name='tempest-TestNetworkBasicOps-server-1299266099',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1299266099',id=139,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKA0TaoJ74CwnSN103zW5+smi3KtvFUyNemkSlBn/5L7taF4/Cd9SKGUvtg1mQjzs1vN8/TomvHYveYcRmx6gom8/m2Uwx632ERipjwIC2QToMY5aLVMVaGpUwSe4nX1JA==',key_name='tempest-TestNetworkBasicOps-1677490737',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-xvp4q0bu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:31:26Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=1f5bca08-a985-4674-b189-69cb180bfcfa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "835edda9-50d4-452c-83a7-3fd3911c42d1", "address": "fa:16:3e:ce:0a:e8", "network": {"id": "65e5c40f-3361-413f-bb13-a3c691890c30", "bridge": "br-int", "label": "tempest-network-smoke--749433172", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap835edda9-50", "ovs_interfaceid": "835edda9-50d4-452c-83a7-3fd3911c42d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.866 2 DEBUG nova.network.os_vif_util [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "835edda9-50d4-452c-83a7-3fd3911c42d1", "address": "fa:16:3e:ce:0a:e8", "network": {"id": "65e5c40f-3361-413f-bb13-a3c691890c30", "bridge": "br-int", "label": "tempest-network-smoke--749433172", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap835edda9-50", "ovs_interfaceid": "835edda9-50d4-452c-83a7-3fd3911c42d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.866 2 DEBUG nova.network.os_vif_util [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:0a:e8,bridge_name='br-int',has_traffic_filtering=True,id=835edda9-50d4-452c-83a7-3fd3911c42d1,network=Network(65e5c40f-3361-413f-bb13-a3c691890c30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap835edda9-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.867 2 DEBUG os_vif [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:0a:e8,bridge_name='br-int',has_traffic_filtering=True,id=835edda9-50d4-452c-83a7-3fd3911c42d1,network=Network(65e5c40f-3361-413f-bb13-a3c691890c30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap835edda9-50') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.867 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.868 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.870 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap835edda9-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.870 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap835edda9-50, col_values=(('external_ids', {'iface-id': '835edda9-50d4-452c-83a7-3fd3911c42d1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:0a:e8', 'vm-uuid': '1f5bca08-a985-4674-b189-69cb180bfcfa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:29 np0005466012 NetworkManager[51207]: <info>  [1759408289.8736] manager: (tap835edda9-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.880 2 INFO os_vif [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:0a:e8,bridge_name='br-int',has_traffic_filtering=True,id=835edda9-50d4-452c-83a7-3fd3911c42d1,network=Network(65e5c40f-3361-413f-bb13-a3c691890c30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap835edda9-50')#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.936 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.937 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.937 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No VIF found with MAC fa:16:3e:ce:0a:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:31:29 np0005466012 nova_compute[192063]: 2025-10-02 12:31:29.938 2 INFO nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Using config drive#033[00m
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.334 2 INFO nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Creating config drive at /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk.config#033[00m
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.346 2 DEBUG oslo_concurrency.processutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8ipjzijd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.482 2 DEBUG oslo_concurrency.processutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8ipjzijd" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:30 np0005466012 NetworkManager[51207]: <info>  [1759408290.5483] manager: (tap835edda9-50): new Tun device (/org/freedesktop/NetworkManager/Devices/245)
Oct  2 08:31:30 np0005466012 kernel: tap835edda9-50: entered promiscuous mode
Oct  2 08:31:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:30Z|00532|binding|INFO|Claiming lport 835edda9-50d4-452c-83a7-3fd3911c42d1 for this chassis.
Oct  2 08:31:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:30Z|00533|binding|INFO|835edda9-50d4-452c-83a7-3fd3911c42d1: Claiming fa:16:3e:ce:0a:e8 10.100.0.19
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.566 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:0a:e8 10.100.0.19'], port_security=['fa:16:3e:ce:0a:e8 10.100.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': '1f5bca08-a985-4674-b189-69cb180bfcfa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65e5c40f-3361-413f-bb13-a3c691890c30', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ef78c178-9d50-41c7-a380-964531000563', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d77669f-fc6c-4f99-9b5d-e6723f9cdfa8, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=835edda9-50d4-452c-83a7-3fd3911c42d1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.567 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 835edda9-50d4-452c-83a7-3fd3911c42d1 in datapath 65e5c40f-3361-413f-bb13-a3c691890c30 bound to our chassis#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.569 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 65e5c40f-3361-413f-bb13-a3c691890c30#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.580 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cb887ee9-d151-4ea6-ba68-5f0dd5c86dfc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.581 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap65e5c40f-31 in ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.583 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap65e5c40f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.583 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2b8956-dd11-432a-902a-869ed373e35e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.584 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[883fdddf-17b8-49bd-91dd-7ec4ba716373]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 systemd-machined[152114]: New machine qemu-64-instance-0000008b.
Oct  2 08:31:30 np0005466012 systemd-udevd[242584]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.597 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[45167b11-6fa4-420f-bd08-99b951b4f217]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 NetworkManager[51207]: <info>  [1759408290.6096] device (tap835edda9-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:31:30 np0005466012 NetworkManager[51207]: <info>  [1759408290.6109] device (tap835edda9-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:31:30 np0005466012 systemd[1]: Started Virtual Machine qemu-64-instance-0000008b.
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:30Z|00534|binding|INFO|Setting lport 835edda9-50d4-452c-83a7-3fd3911c42d1 ovn-installed in OVS
Oct  2 08:31:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:30Z|00535|binding|INFO|Setting lport 835edda9-50d4-452c-83a7-3fd3911c42d1 up in Southbound
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.628 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0f8cb7d9-3135-4f00-8fc5-9045ab483542]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.663 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ebc5b5-98be-4360-a6b3-9b059c9c1d61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.668 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ac30dba6-5043-40be-bbcb-c31fe71036ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 systemd-udevd[242588]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:31:30 np0005466012 NetworkManager[51207]: <info>  [1759408290.6695] manager: (tap65e5c40f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/246)
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.706 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[60baa47f-3d16-407c-9f32-45ef5c62535d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.710 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[472809e2-2b28-4793-8a07-9e3e2dac365e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 NetworkManager[51207]: <info>  [1759408290.7430] device (tap65e5c40f-30): carrier: link connected
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.749 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[2977c267-e4a4-44f7-9c45-30d21b9494ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.767 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d811f882-e2e1-487a-a8d9-be756f0075b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65e5c40f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:98:50:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 628435, 'reachable_time': 42850, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242616, 'error': None, 'target': 'ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.784 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f740cc31-8bba-40e8-ab69-65b5978192cf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe98:50d9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 628435, 'tstamp': 628435}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242617, 'error': None, 'target': 'ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.810 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[946ca733-0826-4f4f-8a46-cb75d522f201]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65e5c40f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:98:50:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 628435, 'reachable_time': 42850, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242618, 'error': None, 'target': 'ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.849 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd75c0f-b111-48f9-a7bc-e0ed9ff8fea3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.911 2 DEBUG nova.compute.manager [req-0c9a08b1-90c2-43ae-9bb5-33bc703c0c98 req-7f5c47c8-4f93-41f5-8d3a-551611f92198 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Received event network-vif-plugged-835edda9-50d4-452c-83a7-3fd3911c42d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.913 2 DEBUG oslo_concurrency.lockutils [req-0c9a08b1-90c2-43ae-9bb5-33bc703c0c98 req-7f5c47c8-4f93-41f5-8d3a-551611f92198 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.913 2 DEBUG oslo_concurrency.lockutils [req-0c9a08b1-90c2-43ae-9bb5-33bc703c0c98 req-7f5c47c8-4f93-41f5-8d3a-551611f92198 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.913 2 DEBUG oslo_concurrency.lockutils [req-0c9a08b1-90c2-43ae-9bb5-33bc703c0c98 req-7f5c47c8-4f93-41f5-8d3a-551611f92198 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.914 2 DEBUG nova.compute.manager [req-0c9a08b1-90c2-43ae-9bb5-33bc703c0c98 req-7f5c47c8-4f93-41f5-8d3a-551611f92198 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Processing event network-vif-plugged-835edda9-50d4-452c-83a7-3fd3911c42d1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.928 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c2acf0f6-da29-4347-aa4c-ec991a5f3fa0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.929 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65e5c40f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.930 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.930 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap65e5c40f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:30 np0005466012 NetworkManager[51207]: <info>  [1759408290.9785] manager: (tap65e5c40f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/247)
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:30 np0005466012 kernel: tap65e5c40f-30: entered promiscuous mode
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.982 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap65e5c40f-30, col_values=(('external_ids', {'iface-id': 'a5315141-0781-4656-86fd-41a4ad229c25'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:30Z|00536|binding|INFO|Releasing lport a5315141-0781-4656-86fd-41a4ad229c25 from this chassis (sb_readonly=0)
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.987 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/65e5c40f-3361-413f-bb13-a3c691890c30.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/65e5c40f-3361-413f-bb13-a3c691890c30.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.988 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c82106cc-9e74-468a-85f1-51126a2eff61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.989 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-65e5c40f-3361-413f-bb13-a3c691890c30
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/65e5c40f-3361-413f-bb13-a3c691890c30.pid.haproxy
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 65e5c40f-3361-413f-bb13-a3c691890c30
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:31:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:30.989 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30', 'env', 'PROCESS_TAG=haproxy-65e5c40f-3361-413f-bb13-a3c691890c30', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/65e5c40f-3361-413f-bb13-a3c691890c30.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:31:30 np0005466012 nova_compute[192063]: 2025-10-02 12:31:30.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.091 2 DEBUG nova.network.neutron [req-e87e07c8-bfe4-4bce-93c5-c598ae4467df req-ba6dc40c-2b63-42ca-9bbe-418b37c56272 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Updated VIF entry in instance network info cache for port 835edda9-50d4-452c-83a7-3fd3911c42d1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.092 2 DEBUG nova.network.neutron [req-e87e07c8-bfe4-4bce-93c5-c598ae4467df req-ba6dc40c-2b63-42ca-9bbe-418b37c56272 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Updating instance_info_cache with network_info: [{"id": "835edda9-50d4-452c-83a7-3fd3911c42d1", "address": "fa:16:3e:ce:0a:e8", "network": {"id": "65e5c40f-3361-413f-bb13-a3c691890c30", "bridge": "br-int", "label": "tempest-network-smoke--749433172", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap835edda9-50", "ovs_interfaceid": "835edda9-50d4-452c-83a7-3fd3911c42d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.108 2 DEBUG oslo_concurrency.lockutils [req-e87e07c8-bfe4-4bce-93c5-c598ae4467df req-ba6dc40c-2b63-42ca-9bbe-418b37c56272 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-1f5bca08-a985-4674-b189-69cb180bfcfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.314 2 DEBUG nova.compute.manager [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.314 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408291.3134418, 1f5bca08-a985-4674-b189-69cb180bfcfa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.315 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] VM Started (Lifecycle Event)#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.318 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.321 2 INFO nova.virt.libvirt.driver [-] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Instance spawned successfully.#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.321 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:31:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:31Z|00537|binding|INFO|Releasing lport a5315141-0781-4656-86fd-41a4ad229c25 from this chassis (sb_readonly=0)
Oct  2 08:31:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:31Z|00538|binding|INFO|Releasing lport 42ed37b7-bd98-49bd-999b-805cbf42cd4d from this chassis (sb_readonly=0)
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.344 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.349 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.352 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.352 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.352 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.353 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.353 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.353 2 DEBUG nova.virt.libvirt.driver [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.381 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.381 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408291.314486, 1f5bca08-a985-4674-b189-69cb180bfcfa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.382 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.411 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.414 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408291.3182392, 1f5bca08-a985-4674-b189-69cb180bfcfa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.414 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:31:31 np0005466012 podman[242656]: 2025-10-02 12:31:31.321403041 +0000 UTC m=+0.023533144 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.429 2 INFO nova.compute.manager [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Took 4.41 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.429 2 DEBUG nova.compute.manager [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.432 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.436 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.468 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.529 2 INFO nova.compute.manager [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Took 5.06 seconds to build instance.#033[00m
Oct  2 08:31:31 np0005466012 podman[242656]: 2025-10-02 12:31:31.546641394 +0000 UTC m=+0.248771477 container create 1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.552 2 DEBUG oslo_concurrency.lockutils [None req-f8e7d065-f666-4a49-9ee9-ef308d9313cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:31 np0005466012 systemd[1]: Started libpod-conmon-1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9.scope.
Oct  2 08:31:31 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:31:31 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f7b0bf0988e8cbc677d88008705f7bfc12e56d8207d88728a23a9972301cd2c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:31:31 np0005466012 podman[242656]: 2025-10-02 12:31:31.810229502 +0000 UTC m=+0.512359625 container init 1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:31:31 np0005466012 podman[242656]: 2025-10-02 12:31:31.816059974 +0000 UTC m=+0.518190067 container start 1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:31:31 np0005466012 nova_compute[192063]: 2025-10-02 12:31:31.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:31 np0005466012 neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30[242671]: [NOTICE]   (242675) : New worker (242677) forked
Oct  2 08:31:31 np0005466012 neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30[242671]: [NOTICE]   (242675) : Loading success.
Oct  2 08:31:32 np0005466012 nova_compute[192063]: 2025-10-02 12:31:32.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:33 np0005466012 nova_compute[192063]: 2025-10-02 12:31:33.028 2 DEBUG nova.compute.manager [req-d32ddd40-31cf-4ef9-a445-c726bc010e84 req-600dc687-1345-4e99-a39c-c17c4c592efd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Received event network-vif-plugged-835edda9-50d4-452c-83a7-3fd3911c42d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:33 np0005466012 nova_compute[192063]: 2025-10-02 12:31:33.028 2 DEBUG oslo_concurrency.lockutils [req-d32ddd40-31cf-4ef9-a445-c726bc010e84 req-600dc687-1345-4e99-a39c-c17c4c592efd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:33 np0005466012 nova_compute[192063]: 2025-10-02 12:31:33.029 2 DEBUG oslo_concurrency.lockutils [req-d32ddd40-31cf-4ef9-a445-c726bc010e84 req-600dc687-1345-4e99-a39c-c17c4c592efd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:33 np0005466012 nova_compute[192063]: 2025-10-02 12:31:33.030 2 DEBUG oslo_concurrency.lockutils [req-d32ddd40-31cf-4ef9-a445-c726bc010e84 req-600dc687-1345-4e99-a39c-c17c4c592efd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:33 np0005466012 nova_compute[192063]: 2025-10-02 12:31:33.030 2 DEBUG nova.compute.manager [req-d32ddd40-31cf-4ef9-a445-c726bc010e84 req-600dc687-1345-4e99-a39c-c17c4c592efd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] No waiting events found dispatching network-vif-plugged-835edda9-50d4-452c-83a7-3fd3911c42d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:33 np0005466012 nova_compute[192063]: 2025-10-02 12:31:33.031 2 WARNING nova.compute.manager [req-d32ddd40-31cf-4ef9-a445-c726bc010e84 req-600dc687-1345-4e99-a39c-c17c4c592efd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Received unexpected event network-vif-plugged-835edda9-50d4-452c-83a7-3fd3911c42d1 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:31:33 np0005466012 nova_compute[192063]: 2025-10-02 12:31:33.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:33 np0005466012 nova_compute[192063]: 2025-10-02 12:31:33.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:34 np0005466012 nova_compute[192063]: 2025-10-02 12:31:34.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:37 np0005466012 nova_compute[192063]: 2025-10-02 12:31:37.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:37 np0005466012 nova_compute[192063]: 2025-10-02 12:31:37.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:38 np0005466012 nova_compute[192063]: 2025-10-02 12:31:38.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:39 np0005466012 nova_compute[192063]: 2025-10-02 12:31:39.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:39 np0005466012 nova_compute[192063]: 2025-10-02 12:31:39.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:40 np0005466012 podman[242686]: 2025-10-02 12:31:40.130449779 +0000 UTC m=+0.048685693 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:31:40 np0005466012 podman[242687]: 2025-10-02 12:31:40.157860889 +0000 UTC m=+0.075128096 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:31:40 np0005466012 nova_compute[192063]: 2025-10-02 12:31:40.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:40 np0005466012 nova_compute[192063]: 2025-10-02 12:31:40.821 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:31:41 np0005466012 nova_compute[192063]: 2025-10-02 12:31:41.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:41 np0005466012 nova_compute[192063]: 2025-10-02 12:31:41.854 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:41 np0005466012 nova_compute[192063]: 2025-10-02 12:31:41.855 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:41 np0005466012 nova_compute[192063]: 2025-10-02 12:31:41.855 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:41 np0005466012 nova_compute[192063]: 2025-10-02 12:31:41.855 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:31:41 np0005466012 nova_compute[192063]: 2025-10-02 12:31:41.922 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:41 np0005466012 nova_compute[192063]: 2025-10-02 12:31:41.983 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:41 np0005466012 nova_compute[192063]: 2025-10-02 12:31:41.984 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.062 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.072 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.134 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.148 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.217 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.393 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.397 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5359MB free_disk=73.21390914916992GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.398 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.399 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.531 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 278a6b24-7950-4f1b-9c36-8a6030b17e6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.532 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 1f5bca08-a985-4674-b189-69cb180bfcfa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.532 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.532 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.551 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.576 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.577 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.602 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.624 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.679 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.697 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:31:42 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:42Z|00539|binding|INFO|Releasing lport a5315141-0781-4656-86fd-41a4ad229c25 from this chassis (sb_readonly=0)
Oct  2 08:31:42 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:42Z|00540|binding|INFO|Releasing lport 42ed37b7-bd98-49bd-999b-805cbf42cd4d from this chassis (sb_readonly=0)
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.732 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.732 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:42 np0005466012 nova_compute[192063]: 2025-10-02 12:31:42.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:43 np0005466012 podman[242751]: 2025-10-02 12:31:43.144502472 +0000 UTC m=+0.058794753 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  2 08:31:43 np0005466012 podman[242750]: 2025-10-02 12:31:43.150877389 +0000 UTC m=+0.069663915 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:31:43 np0005466012 nova_compute[192063]: 2025-10-02 12:31:43.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:44 np0005466012 nova_compute[192063]: 2025-10-02 12:31:44.733 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:44 np0005466012 nova_compute[192063]: 2025-10-02 12:31:44.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:31:44 np0005466012 nova_compute[192063]: 2025-10-02 12:31:44.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:31:44 np0005466012 nova_compute[192063]: 2025-10-02 12:31:44.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:31:44 np0005466012 nova_compute[192063]: 2025-10-02 12:31:44.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:45 np0005466012 nova_compute[192063]: 2025-10-02 12:31:45.148 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:31:45 np0005466012 nova_compute[192063]: 2025-10-02 12:31:45.148 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:31:45 np0005466012 nova_compute[192063]: 2025-10-02 12:31:45.148 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:31:45 np0005466012 nova_compute[192063]: 2025-10-02 12:31:45.148 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 278a6b24-7950-4f1b-9c36-8a6030b17e6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:45 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:45Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:0a:e8 10.100.0.19
Oct  2 08:31:45 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:45Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:0a:e8 10.100.0.19
Oct  2 08:31:47 np0005466012 nova_compute[192063]: 2025-10-02 12:31:47.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:47 np0005466012 nova_compute[192063]: 2025-10-02 12:31:47.813 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Updating instance_info_cache with network_info: [{"id": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "address": "fa:16:3e:28:1d:9e", "network": {"id": "59ce980f-674e-478f-8a88-32561beb276a", "bridge": "br-int", "label": "tempest-network-smoke--1065656187", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d25dbc-b5", "ovs_interfaceid": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:47 np0005466012 nova_compute[192063]: 2025-10-02 12:31:47.831 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:31:47 np0005466012 nova_compute[192063]: 2025-10-02 12:31:47.831 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:31:49 np0005466012 nova_compute[192063]: 2025-10-02 12:31:49.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:52 np0005466012 nova_compute[192063]: 2025-10-02 12:31:52.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:52 np0005466012 nova_compute[192063]: 2025-10-02 12:31:52.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:53 np0005466012 podman[242808]: 2025-10-02 12:31:53.135764938 +0000 UTC m=+0.048511767 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Oct  2 08:31:53 np0005466012 podman[242807]: 2025-10-02 12:31:53.14592913 +0000 UTC m=+0.062211957 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:31:54 np0005466012 nova_compute[192063]: 2025-10-02 12:31:54.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.067 2 DEBUG oslo_concurrency.lockutils [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "1f5bca08-a985-4674-b189-69cb180bfcfa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.068 2 DEBUG oslo_concurrency.lockutils [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.069 2 DEBUG oslo_concurrency.lockutils [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.070 2 DEBUG oslo_concurrency.lockutils [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.071 2 DEBUG oslo_concurrency.lockutils [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.166 2 INFO nova.compute.manager [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Terminating instance#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.216 2 DEBUG nova.compute.manager [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:31:55 np0005466012 kernel: tap835edda9-50 (unregistering): left promiscuous mode
Oct  2 08:31:55 np0005466012 NetworkManager[51207]: <info>  [1759408315.2429] device (tap835edda9-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:55Z|00541|binding|INFO|Releasing lport 835edda9-50d4-452c-83a7-3fd3911c42d1 from this chassis (sb_readonly=0)
Oct  2 08:31:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:55Z|00542|binding|INFO|Setting lport 835edda9-50d4-452c-83a7-3fd3911c42d1 down in Southbound
Oct  2 08:31:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:31:55Z|00543|binding|INFO|Removing iface tap835edda9-50 ovn-installed in OVS
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.281 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:0a:e8 10.100.0.19'], port_security=['fa:16:3e:ce:0a:e8 10.100.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': '1f5bca08-a985-4674-b189-69cb180bfcfa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65e5c40f-3361-413f-bb13-a3c691890c30', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ef78c178-9d50-41c7-a380-964531000563', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d77669f-fc6c-4f99-9b5d-e6723f9cdfa8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=835edda9-50d4-452c-83a7-3fd3911c42d1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.282 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 835edda9-50d4-452c-83a7-3fd3911c42d1 in datapath 65e5c40f-3361-413f-bb13-a3c691890c30 unbound from our chassis#033[00m
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.284 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 65e5c40f-3361-413f-bb13-a3c691890c30, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:31:55 np0005466012 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Oct  2 08:31:55 np0005466012 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000008b.scope: Consumed 13.824s CPU time.
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.285 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c2a53e3e-d704-43a2-b94c-eb27fe5727a0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.286 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30 namespace which is not needed anymore#033[00m
Oct  2 08:31:55 np0005466012 systemd-machined[152114]: Machine qemu-64-instance-0000008b terminated.
Oct  2 08:31:55 np0005466012 podman[242851]: 2025-10-02 12:31:55.341860613 +0000 UTC m=+0.061018246 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:31:55 np0005466012 podman[242850]: 2025-10-02 12:31:55.345330889 +0000 UTC m=+0.068775591 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.482 2 INFO nova.virt.libvirt.driver [-] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Instance destroyed successfully.#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.483 2 DEBUG nova.objects.instance [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'resources' on Instance uuid 1f5bca08-a985-4674-b189-69cb180bfcfa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.528 2 DEBUG nova.virt.libvirt.vif [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:31:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1299266099',display_name='tempest-TestNetworkBasicOps-server-1299266099',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1299266099',id=139,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKA0TaoJ74CwnSN103zW5+smi3KtvFUyNemkSlBn/5L7taF4/Cd9SKGUvtg1mQjzs1vN8/TomvHYveYcRmx6gom8/m2Uwx632ERipjwIC2QToMY5aLVMVaGpUwSe4nX1JA==',key_name='tempest-TestNetworkBasicOps-1677490737',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:31:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-xvp4q0bu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:31:31Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=1f5bca08-a985-4674-b189-69cb180bfcfa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "835edda9-50d4-452c-83a7-3fd3911c42d1", "address": "fa:16:3e:ce:0a:e8", "network": {"id": "65e5c40f-3361-413f-bb13-a3c691890c30", "bridge": "br-int", "label": "tempest-network-smoke--749433172", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap835edda9-50", "ovs_interfaceid": "835edda9-50d4-452c-83a7-3fd3911c42d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.528 2 DEBUG nova.network.os_vif_util [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "835edda9-50d4-452c-83a7-3fd3911c42d1", "address": "fa:16:3e:ce:0a:e8", "network": {"id": "65e5c40f-3361-413f-bb13-a3c691890c30", "bridge": "br-int", "label": "tempest-network-smoke--749433172", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap835edda9-50", "ovs_interfaceid": "835edda9-50d4-452c-83a7-3fd3911c42d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.529 2 DEBUG nova.network.os_vif_util [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:0a:e8,bridge_name='br-int',has_traffic_filtering=True,id=835edda9-50d4-452c-83a7-3fd3911c42d1,network=Network(65e5c40f-3361-413f-bb13-a3c691890c30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap835edda9-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.529 2 DEBUG os_vif [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:0a:e8,bridge_name='br-int',has_traffic_filtering=True,id=835edda9-50d4-452c-83a7-3fd3911c42d1,network=Network(65e5c40f-3361-413f-bb13-a3c691890c30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap835edda9-50') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.532 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap835edda9-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.537 2 INFO os_vif [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:0a:e8,bridge_name='br-int',has_traffic_filtering=True,id=835edda9-50d4-452c-83a7-3fd3911c42d1,network=Network(65e5c40f-3361-413f-bb13-a3c691890c30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap835edda9-50')#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.537 2 INFO nova.virt.libvirt.driver [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Deleting instance files /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa_del#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.538 2 INFO nova.virt.libvirt.driver [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Deletion of /var/lib/nova/instances/1f5bca08-a985-4674-b189-69cb180bfcfa_del complete#033[00m
Oct  2 08:31:55 np0005466012 neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30[242671]: [NOTICE]   (242675) : haproxy version is 2.8.14-c23fe91
Oct  2 08:31:55 np0005466012 neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30[242671]: [NOTICE]   (242675) : path to executable is /usr/sbin/haproxy
Oct  2 08:31:55 np0005466012 neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30[242671]: [WARNING]  (242675) : Exiting Master process...
Oct  2 08:31:55 np0005466012 neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30[242671]: [ALERT]    (242675) : Current worker (242677) exited with code 143 (Terminated)
Oct  2 08:31:55 np0005466012 neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30[242671]: [WARNING]  (242675) : All workers exited. Exiting... (0)
Oct  2 08:31:55 np0005466012 systemd[1]: libpod-1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9.scope: Deactivated successfully.
Oct  2 08:31:55 np0005466012 podman[242910]: 2025-10-02 12:31:55.595206715 +0000 UTC m=+0.224238986 container died 1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:31:55 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9-userdata-shm.mount: Deactivated successfully.
Oct  2 08:31:55 np0005466012 systemd[1]: var-lib-containers-storage-overlay-0f7b0bf0988e8cbc677d88008705f7bfc12e56d8207d88728a23a9972301cd2c-merged.mount: Deactivated successfully.
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.725 2 INFO nova.compute.manager [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Took 0.51 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.726 2 DEBUG oslo.service.loopingcall [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.726 2 DEBUG nova.compute.manager [-] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.726 2 DEBUG nova.network.neutron [-] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:31:55 np0005466012 podman[242910]: 2025-10-02 12:31:55.745460717 +0000 UTC m=+0.374492968 container cleanup 1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:31:55 np0005466012 systemd[1]: libpod-conmon-1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9.scope: Deactivated successfully.
Oct  2 08:31:55 np0005466012 podman[242959]: 2025-10-02 12:31:55.83310254 +0000 UTC m=+0.066525619 container remove 1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.838 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6479161b-9577-448f-a2dc-fe03b0805478]: (4, ('Thu Oct  2 12:31:55 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30 (1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9)\n1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9\nThu Oct  2 12:31:55 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30 (1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9)\n1c255eec77d8e62a9bb3fe3611d41e89829719c491d1295a8553b9ee3df52ce9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.840 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ad803c0e-8e42-4c33-90de-88f8d2f79b3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.841 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65e5c40f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:55 np0005466012 kernel: tap65e5c40f-30: left promiscuous mode
Oct  2 08:31:55 np0005466012 nova_compute[192063]: 2025-10-02 12:31:55.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.856 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6bfafd61-97dc-4995-8239-e5228b0dfc15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.887 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[da4044be-8260-499f-94fc-23ac13f55b38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.888 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[94d04bd1-3299-4754-8e3e-a5953d246f4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.905 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c3c33a0f-49e8-4e60-8fa4-16336db6df75]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 628426, 'reachable_time': 37204, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242972, 'error': None, 'target': 'ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.908 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-65e5c40f-3361-413f-bb13-a3c691890c30 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:31:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:31:55.908 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[64ed2610-3bf8-4a02-9964-25745a254e39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:31:55 np0005466012 systemd[1]: run-netns-ovnmeta\x2d65e5c40f\x2d3361\x2d413f\x2dbb13\x2da3c691890c30.mount: Deactivated successfully.
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.523 2 DEBUG nova.network.neutron [-] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.571 2 INFO nova.compute.manager [-] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Took 0.84 seconds to deallocate network for instance.#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.788 2 DEBUG oslo_concurrency.lockutils [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.788 2 DEBUG oslo_concurrency.lockutils [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.873 2 DEBUG nova.compute.provider_tree [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.889 2 DEBUG nova.compute.manager [req-61821f0f-592b-406a-9fee-0e48a2059994 req-e238bab6-2fb9-41f6-ae52-8b0aa22fb472 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Received event network-vif-unplugged-835edda9-50d4-452c-83a7-3fd3911c42d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.890 2 DEBUG oslo_concurrency.lockutils [req-61821f0f-592b-406a-9fee-0e48a2059994 req-e238bab6-2fb9-41f6-ae52-8b0aa22fb472 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.890 2 DEBUG oslo_concurrency.lockutils [req-61821f0f-592b-406a-9fee-0e48a2059994 req-e238bab6-2fb9-41f6-ae52-8b0aa22fb472 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.890 2 DEBUG oslo_concurrency.lockutils [req-61821f0f-592b-406a-9fee-0e48a2059994 req-e238bab6-2fb9-41f6-ae52-8b0aa22fb472 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.891 2 DEBUG nova.compute.manager [req-61821f0f-592b-406a-9fee-0e48a2059994 req-e238bab6-2fb9-41f6-ae52-8b0aa22fb472 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] No waiting events found dispatching network-vif-unplugged-835edda9-50d4-452c-83a7-3fd3911c42d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.891 2 WARNING nova.compute.manager [req-61821f0f-592b-406a-9fee-0e48a2059994 req-e238bab6-2fb9-41f6-ae52-8b0aa22fb472 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Received unexpected event network-vif-unplugged-835edda9-50d4-452c-83a7-3fd3911c42d1 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.891 2 DEBUG nova.compute.manager [req-61821f0f-592b-406a-9fee-0e48a2059994 req-e238bab6-2fb9-41f6-ae52-8b0aa22fb472 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Received event network-vif-plugged-835edda9-50d4-452c-83a7-3fd3911c42d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.891 2 DEBUG oslo_concurrency.lockutils [req-61821f0f-592b-406a-9fee-0e48a2059994 req-e238bab6-2fb9-41f6-ae52-8b0aa22fb472 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.892 2 DEBUG oslo_concurrency.lockutils [req-61821f0f-592b-406a-9fee-0e48a2059994 req-e238bab6-2fb9-41f6-ae52-8b0aa22fb472 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.892 2 DEBUG oslo_concurrency.lockutils [req-61821f0f-592b-406a-9fee-0e48a2059994 req-e238bab6-2fb9-41f6-ae52-8b0aa22fb472 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.892 2 DEBUG nova.compute.manager [req-61821f0f-592b-406a-9fee-0e48a2059994 req-e238bab6-2fb9-41f6-ae52-8b0aa22fb472 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] No waiting events found dispatching network-vif-plugged-835edda9-50d4-452c-83a7-3fd3911c42d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.892 2 WARNING nova.compute.manager [req-61821f0f-592b-406a-9fee-0e48a2059994 req-e238bab6-2fb9-41f6-ae52-8b0aa22fb472 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Received unexpected event network-vif-plugged-835edda9-50d4-452c-83a7-3fd3911c42d1 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.893 2 DEBUG nova.compute.manager [req-61821f0f-592b-406a-9fee-0e48a2059994 req-e238bab6-2fb9-41f6-ae52-8b0aa22fb472 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Received event network-vif-deleted-835edda9-50d4-452c-83a7-3fd3911c42d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.895 2 DEBUG nova.scheduler.client.report [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.916 2 DEBUG oslo_concurrency.lockutils [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:56 np0005466012 nova_compute[192063]: 2025-10-02 12:31:56.953 2 INFO nova.scheduler.client.report [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Deleted allocations for instance 1f5bca08-a985-4674-b189-69cb180bfcfa#033[00m
Oct  2 08:31:57 np0005466012 nova_compute[192063]: 2025-10-02 12:31:57.048 2 DEBUG oslo_concurrency.lockutils [None req-0982f731-3c20-49e0-9d10-fa3b2c151c46 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "1f5bca08-a985-4674-b189-69cb180bfcfa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.979s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:31:57 np0005466012 nova_compute[192063]: 2025-10-02 12:31:57.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:31:58 np0005466012 nova_compute[192063]: 2025-10-02 12:31:58.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:00 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:00Z|00544|binding|INFO|Releasing lport 42ed37b7-bd98-49bd-999b-805cbf42cd4d from this chassis (sb_readonly=0)
Oct  2 08:32:00 np0005466012 nova_compute[192063]: 2025-10-02 12:32:00.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:00 np0005466012 nova_compute[192063]: 2025-10-02 12:32:00.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.149 2 DEBUG nova.compute.manager [req-0b9192ec-4ea0-4894-9967-88cfb2331055 req-0062e42b-8514-4adc-81d6-0bfc104c5457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Received event network-changed-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.149 2 DEBUG nova.compute.manager [req-0b9192ec-4ea0-4894-9967-88cfb2331055 req-0062e42b-8514-4adc-81d6-0bfc104c5457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Refreshing instance network info cache due to event network-changed-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.149 2 DEBUG oslo_concurrency.lockutils [req-0b9192ec-4ea0-4894-9967-88cfb2331055 req-0062e42b-8514-4adc-81d6-0bfc104c5457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.150 2 DEBUG oslo_concurrency.lockutils [req-0b9192ec-4ea0-4894-9967-88cfb2331055 req-0062e42b-8514-4adc-81d6-0bfc104c5457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.150 2 DEBUG nova.network.neutron [req-0b9192ec-4ea0-4894-9967-88cfb2331055 req-0062e42b-8514-4adc-81d6-0bfc104c5457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Refreshing network info cache for port c8d25dbc-b58e-4ab0-9b54-141b6fdd352c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.221 2 DEBUG oslo_concurrency.lockutils [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.222 2 DEBUG oslo_concurrency.lockutils [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.222 2 DEBUG oslo_concurrency.lockutils [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.223 2 DEBUG oslo_concurrency.lockutils [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.223 2 DEBUG oslo_concurrency.lockutils [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.238 2 INFO nova.compute.manager [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Terminating instance#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.254 2 DEBUG nova.compute.manager [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:32:01 np0005466012 kernel: tapc8d25dbc-b5 (unregistering): left promiscuous mode
Oct  2 08:32:01 np0005466012 NetworkManager[51207]: <info>  [1759408321.2794] device (tapc8d25dbc-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:32:01 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:01Z|00545|binding|INFO|Releasing lport c8d25dbc-b58e-4ab0-9b54-141b6fdd352c from this chassis (sb_readonly=0)
Oct  2 08:32:01 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:01Z|00546|binding|INFO|Setting lport c8d25dbc-b58e-4ab0-9b54-141b6fdd352c down in Southbound
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:01 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:01Z|00547|binding|INFO|Removing iface tapc8d25dbc-b5 ovn-installed in OVS
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.306 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:1d:9e 10.100.0.7'], port_security=['fa:16:3e:28:1d:9e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '278a6b24-7950-4f1b-9c36-8a6030b17e6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59ce980f-674e-478f-8a88-32561beb276a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c9faaa8-b796-47ef-a5cc-9fcd05d197d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30835f85-fac7-4f1d-86c6-e93cfff628ff, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=c8d25dbc-b58e-4ab0-9b54-141b6fdd352c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.310 103246 INFO neutron.agent.ovn.metadata.agent [-] Port c8d25dbc-b58e-4ab0-9b54-141b6fdd352c in datapath 59ce980f-674e-478f-8a88-32561beb276a unbound from our chassis#033[00m
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.312 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 59ce980f-674e-478f-8a88-32561beb276a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.314 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[899f4ac8-582f-4fe9-aff8-8c48af21c5e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.315 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-59ce980f-674e-478f-8a88-32561beb276a namespace which is not needed anymore#033[00m
Oct  2 08:32:01 np0005466012 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000088.scope: Deactivated successfully.
Oct  2 08:32:01 np0005466012 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000088.scope: Consumed 15.923s CPU time.
Oct  2 08:32:01 np0005466012 systemd-machined[152114]: Machine qemu-63-instance-00000088 terminated.
Oct  2 08:32:01 np0005466012 neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a[242310]: [NOTICE]   (242314) : haproxy version is 2.8.14-c23fe91
Oct  2 08:32:01 np0005466012 neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a[242310]: [NOTICE]   (242314) : path to executable is /usr/sbin/haproxy
Oct  2 08:32:01 np0005466012 neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a[242310]: [WARNING]  (242314) : Exiting Master process...
Oct  2 08:32:01 np0005466012 neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a[242310]: [ALERT]    (242314) : Current worker (242316) exited with code 143 (Terminated)
Oct  2 08:32:01 np0005466012 neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a[242310]: [WARNING]  (242314) : All workers exited. Exiting... (0)
Oct  2 08:32:01 np0005466012 systemd[1]: libpod-36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370.scope: Deactivated successfully.
Oct  2 08:32:01 np0005466012 podman[242998]: 2025-10-02 12:32:01.461893431 +0000 UTC m=+0.044509007 container died 36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:32:01 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370-userdata-shm.mount: Deactivated successfully.
Oct  2 08:32:01 np0005466012 systemd[1]: var-lib-containers-storage-overlay-09752d3289c5233dc5391dbfab852872fe342ce61e842169037da274f5c78ed2-merged.mount: Deactivated successfully.
Oct  2 08:32:01 np0005466012 podman[242998]: 2025-10-02 12:32:01.497305804 +0000 UTC m=+0.079921390 container cleanup 36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:32:01 np0005466012 systemd[1]: libpod-conmon-36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370.scope: Deactivated successfully.
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.514 2 INFO nova.virt.libvirt.driver [-] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Instance destroyed successfully.#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.514 2 DEBUG nova.objects.instance [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'resources' on Instance uuid 278a6b24-7950-4f1b-9c36-8a6030b17e6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:01 np0005466012 podman[243046]: 2025-10-02 12:32:01.567293947 +0000 UTC m=+0.042805469 container remove 36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.572 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[201e69f1-6727-434b-a421-9b272a2c1693]: (4, ('Thu Oct  2 12:32:01 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a (36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370)\n36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370\nThu Oct  2 12:32:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-59ce980f-674e-478f-8a88-32561beb276a (36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370)\n36e0198def6e34351b3929d040998302af7ee976e339414db2a8dad8a78b0370\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.574 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c18ab2fc-833c-4c27-ac46-be3bb59c6fba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.574 2 DEBUG nova.virt.libvirt.vif [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:30:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1991691227',display_name='tempest-TestNetworkBasicOps-server-1991691227',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1991691227',id=136,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNC/wrQNlb6KlN5Z6XL9qrRcQ/3+2UhlzIST7vbT+IX6q3g/aHALXbcl4WHv8B5x3DO8YryOAwWxfCs4B/hC3zlm+AKdkTIKvBvev1JrZYrxqMSTvoM5lzVVNSiunRoLZw==',key_name='tempest-TestNetworkBasicOps-862184460',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:30:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-gm2ceg0s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:30:52Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=278a6b24-7950-4f1b-9c36-8a6030b17e6d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "address": "fa:16:3e:28:1d:9e", "network": {"id": "59ce980f-674e-478f-8a88-32561beb276a", "bridge": "br-int", "label": "tempest-network-smoke--1065656187", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d25dbc-b5", "ovs_interfaceid": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.574 2 DEBUG nova.network.os_vif_util [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "address": "fa:16:3e:28:1d:9e", "network": {"id": "59ce980f-674e-478f-8a88-32561beb276a", "bridge": "br-int", "label": "tempest-network-smoke--1065656187", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d25dbc-b5", "ovs_interfaceid": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.574 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap59ce980f-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.575 2 DEBUG nova.network.os_vif_util [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:28:1d:9e,bridge_name='br-int',has_traffic_filtering=True,id=c8d25dbc-b58e-4ab0-9b54-141b6fdd352c,network=Network(59ce980f-674e-478f-8a88-32561beb276a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d25dbc-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.575 2 DEBUG os_vif [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:1d:9e,bridge_name='br-int',has_traffic_filtering=True,id=c8d25dbc-b58e-4ab0-9b54-141b6fdd352c,network=Network(59ce980f-674e-478f-8a88-32561beb276a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d25dbc-b5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.577 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8d25dbc-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:32:01 np0005466012 kernel: tap59ce980f-60: left promiscuous mode
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.594 2 INFO os_vif [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:1d:9e,bridge_name='br-int',has_traffic_filtering=True,id=c8d25dbc-b58e-4ab0-9b54-141b6fdd352c,network=Network(59ce980f-674e-478f-8a88-32561beb276a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d25dbc-b5')#033[00m
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.593 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f7d9af0c-ccdd-413f-a518-e97606ae43e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.594 2 INFO nova.virt.libvirt.driver [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Deleting instance files /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d_del#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.595 2 INFO nova.virt.libvirt.driver [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Deletion of /var/lib/nova/instances/278a6b24-7950-4f1b-9c36-8a6030b17e6d_del complete#033[00m
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.627 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7497a722-d862-414c-bbb9-b5a07b628475]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.628 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[db7a90e0-3494-4be8-a42e-9068d3f84447]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.643 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9be80835-647e-4c0a-b051-bbb2af147366]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624487, 'reachable_time': 44209, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243062, 'error': None, 'target': 'ovnmeta-59ce980f-674e-478f-8a88-32561beb276a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:01 np0005466012 systemd[1]: run-netns-ovnmeta\x2d59ce980f\x2d674e\x2d478f\x2d8a88\x2d32561beb276a.mount: Deactivated successfully.
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.645 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-59ce980f-674e-478f-8a88-32561beb276a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:32:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:01.646 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[4749eef4-d0ea-4084-bc23-555fcefe85e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.711 2 INFO nova.compute.manager [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.711 2 DEBUG oslo.service.loopingcall [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.712 2 DEBUG nova.compute.manager [-] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:32:01 np0005466012 nova_compute[192063]: 2025-10-02 12:32:01.713 2 DEBUG nova.network.neutron [-] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:32:02 np0005466012 nova_compute[192063]: 2025-10-02 12:32:02.013 2 DEBUG nova.compute.manager [req-34312be6-01fa-43f2-a921-3b6f80f31925 req-e12511f7-a34c-4b62-8f21-6fc19a46877d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Received event network-vif-unplugged-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:02 np0005466012 nova_compute[192063]: 2025-10-02 12:32:02.013 2 DEBUG oslo_concurrency.lockutils [req-34312be6-01fa-43f2-a921-3b6f80f31925 req-e12511f7-a34c-4b62-8f21-6fc19a46877d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:02 np0005466012 nova_compute[192063]: 2025-10-02 12:32:02.013 2 DEBUG oslo_concurrency.lockutils [req-34312be6-01fa-43f2-a921-3b6f80f31925 req-e12511f7-a34c-4b62-8f21-6fc19a46877d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:02 np0005466012 nova_compute[192063]: 2025-10-02 12:32:02.013 2 DEBUG oslo_concurrency.lockutils [req-34312be6-01fa-43f2-a921-3b6f80f31925 req-e12511f7-a34c-4b62-8f21-6fc19a46877d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:02 np0005466012 nova_compute[192063]: 2025-10-02 12:32:02.014 2 DEBUG nova.compute.manager [req-34312be6-01fa-43f2-a921-3b6f80f31925 req-e12511f7-a34c-4b62-8f21-6fc19a46877d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] No waiting events found dispatching network-vif-unplugged-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:32:02 np0005466012 nova_compute[192063]: 2025-10-02 12:32:02.014 2 DEBUG nova.compute.manager [req-34312be6-01fa-43f2-a921-3b6f80f31925 req-e12511f7-a34c-4b62-8f21-6fc19a46877d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Received event network-vif-unplugged-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:32:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:02.146 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:02.147 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:02.147 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:02 np0005466012 nova_compute[192063]: 2025-10-02 12:32:02.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:02 np0005466012 nova_compute[192063]: 2025-10-02 12:32:02.710 2 DEBUG nova.network.neutron [-] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:02 np0005466012 nova_compute[192063]: 2025-10-02 12:32:02.738 2 INFO nova.compute.manager [-] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Took 1.03 seconds to deallocate network for instance.#033[00m
Oct  2 08:32:02 np0005466012 nova_compute[192063]: 2025-10-02 12:32:02.860 2 DEBUG oslo_concurrency.lockutils [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:02 np0005466012 nova_compute[192063]: 2025-10-02 12:32:02.861 2 DEBUG oslo_concurrency.lockutils [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:02 np0005466012 nova_compute[192063]: 2025-10-02 12:32:02.943 2 DEBUG nova.compute.provider_tree [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:02 np0005466012 nova_compute[192063]: 2025-10-02 12:32:02.959 2 DEBUG nova.scheduler.client.report [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:32:03 np0005466012 nova_compute[192063]: 2025-10-02 12:32:03.000 2 DEBUG oslo_concurrency.lockutils [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:03 np0005466012 nova_compute[192063]: 2025-10-02 12:32:03.035 2 INFO nova.scheduler.client.report [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Deleted allocations for instance 278a6b24-7950-4f1b-9c36-8a6030b17e6d#033[00m
Oct  2 08:32:03 np0005466012 nova_compute[192063]: 2025-10-02 12:32:03.071 2 DEBUG nova.network.neutron [req-0b9192ec-4ea0-4894-9967-88cfb2331055 req-0062e42b-8514-4adc-81d6-0bfc104c5457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Updated VIF entry in instance network info cache for port c8d25dbc-b58e-4ab0-9b54-141b6fdd352c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:32:03 np0005466012 nova_compute[192063]: 2025-10-02 12:32:03.071 2 DEBUG nova.network.neutron [req-0b9192ec-4ea0-4894-9967-88cfb2331055 req-0062e42b-8514-4adc-81d6-0bfc104c5457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Updating instance_info_cache with network_info: [{"id": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "address": "fa:16:3e:28:1d:9e", "network": {"id": "59ce980f-674e-478f-8a88-32561beb276a", "bridge": "br-int", "label": "tempest-network-smoke--1065656187", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d25dbc-b5", "ovs_interfaceid": "c8d25dbc-b58e-4ab0-9b54-141b6fdd352c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:03 np0005466012 nova_compute[192063]: 2025-10-02 12:32:03.230 2 DEBUG oslo_concurrency.lockutils [req-0b9192ec-4ea0-4894-9967-88cfb2331055 req-0062e42b-8514-4adc-81d6-0bfc104c5457 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-278a6b24-7950-4f1b-9c36-8a6030b17e6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:03 np0005466012 nova_compute[192063]: 2025-10-02 12:32:03.257 2 DEBUG oslo_concurrency.lockutils [None req-a370836e-f641-4ff8-b027-93f5e1a418ae a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:03 np0005466012 nova_compute[192063]: 2025-10-02 12:32:03.671 2 DEBUG nova.compute.manager [req-05c48b9d-7dc1-45b6-bae6-6211f5e197ac req-3bd95bc9-445f-4fe6-a824-ce89c52f1c09 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Received event network-vif-deleted-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:03 np0005466012 nova_compute[192063]: 2025-10-02 12:32:03.671 2 INFO nova.compute.manager [req-05c48b9d-7dc1-45b6-bae6-6211f5e197ac req-3bd95bc9-445f-4fe6-a824-ce89c52f1c09 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Neutron deleted interface c8d25dbc-b58e-4ab0-9b54-141b6fdd352c; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:32:03 np0005466012 nova_compute[192063]: 2025-10-02 12:32:03.671 2 DEBUG nova.network.neutron [req-05c48b9d-7dc1-45b6-bae6-6211f5e197ac req-3bd95bc9-445f-4fe6-a824-ce89c52f1c09 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Oct  2 08:32:03 np0005466012 nova_compute[192063]: 2025-10-02 12:32:03.673 2 DEBUG nova.compute.manager [req-05c48b9d-7dc1-45b6-bae6-6211f5e197ac req-3bd95bc9-445f-4fe6-a824-ce89c52f1c09 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Detach interface failed, port_id=c8d25dbc-b58e-4ab0-9b54-141b6fdd352c, reason: Instance 278a6b24-7950-4f1b-9c36-8a6030b17e6d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:32:04 np0005466012 nova_compute[192063]: 2025-10-02 12:32:04.512 2 DEBUG nova.compute.manager [req-c89da1eb-cbf0-4f8d-a0c9-317c3c2dea56 req-16990395-c5f1-4e22-9971-1c80f0041509 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Received event network-vif-plugged-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:04 np0005466012 nova_compute[192063]: 2025-10-02 12:32:04.512 2 DEBUG oslo_concurrency.lockutils [req-c89da1eb-cbf0-4f8d-a0c9-317c3c2dea56 req-16990395-c5f1-4e22-9971-1c80f0041509 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:04 np0005466012 nova_compute[192063]: 2025-10-02 12:32:04.513 2 DEBUG oslo_concurrency.lockutils [req-c89da1eb-cbf0-4f8d-a0c9-317c3c2dea56 req-16990395-c5f1-4e22-9971-1c80f0041509 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:04 np0005466012 nova_compute[192063]: 2025-10-02 12:32:04.513 2 DEBUG oslo_concurrency.lockutils [req-c89da1eb-cbf0-4f8d-a0c9-317c3c2dea56 req-16990395-c5f1-4e22-9971-1c80f0041509 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "278a6b24-7950-4f1b-9c36-8a6030b17e6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:04 np0005466012 nova_compute[192063]: 2025-10-02 12:32:04.513 2 DEBUG nova.compute.manager [req-c89da1eb-cbf0-4f8d-a0c9-317c3c2dea56 req-16990395-c5f1-4e22-9971-1c80f0041509 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] No waiting events found dispatching network-vif-plugged-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:32:04 np0005466012 nova_compute[192063]: 2025-10-02 12:32:04.514 2 WARNING nova.compute.manager [req-c89da1eb-cbf0-4f8d-a0c9-317c3c2dea56 req-16990395-c5f1-4e22-9971-1c80f0041509 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Received unexpected event network-vif-plugged-c8d25dbc-b58e-4ab0-9b54-141b6fdd352c for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:32:06 np0005466012 nova_compute[192063]: 2025-10-02 12:32:06.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:06 np0005466012 nova_compute[192063]: 2025-10-02 12:32:06.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:06 np0005466012 nova_compute[192063]: 2025-10-02 12:32:06.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:07 np0005466012 nova_compute[192063]: 2025-10-02 12:32:07.352 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "e2d02092-9a5f-4575-875b-f7eba9b563db" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:07 np0005466012 nova_compute[192063]: 2025-10-02 12:32:07.352 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "e2d02092-9a5f-4575-875b-f7eba9b563db" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:07 np0005466012 nova_compute[192063]: 2025-10-02 12:32:07.372 2 DEBUG nova.compute.manager [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:32:07 np0005466012 nova_compute[192063]: 2025-10-02 12:32:07.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:07 np0005466012 nova_compute[192063]: 2025-10-02 12:32:07.535 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:07 np0005466012 nova_compute[192063]: 2025-10-02 12:32:07.535 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:07 np0005466012 nova_compute[192063]: 2025-10-02 12:32:07.543 2 DEBUG nova.virt.hardware [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:32:07 np0005466012 nova_compute[192063]: 2025-10-02 12:32:07.543 2 INFO nova.compute.claims [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:32:07 np0005466012 nova_compute[192063]: 2025-10-02 12:32:07.764 2 DEBUG nova.compute.provider_tree [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:07 np0005466012 nova_compute[192063]: 2025-10-02 12:32:07.806 2 DEBUG nova.scheduler.client.report [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:32:07 np0005466012 nova_compute[192063]: 2025-10-02 12:32:07.865 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.330s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:07 np0005466012 nova_compute[192063]: 2025-10-02 12:32:07.866 2 DEBUG nova.compute.manager [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.037 2 DEBUG nova.compute.manager [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.037 2 DEBUG nova.network.neutron [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.081 2 INFO nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.134 2 DEBUG nova.compute.manager [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.291 2 DEBUG nova.policy [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.399 2 DEBUG nova.compute.manager [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.400 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.401 2 INFO nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Creating image(s)
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.401 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "/var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.401 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "/var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.402 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "/var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.419 2 DEBUG oslo_concurrency.processutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.474 2 DEBUG oslo_concurrency.processutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.475 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.475 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.485 2 DEBUG oslo_concurrency.processutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.539 2 DEBUG oslo_concurrency.processutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.540 2 DEBUG oslo_concurrency.processutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.780 2 DEBUG oslo_concurrency.processutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk 1073741824" returned: 0 in 0.240s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.782 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.783 2 DEBUG oslo_concurrency.processutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.852 2 DEBUG oslo_concurrency.processutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.854 2 DEBUG nova.virt.disk.api [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Checking if we can resize image /var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.854 2 DEBUG oslo_concurrency.processutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.919 2 DEBUG oslo_concurrency.processutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.921 2 DEBUG nova.virt.disk.api [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Cannot resize image /var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  2 08:32:08 np0005466012 nova_compute[192063]: 2025-10-02 12:32:08.922 2 DEBUG nova.objects.instance [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'migration_context' on Instance uuid e2d02092-9a5f-4575-875b-f7eba9b563db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  2 08:32:09 np0005466012 nova_compute[192063]: 2025-10-02 12:32:09.072 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  2 08:32:09 np0005466012 nova_compute[192063]: 2025-10-02 12:32:09.073 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Ensure instance console log exists: /var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  2 08:32:09 np0005466012 nova_compute[192063]: 2025-10-02 12:32:09.074 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:32:09 np0005466012 nova_compute[192063]: 2025-10-02 12:32:09.074 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:32:09 np0005466012 nova_compute[192063]: 2025-10-02 12:32:09.074 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:09.409 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:32:09 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:09.410 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:32:09 np0005466012 nova_compute[192063]: 2025-10-02 12:32:09.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:09 np0005466012 nova_compute[192063]: 2025-10-02 12:32:09.673 2 DEBUG nova.network.neutron [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Successfully created port: 7250b426-b4bd-44c1-98ab-439149ec8d83 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  2 08:32:10 np0005466012 nova_compute[192063]: 2025-10-02 12:32:10.435 2 DEBUG nova.network.neutron [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Successfully updated port: 7250b426-b4bd-44c1-98ab-439149ec8d83 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  2 08:32:10 np0005466012 nova_compute[192063]: 2025-10-02 12:32:10.452 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "refresh_cache-e2d02092-9a5f-4575-875b-f7eba9b563db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:32:10 np0005466012 nova_compute[192063]: 2025-10-02 12:32:10.452 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquired lock "refresh_cache-e2d02092-9a5f-4575-875b-f7eba9b563db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:32:10 np0005466012 nova_compute[192063]: 2025-10-02 12:32:10.452 2 DEBUG nova.network.neutron [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  2 08:32:10 np0005466012 nova_compute[192063]: 2025-10-02 12:32:10.481 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408315.4802089, 1f5bca08-a985-4674-b189-69cb180bfcfa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:32:10 np0005466012 nova_compute[192063]: 2025-10-02 12:32:10.482 2 INFO nova.compute.manager [-] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] VM Stopped (Lifecycle Event)
Oct  2 08:32:10 np0005466012 nova_compute[192063]: 2025-10-02 12:32:10.522 2 DEBUG nova.compute.manager [None req-f60b74c8-02be-427d-b418-d53fbb99dc30 - - - - - -] [instance: 1f5bca08-a985-4674-b189-69cb180bfcfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:32:10 np0005466012 nova_compute[192063]: 2025-10-02 12:32:10.578 2 DEBUG nova.compute.manager [req-525b3cae-6162-4242-820c-da7d68a15d00 req-af50f552-ce36-4477-bd66-3c8afc71a38e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Received event network-changed-7250b426-b4bd-44c1-98ab-439149ec8d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:32:10 np0005466012 nova_compute[192063]: 2025-10-02 12:32:10.578 2 DEBUG nova.compute.manager [req-525b3cae-6162-4242-820c-da7d68a15d00 req-af50f552-ce36-4477-bd66-3c8afc71a38e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Refreshing instance network info cache due to event network-changed-7250b426-b4bd-44c1-98ab-439149ec8d83. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:32:10 np0005466012 nova_compute[192063]: 2025-10-02 12:32:10.578 2 DEBUG oslo_concurrency.lockutils [req-525b3cae-6162-4242-820c-da7d68a15d00 req-af50f552-ce36-4477-bd66-3c8afc71a38e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-e2d02092-9a5f-4575-875b-f7eba9b563db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:32:10 np0005466012 nova_compute[192063]: 2025-10-02 12:32:10.889 2 DEBUG nova.network.neutron [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  2 08:32:11 np0005466012 podman[243079]: 2025-10-02 12:32:11.1397617 +0000 UTC m=+0.057829347 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:32:11 np0005466012 podman[243080]: 2025-10-02 12:32:11.174116414 +0000 UTC m=+0.091000438 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0)
Oct  2 08:32:11 np0005466012 nova_compute[192063]: 2025-10-02 12:32:11.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.269 2 DEBUG nova.network.neutron [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Updating instance_info_cache with network_info: [{"id": "7250b426-b4bd-44c1-98ab-439149ec8d83", "address": "fa:16:3e:b0:16:0e", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7250b426-b4", "ovs_interfaceid": "7250b426-b4bd-44c1-98ab-439149ec8d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.303 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Releasing lock "refresh_cache-e2d02092-9a5f-4575-875b-f7eba9b563db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.304 2 DEBUG nova.compute.manager [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Instance network_info: |[{"id": "7250b426-b4bd-44c1-98ab-439149ec8d83", "address": "fa:16:3e:b0:16:0e", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7250b426-b4", "ovs_interfaceid": "7250b426-b4bd-44c1-98ab-439149ec8d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.304 2 DEBUG oslo_concurrency.lockutils [req-525b3cae-6162-4242-820c-da7d68a15d00 req-af50f552-ce36-4477-bd66-3c8afc71a38e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-e2d02092-9a5f-4575-875b-f7eba9b563db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.304 2 DEBUG nova.network.neutron [req-525b3cae-6162-4242-820c-da7d68a15d00 req-af50f552-ce36-4477-bd66-3c8afc71a38e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Refreshing network info cache for port 7250b426-b4bd-44c1-98ab-439149ec8d83 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.309 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Start _get_guest_xml network_info=[{"id": "7250b426-b4bd-44c1-98ab-439149ec8d83", "address": "fa:16:3e:b0:16:0e", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7250b426-b4", "ovs_interfaceid": "7250b426-b4bd-44c1-98ab-439149ec8d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.313 2 WARNING nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.317 2 DEBUG nova.virt.libvirt.host [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.318 2 DEBUG nova.virt.libvirt.host [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.322 2 DEBUG nova.virt.libvirt.host [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.323 2 DEBUG nova.virt.libvirt.host [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.324 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.324 2 DEBUG nova.virt.hardware [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.325 2 DEBUG nova.virt.hardware [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.325 2 DEBUG nova.virt.hardware [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.325 2 DEBUG nova.virt.hardware [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.326 2 DEBUG nova.virt.hardware [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.326 2 DEBUG nova.virt.hardware [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.326 2 DEBUG nova.virt.hardware [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.327 2 DEBUG nova.virt.hardware [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.327 2 DEBUG nova.virt.hardware [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.327 2 DEBUG nova.virt.hardware [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.327 2 DEBUG nova.virt.hardware [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.332 2 DEBUG nova.virt.libvirt.vif [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-980775282',display_name='tempest-ServersTestJSON-server-980775282',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-980775282',id=142,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-c42ptz3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:08Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=e2d02092-9a5f-4575-875b-f7eba9b563db,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7250b426-b4bd-44c1-98ab-439149ec8d83", "address": "fa:16:3e:b0:16:0e", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7250b426-b4", "ovs_interfaceid": "7250b426-b4bd-44c1-98ab-439149ec8d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.332 2 DEBUG nova.network.os_vif_util [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "7250b426-b4bd-44c1-98ab-439149ec8d83", "address": "fa:16:3e:b0:16:0e", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7250b426-b4", "ovs_interfaceid": "7250b426-b4bd-44c1-98ab-439149ec8d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.333 2 DEBUG nova.network.os_vif_util [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:16:0e,bridge_name='br-int',has_traffic_filtering=True,id=7250b426-b4bd-44c1-98ab-439149ec8d83,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7250b426-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.334 2 DEBUG nova.objects.instance [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'pci_devices' on Instance uuid e2d02092-9a5f-4575-875b-f7eba9b563db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.362 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  <uuid>e2d02092-9a5f-4575-875b-f7eba9b563db</uuid>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  <name>instance-0000008e</name>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServersTestJSON-server-980775282</nova:name>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:32:12</nova:creationTime>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:32:12 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:        <nova:user uuid="27daa263abb54d4d8e3ae34cd1c5ccf5">tempest-ServersTestJSON-1163535506-project-member</nova:user>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:        <nova:project uuid="a4a7099974504a798e1607c8e6a1f570">tempest-ServersTestJSON-1163535506</nova:project>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:        <nova:port uuid="7250b426-b4bd-44c1-98ab-439149ec8d83">
Oct  2 08:32:12 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <entry name="serial">e2d02092-9a5f-4575-875b-f7eba9b563db</entry>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <entry name="uuid">e2d02092-9a5f-4575-875b-f7eba9b563db</entry>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk.config"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:b0:16:0e"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <target dev="tap7250b426-b4"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/console.log" append="off"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:32:12 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:32:12 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:32:12 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:32:12 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.363 2 DEBUG nova.compute.manager [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Preparing to wait for external event network-vif-plugged-7250b426-b4bd-44c1-98ab-439149ec8d83 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.363 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "e2d02092-9a5f-4575-875b-f7eba9b563db-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.364 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "e2d02092-9a5f-4575-875b-f7eba9b563db-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.364 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "e2d02092-9a5f-4575-875b-f7eba9b563db-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.364 2 DEBUG nova.virt.libvirt.vif [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-980775282',display_name='tempest-ServersTestJSON-server-980775282',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-980775282',id=142,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-c42ptz3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:08Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=e2d02092-9a5f-4575-875b-f7eba9b563db,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7250b426-b4bd-44c1-98ab-439149ec8d83", "address": "fa:16:3e:b0:16:0e", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7250b426-b4", "ovs_interfaceid": "7250b426-b4bd-44c1-98ab-439149ec8d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.365 2 DEBUG nova.network.os_vif_util [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "7250b426-b4bd-44c1-98ab-439149ec8d83", "address": "fa:16:3e:b0:16:0e", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7250b426-b4", "ovs_interfaceid": "7250b426-b4bd-44c1-98ab-439149ec8d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.365 2 DEBUG nova.network.os_vif_util [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:16:0e,bridge_name='br-int',has_traffic_filtering=True,id=7250b426-b4bd-44c1-98ab-439149ec8d83,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7250b426-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.365 2 DEBUG os_vif [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:16:0e,bridge_name='br-int',has_traffic_filtering=True,id=7250b426-b4bd-44c1-98ab-439149ec8d83,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7250b426-b4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.366 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.367 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.369 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7250b426-b4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.369 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7250b426-b4, col_values=(('external_ids', {'iface-id': '7250b426-b4bd-44c1-98ab-439149ec8d83', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:16:0e', 'vm-uuid': 'e2d02092-9a5f-4575-875b-f7eba9b563db'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:12 np0005466012 NetworkManager[51207]: <info>  [1759408332.3733] manager: (tap7250b426-b4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.379 2 INFO os_vif [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:16:0e,bridge_name='br-int',has_traffic_filtering=True,id=7250b426-b4bd-44c1-98ab-439149ec8d83,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7250b426-b4')#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.452 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.453 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.453 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No VIF found with MAC fa:16:3e:b0:16:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.454 2 INFO nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Using config drive#033[00m
Oct  2 08:32:12 np0005466012 nova_compute[192063]: 2025-10-02 12:32:12.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:13 np0005466012 nova_compute[192063]: 2025-10-02 12:32:13.064 2 INFO nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Creating config drive at /var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk.config#033[00m
Oct  2 08:32:13 np0005466012 nova_compute[192063]: 2025-10-02 12:32:13.073 2 DEBUG oslo_concurrency.processutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9hek1ebo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:13 np0005466012 nova_compute[192063]: 2025-10-02 12:32:13.211 2 DEBUG oslo_concurrency.processutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9hek1ebo" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:13 np0005466012 kernel: tap7250b426-b4: entered promiscuous mode
Oct  2 08:32:13 np0005466012 NetworkManager[51207]: <info>  [1759408333.3094] manager: (tap7250b426-b4): new Tun device (/org/freedesktop/NetworkManager/Devices/249)
Oct  2 08:32:13 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:13Z|00548|binding|INFO|Claiming lport 7250b426-b4bd-44c1-98ab-439149ec8d83 for this chassis.
Oct  2 08:32:13 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:13Z|00549|binding|INFO|7250b426-b4bd-44c1-98ab-439149ec8d83: Claiming fa:16:3e:b0:16:0e 10.100.0.12
Oct  2 08:32:13 np0005466012 nova_compute[192063]: 2025-10-02 12:32:13.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.333 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:16:0e 10.100.0.12'], port_security=['fa:16:3e:b0:16:0e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'e2d02092-9a5f-4575-875b-f7eba9b563db', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a4a7099974504a798e1607c8e6a1f570', 'neutron:revision_number': '2', 'neutron:security_group_ids': '99e51855-93ef-45a8-a4a3-2b0a8aec1882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=498d5b4e-c711-4633-9705-7db30a0fb056, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=7250b426-b4bd-44c1-98ab-439149ec8d83) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.334 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 7250b426-b4bd-44c1-98ab-439149ec8d83 in datapath 1acf42c5-084c-4cc4-bdc5-910eec0249e3 bound to our chassis#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.336 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1acf42c5-084c-4cc4-bdc5-910eec0249e3#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.353 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[eb495ece-38a4-4969-bd2a-94301fff9b13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.354 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1acf42c5-01 in ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.357 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1acf42c5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.357 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f2684782-14da-4980-9219-40dc72bc0ae8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.358 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[639558a3-fa1c-4a91-805c-cbf957aa0eba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 systemd-udevd[243169]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:32:13 np0005466012 NetworkManager[51207]: <info>  [1759408333.3721] device (tap7250b426-b4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:32:13 np0005466012 NetworkManager[51207]: <info>  [1759408333.3731] device (tap7250b426-b4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.378 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[210167e7-9e07-4b86-abc5-7aa28334d49c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 nova_compute[192063]: 2025-10-02 12:32:13.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:13 np0005466012 podman[243140]: 2025-10-02 12:32:13.390972486 +0000 UTC m=+0.087367297 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:32:13 np0005466012 systemd-machined[152114]: New machine qemu-65-instance-0000008e.
Oct  2 08:32:13 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:13Z|00550|binding|INFO|Setting lport 7250b426-b4bd-44c1-98ab-439149ec8d83 ovn-installed in OVS
Oct  2 08:32:13 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:13Z|00551|binding|INFO|Setting lport 7250b426-b4bd-44c1-98ab-439149ec8d83 up in Southbound
Oct  2 08:32:13 np0005466012 nova_compute[192063]: 2025-10-02 12:32:13.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.399 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[574509e2-0400-4237-ab9b-70ebf47e49bb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 systemd[1]: Started Virtual Machine qemu-65-instance-0000008e.
Oct  2 08:32:13 np0005466012 podman[243139]: 2025-10-02 12:32:13.438985939 +0000 UTC m=+0.137450857 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.439 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[4292b225-781d-4db6-ab82-e3faa35c98d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.446 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fe50ce80-b8c8-4cbe-8da2-9d60ffcf80d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 NetworkManager[51207]: <info>  [1759408333.4469] manager: (tap1acf42c5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/250)
Oct  2 08:32:13 np0005466012 systemd-udevd[243175]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.477 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[aa64bee7-f915-4ee7-8b07-be80f1a64a2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.481 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[177484ed-3552-4b48-8bf6-1dd241f69b53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 NetworkManager[51207]: <info>  [1759408333.5068] device (tap1acf42c5-00): carrier: link connected
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.513 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[60cbed71-3391-4196-b4df-a5adbc4dc481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.531 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9abe2507-5236-45f0-ac28-8e17256225bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1acf42c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:5b:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632711, 'reachable_time': 24207, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243216, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.548 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[edf85be7-ccf2-47f7-96c6-e2473aba3544]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:5bcd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 632711, 'tstamp': 632711}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243217, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.566 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5063e7f7-5b7f-4353-9716-067146b42a84]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1acf42c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:5b:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632711, 'reachable_time': 24207, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243218, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.602 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[10e05d84-6380-4d7c-8f31-1a19f2797f88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.653 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d82aa4ad-8e4c-4204-a3c8-6d9906311b2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.655 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1acf42c5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.655 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.655 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1acf42c5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:13 np0005466012 kernel: tap1acf42c5-00: entered promiscuous mode
Oct  2 08:32:13 np0005466012 nova_compute[192063]: 2025-10-02 12:32:13.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:13 np0005466012 NetworkManager[51207]: <info>  [1759408333.6881] manager: (tap1acf42c5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Oct  2 08:32:13 np0005466012 nova_compute[192063]: 2025-10-02 12:32:13.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.690 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1acf42c5-00, col_values=(('external_ids', {'iface-id': 'c198cb2e-a850-46e4-8295-a2f9c280ee53'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:13 np0005466012 nova_compute[192063]: 2025-10-02 12:32:13.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:13 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:13Z|00552|binding|INFO|Releasing lport c198cb2e-a850-46e4-8295-a2f9c280ee53 from this chassis (sb_readonly=0)
Oct  2 08:32:13 np0005466012 nova_compute[192063]: 2025-10-02 12:32:13.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.705 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.706 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d0417faa-2e8d-4c2c-b259-7538a944fb9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.707 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-1acf42c5-084c-4cc4-bdc5-910eec0249e3
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 1acf42c5-084c-4cc4-bdc5-910eec0249e3
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  2 08:32:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:13.707 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'env', 'PROCESS_TAG=haproxy-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1acf42c5-084c-4cc4-bdc5-910eec0249e3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  2 08:32:14 np0005466012 podman[243257]: 2025-10-02 12:32:14.053832987 +0000 UTC m=+0.045298228 container create 84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:32:14 np0005466012 systemd[1]: Started libpod-conmon-84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083.scope.
Oct  2 08:32:14 np0005466012 podman[243257]: 2025-10-02 12:32:14.029120931 +0000 UTC m=+0.020586202 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:32:14 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:32:14 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e346817365eaff7c2c928f343f13f39dbbe6babef0c6431830005fb5fb80dd60/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:32:14 np0005466012 podman[243257]: 2025-10-02 12:32:14.152195008 +0000 UTC m=+0.143660339 container init 84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:32:14 np0005466012 podman[243257]: 2025-10-02 12:32:14.162463113 +0000 UTC m=+0.153928394 container start 84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:32:14 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243273]: [NOTICE]   (243277) : New worker (243279) forked
Oct  2 08:32:14 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243273]: [NOTICE]   (243277) : Loading success.
Oct  2 08:32:14 np0005466012 nova_compute[192063]: 2025-10-02 12:32:14.290 2 DEBUG nova.network.neutron [req-525b3cae-6162-4242-820c-da7d68a15d00 req-af50f552-ce36-4477-bd66-3c8afc71a38e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Updated VIF entry in instance network info cache for port 7250b426-b4bd-44c1-98ab-439149ec8d83. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:32:14 np0005466012 nova_compute[192063]: 2025-10-02 12:32:14.291 2 DEBUG nova.network.neutron [req-525b3cae-6162-4242-820c-da7d68a15d00 req-af50f552-ce36-4477-bd66-3c8afc71a38e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Updating instance_info_cache with network_info: [{"id": "7250b426-b4bd-44c1-98ab-439149ec8d83", "address": "fa:16:3e:b0:16:0e", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7250b426-b4", "ovs_interfaceid": "7250b426-b4bd-44c1-98ab-439149ec8d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:32:14 np0005466012 nova_compute[192063]: 2025-10-02 12:32:14.329 2 DEBUG oslo_concurrency.lockutils [req-525b3cae-6162-4242-820c-da7d68a15d00 req-af50f552-ce36-4477-bd66-3c8afc71a38e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-e2d02092-9a5f-4575-875b-f7eba9b563db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:32:14 np0005466012 nova_compute[192063]: 2025-10-02 12:32:14.396 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408334.3960545, e2d02092-9a5f-4575-875b-f7eba9b563db => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:32:14 np0005466012 nova_compute[192063]: 2025-10-02 12:32:14.397 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] VM Started (Lifecycle Event)
Oct  2 08:32:14 np0005466012 nova_compute[192063]: 2025-10-02 12:32:14.439 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:32:14 np0005466012 nova_compute[192063]: 2025-10-02 12:32:14.443 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408334.3988817, e2d02092-9a5f-4575-875b-f7eba9b563db => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:32:14 np0005466012 nova_compute[192063]: 2025-10-02 12:32:14.444 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] VM Paused (Lifecycle Event)
Oct  2 08:32:14 np0005466012 nova_compute[192063]: 2025-10-02 12:32:14.488 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:32:14 np0005466012 nova_compute[192063]: 2025-10-02 12:32:14.492 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:32:14 np0005466012 nova_compute[192063]: 2025-10-02 12:32:14.536 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.074 2 DEBUG nova.compute.manager [req-a932a498-4397-4316-a7d9-2bd2bb497682 req-46daa40b-762a-4d99-9492-129f5c5e6e2e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Received event network-vif-plugged-7250b426-b4bd-44c1-98ab-439149ec8d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.074 2 DEBUG oslo_concurrency.lockutils [req-a932a498-4397-4316-a7d9-2bd2bb497682 req-46daa40b-762a-4d99-9492-129f5c5e6e2e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e2d02092-9a5f-4575-875b-f7eba9b563db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.075 2 DEBUG oslo_concurrency.lockutils [req-a932a498-4397-4316-a7d9-2bd2bb497682 req-46daa40b-762a-4d99-9492-129f5c5e6e2e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e2d02092-9a5f-4575-875b-f7eba9b563db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.075 2 DEBUG oslo_concurrency.lockutils [req-a932a498-4397-4316-a7d9-2bd2bb497682 req-46daa40b-762a-4d99-9492-129f5c5e6e2e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e2d02092-9a5f-4575-875b-f7eba9b563db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.075 2 DEBUG nova.compute.manager [req-a932a498-4397-4316-a7d9-2bd2bb497682 req-46daa40b-762a-4d99-9492-129f5c5e6e2e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Processing event network-vif-plugged-7250b426-b4bd-44c1-98ab-439149ec8d83 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.076 2 DEBUG nova.compute.manager [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.080 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408336.080391, e2d02092-9a5f-4575-875b-f7eba9b563db => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.080 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] VM Resumed (Lifecycle Event)
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.082 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.086 2 INFO nova.virt.libvirt.driver [-] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Instance spawned successfully.
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.086 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.109 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.119 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.123 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.124 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.124 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.125 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.126 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.127 2 DEBUG nova.virt.libvirt.driver [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.156 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.201 2 INFO nova.compute.manager [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Took 7.80 seconds to spawn the instance on the hypervisor.
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.202 2 DEBUG nova.compute.manager [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.316 2 INFO nova.compute.manager [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Took 8.88 seconds to build instance.
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.343 2 DEBUG oslo_concurrency.lockutils [None req-a139fd40-8b51-4039-bfbc-ea7b52b251e9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "e2d02092-9a5f-4575-875b-f7eba9b563db" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.991s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:16.413 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.513 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408321.5121126, 278a6b24-7950-4f1b-9c36-8a6030b17e6d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.513 2 INFO nova.compute.manager [-] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] VM Stopped (Lifecycle Event)
Oct  2 08:32:16 np0005466012 nova_compute[192063]: 2025-10-02 12:32:16.545 2 DEBUG nova.compute.manager [None req-f61cfbc2-88b7-4e04-a690-f93472303a33 - - - - - -] [instance: 278a6b24-7950-4f1b-9c36-8a6030b17e6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:32:17 np0005466012 nova_compute[192063]: 2025-10-02 12:32:17.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:17 np0005466012 nova_compute[192063]: 2025-10-02 12:32:17.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.097 2 DEBUG oslo_concurrency.lockutils [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "e2d02092-9a5f-4575-875b-f7eba9b563db" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.098 2 DEBUG oslo_concurrency.lockutils [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "e2d02092-9a5f-4575-875b-f7eba9b563db" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.099 2 DEBUG oslo_concurrency.lockutils [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "e2d02092-9a5f-4575-875b-f7eba9b563db-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.099 2 DEBUG oslo_concurrency.lockutils [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "e2d02092-9a5f-4575-875b-f7eba9b563db-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.100 2 DEBUG oslo_concurrency.lockutils [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "e2d02092-9a5f-4575-875b-f7eba9b563db-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.115 2 INFO nova.compute.manager [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Terminating instance
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.127 2 DEBUG nova.compute.manager [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:32:18 np0005466012 kernel: tap7250b426-b4 (unregistering): left promiscuous mode
Oct  2 08:32:18 np0005466012 NetworkManager[51207]: <info>  [1759408338.1561] device (tap7250b426-b4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:32:18 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:18Z|00553|binding|INFO|Releasing lport 7250b426-b4bd-44c1-98ab-439149ec8d83 from this chassis (sb_readonly=0)
Oct  2 08:32:18 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:18Z|00554|binding|INFO|Setting lport 7250b426-b4bd-44c1-98ab-439149ec8d83 down in Southbound
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:18 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:18Z|00555|binding|INFO|Removing iface tap7250b426-b4 ovn-installed in OVS
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.187 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:16:0e 10.100.0.12'], port_security=['fa:16:3e:b0:16:0e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'e2d02092-9a5f-4575-875b-f7eba9b563db', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a4a7099974504a798e1607c8e6a1f570', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99e51855-93ef-45a8-a4a3-2b0a8aec1882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=498d5b4e-c711-4633-9705-7db30a0fb056, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=7250b426-b4bd-44c1-98ab-439149ec8d83) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.191 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 7250b426-b4bd-44c1-98ab-439149ec8d83 in datapath 1acf42c5-084c-4cc4-bdc5-910eec0249e3 unbound from our chassis
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.194 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1acf42c5-084c-4cc4-bdc5-910eec0249e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.195 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f5dc194a-b22b-41c3-a088-828a09b16ad1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.196 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 namespace which is not needed anymore
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.207 2 DEBUG nova.compute.manager [req-ffb0fbba-0705-47ce-be61-e6c8174d0d53 req-c9cddc34-e4b7-411a-9cbf-8df34bdf5944 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Received event network-vif-plugged-7250b426-b4bd-44c1-98ab-439149ec8d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.208 2 DEBUG oslo_concurrency.lockutils [req-ffb0fbba-0705-47ce-be61-e6c8174d0d53 req-c9cddc34-e4b7-411a-9cbf-8df34bdf5944 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e2d02092-9a5f-4575-875b-f7eba9b563db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.208 2 DEBUG oslo_concurrency.lockutils [req-ffb0fbba-0705-47ce-be61-e6c8174d0d53 req-c9cddc34-e4b7-411a-9cbf-8df34bdf5944 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e2d02092-9a5f-4575-875b-f7eba9b563db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.209 2 DEBUG oslo_concurrency.lockutils [req-ffb0fbba-0705-47ce-be61-e6c8174d0d53 req-c9cddc34-e4b7-411a-9cbf-8df34bdf5944 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e2d02092-9a5f-4575-875b-f7eba9b563db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.209 2 DEBUG nova.compute.manager [req-ffb0fbba-0705-47ce-be61-e6c8174d0d53 req-c9cddc34-e4b7-411a-9cbf-8df34bdf5944 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] No waiting events found dispatching network-vif-plugged-7250b426-b4bd-44c1-98ab-439149ec8d83 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.210 2 WARNING nova.compute.manager [req-ffb0fbba-0705-47ce-be61-e6c8174d0d53 req-c9cddc34-e4b7-411a-9cbf-8df34bdf5944 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Received unexpected event network-vif-plugged-7250b426-b4bd-44c1-98ab-439149ec8d83 for instance with vm_state active and task_state deleting.
Oct  2 08:32:18 np0005466012 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000008e.scope: Deactivated successfully.
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:18 np0005466012 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000008e.scope: Consumed 3.013s CPU time.
Oct  2 08:32:18 np0005466012 systemd-machined[152114]: Machine qemu-65-instance-0000008e terminated.
Oct  2 08:32:18 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243273]: [NOTICE]   (243277) : haproxy version is 2.8.14-c23fe91
Oct  2 08:32:18 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243273]: [NOTICE]   (243277) : path to executable is /usr/sbin/haproxy
Oct  2 08:32:18 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243273]: [WARNING]  (243277) : Exiting Master process...
Oct  2 08:32:18 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243273]: [ALERT]    (243277) : Current worker (243279) exited with code 143 (Terminated)
Oct  2 08:32:18 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243273]: [WARNING]  (243277) : All workers exited. Exiting... (0)
Oct  2 08:32:18 np0005466012 systemd[1]: libpod-84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083.scope: Deactivated successfully.
Oct  2 08:32:18 np0005466012 podman[243312]: 2025-10-02 12:32:18.355100015 +0000 UTC m=+0.051587663 container died 84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:32:18 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083-userdata-shm.mount: Deactivated successfully.
Oct  2 08:32:18 np0005466012 systemd[1]: var-lib-containers-storage-overlay-e346817365eaff7c2c928f343f13f39dbbe6babef0c6431830005fb5fb80dd60-merged.mount: Deactivated successfully.
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.389 2 INFO nova.virt.libvirt.driver [-] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Instance destroyed successfully.#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.394 2 DEBUG nova.objects.instance [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'resources' on Instance uuid e2d02092-9a5f-4575-875b-f7eba9b563db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.408 2 DEBUG nova.virt.libvirt.vif [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-980775282',display_name='tempest-ServersTestJSON-server-980775282',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-980775282',id=142,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-c42ptz3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',imag
e_min_ram='0',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:32:16Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=e2d02092-9a5f-4575-875b-f7eba9b563db,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7250b426-b4bd-44c1-98ab-439149ec8d83", "address": "fa:16:3e:b0:16:0e", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7250b426-b4", "ovs_interfaceid": "7250b426-b4bd-44c1-98ab-439149ec8d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.409 2 DEBUG nova.network.os_vif_util [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "7250b426-b4bd-44c1-98ab-439149ec8d83", "address": "fa:16:3e:b0:16:0e", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7250b426-b4", "ovs_interfaceid": "7250b426-b4bd-44c1-98ab-439149ec8d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.410 2 DEBUG nova.network.os_vif_util [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:16:0e,bridge_name='br-int',has_traffic_filtering=True,id=7250b426-b4bd-44c1-98ab-439149ec8d83,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7250b426-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.410 2 DEBUG os_vif [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:16:0e,bridge_name='br-int',has_traffic_filtering=True,id=7250b426-b4bd-44c1-98ab-439149ec8d83,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7250b426-b4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.412 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7250b426-b4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:18 np0005466012 podman[243312]: 2025-10-02 12:32:18.413076634 +0000 UTC m=+0.109564272 container cleanup 84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:18 np0005466012 systemd[1]: libpod-conmon-84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083.scope: Deactivated successfully.
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.461 2 INFO os_vif [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:16:0e,bridge_name='br-int',has_traffic_filtering=True,id=7250b426-b4bd-44c1-98ab-439149ec8d83,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7250b426-b4')#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.462 2 INFO nova.virt.libvirt.driver [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Deleting instance files /var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db_del#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.463 2 INFO nova.virt.libvirt.driver [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Deletion of /var/lib/nova/instances/e2d02092-9a5f-4575-875b-f7eba9b563db_del complete#033[00m
Oct  2 08:32:18 np0005466012 podman[243357]: 2025-10-02 12:32:18.524437106 +0000 UTC m=+0.046994096 container remove 84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.530 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b4d2ed4b-4456-4625-8590-65e697f80692]: (4, ('Thu Oct  2 12:32:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 (84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083)\n84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083\nThu Oct  2 12:32:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 (84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083)\n84637d8d28194216443915901db2ec583276a8b4e111315a4d15d482270be083\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.532 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bd82fbda-61a0-427c-b849-e45c5abcd68b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.532 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1acf42c5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:18 np0005466012 kernel: tap1acf42c5-00: left promiscuous mode
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.552 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7bb319c7-13eb-4e78-b8bd-8e0d692050e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.565 2 INFO nova.compute.manager [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.565 2 DEBUG oslo.service.loopingcall [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.566 2 DEBUG nova.compute.manager [-] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:32:18 np0005466012 nova_compute[192063]: 2025-10-02 12:32:18.566 2 DEBUG nova.network.neutron [-] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.582 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c04f011f-84a7-4e76-90f8-4fe3052d10db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.583 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5040381d-542f-4156-8495-3d110ef335a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.598 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ff96a3fb-3e5b-4465-9946-41dda5838755]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632704, 'reachable_time': 15696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243373, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.600 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:32:18 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:18.600 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[54bc7c49-5bb2-4e3d-84cb-c12c9a426d9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:18 np0005466012 systemd[1]: run-netns-ovnmeta\x2d1acf42c5\x2d084c\x2d4cc4\x2dbdc5\x2d910eec0249e3.mount: Deactivated successfully.
Oct  2 08:32:19 np0005466012 nova_compute[192063]: 2025-10-02 12:32:19.549 2 DEBUG nova.network.neutron [-] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:19 np0005466012 nova_compute[192063]: 2025-10-02 12:32:19.569 2 INFO nova.compute.manager [-] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Took 1.00 seconds to deallocate network for instance.#033[00m
Oct  2 08:32:19 np0005466012 nova_compute[192063]: 2025-10-02 12:32:19.839 2 DEBUG nova.compute.manager [req-7e9d1145-fd2b-4fb1-a6ee-b715b2959ae1 req-c70b95ef-e94a-48ef-af96-2f0cc7898eba 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Received event network-vif-deleted-7250b426-b4bd-44c1-98ab-439149ec8d83 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:19 np0005466012 nova_compute[192063]: 2025-10-02 12:32:19.841 2 DEBUG oslo_concurrency.lockutils [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:19 np0005466012 nova_compute[192063]: 2025-10-02 12:32:19.842 2 DEBUG oslo_concurrency.lockutils [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:19 np0005466012 nova_compute[192063]: 2025-10-02 12:32:19.900 2 DEBUG nova.compute.provider_tree [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:19 np0005466012 nova_compute[192063]: 2025-10-02 12:32:19.941 2 DEBUG nova.scheduler.client.report [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:32:20 np0005466012 nova_compute[192063]: 2025-10-02 12:32:20.063 2 DEBUG oslo_concurrency.lockutils [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:20 np0005466012 nova_compute[192063]: 2025-10-02 12:32:20.166 2 INFO nova.scheduler.client.report [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Deleted allocations for instance e2d02092-9a5f-4575-875b-f7eba9b563db#033[00m
Oct  2 08:32:20 np0005466012 nova_compute[192063]: 2025-10-02 12:32:20.351 2 DEBUG oslo_concurrency.lockutils [None req-2b6d845a-646b-4ef3-a441-1710730fe502 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "e2d02092-9a5f-4575-875b-f7eba9b563db" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:22 np0005466012 nova_compute[192063]: 2025-10-02 12:32:22.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:23 np0005466012 nova_compute[192063]: 2025-10-02 12:32:23.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:24 np0005466012 podman[243377]: 2025-10-02 12:32:24.161197317 +0000 UTC m=+0.071531146 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:32:24 np0005466012 podman[243378]: 2025-10-02 12:32:24.16559667 +0000 UTC m=+0.076168115 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm)
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.189 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "1aa209df-0181-4837-8968-a256ec63b072" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.190 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.220 2 DEBUG nova.compute.manager [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.283 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.284 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.308 2 DEBUG nova.compute.manager [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.387 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.388 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.397 2 DEBUG nova.virt.hardware [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.398 2 INFO nova.compute.claims [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.466 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.629 2 DEBUG nova.compute.provider_tree [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.644 2 DEBUG nova.scheduler.client.report [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.670 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.670 2 DEBUG nova.compute.manager [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.672 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.679 2 DEBUG nova.virt.hardware [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.680 2 INFO nova.compute.claims [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.743 2 DEBUG nova.compute.manager [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.743 2 DEBUG nova.network.neutron [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.770 2 INFO nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.791 2 DEBUG nova.compute.manager [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.881 2 DEBUG nova.compute.provider_tree [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.898 2 DEBUG nova.scheduler.client.report [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.913 2 DEBUG nova.compute.manager [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.914 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.914 2 INFO nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Creating image(s)#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.915 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "/var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.915 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "/var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.916 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "/var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.926 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.927 2 DEBUG nova.compute.manager [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.929 2 DEBUG oslo_concurrency.processutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.968 2 DEBUG nova.policy [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.984 2 DEBUG oslo_concurrency.processutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.985 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.986 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:24 np0005466012 nova_compute[192063]: 2025-10-02 12:32:24.999 2 DEBUG oslo_concurrency.processutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.020 2 DEBUG nova.compute.manager [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.021 2 DEBUG nova.network.neutron [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.040 2 INFO nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.055 2 DEBUG oslo_concurrency.processutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.056 2 DEBUG oslo_concurrency.processutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.073 2 DEBUG nova.compute.manager [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.144 2 DEBUG oslo_concurrency.processutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk 1073741824" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.146 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.146 2 DEBUG oslo_concurrency.processutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.202 2 DEBUG oslo_concurrency.processutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.203 2 DEBUG nova.virt.disk.api [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Checking if we can resize image /var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.203 2 DEBUG oslo_concurrency.processutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.222 2 DEBUG nova.compute.manager [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.225 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.225 2 INFO nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Creating image(s)#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.226 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.226 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.227 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.243 2 DEBUG oslo_concurrency.processutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.262 2 DEBUG oslo_concurrency.processutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.263 2 DEBUG nova.virt.disk.api [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Cannot resize image /var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.263 2 DEBUG nova.objects.instance [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'migration_context' on Instance uuid 1aa209df-0181-4837-8968-a256ec63b072 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.294 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.295 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Ensure instance console log exists: /var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.295 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.295 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.296 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.297 2 DEBUG oslo_concurrency.processutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.297 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.297 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.307 2 DEBUG oslo_concurrency.processutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.325 2 DEBUG nova.policy [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.367 2 DEBUG oslo_concurrency.processutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.368 2 DEBUG oslo_concurrency.processutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.404 2 DEBUG oslo_concurrency.processutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.406 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.406 2 DEBUG oslo_concurrency.processutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.496 2 DEBUG oslo_concurrency.processutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.497 2 DEBUG nova.virt.disk.api [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Checking if we can resize image /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.498 2 DEBUG oslo_concurrency.processutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.564 2 DEBUG oslo_concurrency.processutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.565 2 DEBUG nova.virt.disk.api [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Cannot resize image /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.565 2 DEBUG nova.objects.instance [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'migration_context' on Instance uuid 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.609 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.610 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Ensure instance console log exists: /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.610 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.610 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:25 np0005466012 nova_compute[192063]: 2025-10-02 12:32:25.611 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:26 np0005466012 nova_compute[192063]: 2025-10-02 12:32:26.002 2 DEBUG nova.network.neutron [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Successfully created port: a0f7121e-c583-4afe-887a-d9ceec5a6d1d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:32:26 np0005466012 podman[243444]: 2025-10-02 12:32:26.169096318 +0000 UTC m=+0.074225282 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:32:26 np0005466012 podman[243445]: 2025-10-02 12:32:26.175073434 +0000 UTC m=+0.077529284 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:32:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:26.246 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:76:2a 2001:db8:0:1:f816:3eff:fed0:762a 2001:db8::f816:3eff:fed0:762a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fed0:762a/64 2001:db8::f816:3eff:fed0:762a/64', 'neutron:device_id': 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=512667a6-6958-4dd6-8891-fcda7d607ab5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=763e1f51-8560-461a-a2f3-3c284c8e5a17) old=Port_Binding(mac=['fa:16:3e:d0:76:2a 2001:db8::f816:3eff:fed0:762a'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed0:762a/64', 'neutron:device_id': 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:32:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:26.247 103246 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 763e1f51-8560-461a-a2f3-3c284c8e5a17 in datapath f55e0845-fc62-481d-a70d-8546faf2b8fb updated#033[00m
Oct  2 08:32:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:26.249 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f55e0845-fc62-481d-a70d-8546faf2b8fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:32:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:26.250 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c8aef397-3c4b-4326-9822-a81726c4c990]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:26 np0005466012 nova_compute[192063]: 2025-10-02 12:32:26.919 2 DEBUG nova.network.neutron [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Successfully created port: d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:32:27 np0005466012 nova_compute[192063]: 2025-10-02 12:32:27.462 2 DEBUG nova.network.neutron [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Successfully updated port: a0f7121e-c583-4afe-887a-d9ceec5a6d1d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:32:27 np0005466012 nova_compute[192063]: 2025-10-02 12:32:27.490 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "refresh_cache-1aa209df-0181-4837-8968-a256ec63b072" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:27 np0005466012 nova_compute[192063]: 2025-10-02 12:32:27.490 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquired lock "refresh_cache-1aa209df-0181-4837-8968-a256ec63b072" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:27 np0005466012 nova_compute[192063]: 2025-10-02 12:32:27.490 2 DEBUG nova.network.neutron [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:32:27 np0005466012 nova_compute[192063]: 2025-10-02 12:32:27.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:27 np0005466012 nova_compute[192063]: 2025-10-02 12:32:27.644 2 DEBUG nova.compute.manager [req-e43823c8-2a34-42fc-89b8-ac3b1bfec134 req-bc9a7305-2403-483a-9f44-dbc97886e0b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Received event network-changed-a0f7121e-c583-4afe-887a-d9ceec5a6d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:27 np0005466012 nova_compute[192063]: 2025-10-02 12:32:27.644 2 DEBUG nova.compute.manager [req-e43823c8-2a34-42fc-89b8-ac3b1bfec134 req-bc9a7305-2403-483a-9f44-dbc97886e0b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Refreshing instance network info cache due to event network-changed-a0f7121e-c583-4afe-887a-d9ceec5a6d1d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:32:27 np0005466012 nova_compute[192063]: 2025-10-02 12:32:27.645 2 DEBUG oslo_concurrency.lockutils [req-e43823c8-2a34-42fc-89b8-ac3b1bfec134 req-bc9a7305-2403-483a-9f44-dbc97886e0b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-1aa209df-0181-4837-8968-a256ec63b072" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:27 np0005466012 nova_compute[192063]: 2025-10-02 12:32:27.733 2 DEBUG nova.network.neutron [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.130 2 DEBUG nova.network.neutron [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Successfully updated port: d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.205 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.205 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquired lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.206 2 DEBUG nova.network.neutron [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.489 2 DEBUG nova.network.neutron [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.827 2 DEBUG nova.network.neutron [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Updating instance_info_cache with network_info: [{"id": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "address": "fa:16:3e:5b:13:a2", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f7121e-c5", "ovs_interfaceid": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.934 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Releasing lock "refresh_cache-1aa209df-0181-4837-8968-a256ec63b072" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.934 2 DEBUG nova.compute.manager [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Instance network_info: |[{"id": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "address": "fa:16:3e:5b:13:a2", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f7121e-c5", "ovs_interfaceid": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.935 2 DEBUG oslo_concurrency.lockutils [req-e43823c8-2a34-42fc-89b8-ac3b1bfec134 req-bc9a7305-2403-483a-9f44-dbc97886e0b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-1aa209df-0181-4837-8968-a256ec63b072" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.936 2 DEBUG nova.network.neutron [req-e43823c8-2a34-42fc-89b8-ac3b1bfec134 req-bc9a7305-2403-483a-9f44-dbc97886e0b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Refreshing network info cache for port a0f7121e-c583-4afe-887a-d9ceec5a6d1d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.940 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Start _get_guest_xml network_info=[{"id": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "address": "fa:16:3e:5b:13:a2", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f7121e-c5", "ovs_interfaceid": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.949 2 WARNING nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.955 2 DEBUG nova.virt.libvirt.host [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.956 2 DEBUG nova.virt.libvirt.host [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.962 2 DEBUG nova.virt.libvirt.host [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.962 2 DEBUG nova.virt.libvirt.host [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.964 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.965 2 DEBUG nova.virt.hardware [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.966 2 DEBUG nova.virt.hardware [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.966 2 DEBUG nova.virt.hardware [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.967 2 DEBUG nova.virt.hardware [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.967 2 DEBUG nova.virt.hardware [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.967 2 DEBUG nova.virt.hardware [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.968 2 DEBUG nova.virt.hardware [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.969 2 DEBUG nova.virt.hardware [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.969 2 DEBUG nova.virt.hardware [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.969 2 DEBUG nova.virt.hardware [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.970 2 DEBUG nova.virt.hardware [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.977 2 DEBUG nova.virt.libvirt.vif [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1923059675',display_name='tempest-ServersTestJSON-server-1923059675',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1923059675',id=143,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDYy+9w7bvJVR5J7fWNcydO3VDnx5LcnjPGAgITqqsdwswVtis+rcBpRPey6+U2Rdm147agicWwcfmfB4o9tNv/0rkHSQ56vechz7NDqp4/fw8ZMQghoM4Bx5Kij0C4B3g==',key_name='tempest-key-684128559',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-gmt9rv0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:24Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=1aa209df-0181-4837-8968-a256ec63b072,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "address": "fa:16:3e:5b:13:a2", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f7121e-c5", "ovs_interfaceid": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.977 2 DEBUG nova.network.os_vif_util [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "address": "fa:16:3e:5b:13:a2", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f7121e-c5", "ovs_interfaceid": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.979 2 DEBUG nova.network.os_vif_util [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:13:a2,bridge_name='br-int',has_traffic_filtering=True,id=a0f7121e-c583-4afe-887a-d9ceec5a6d1d,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f7121e-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:28 np0005466012 nova_compute[192063]: 2025-10-02 12:32:28.980 2 DEBUG nova.objects.instance [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1aa209df-0181-4837-8968-a256ec63b072 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.010 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <uuid>1aa209df-0181-4837-8968-a256ec63b072</uuid>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <name>instance-0000008f</name>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServersTestJSON-server-1923059675</nova:name>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:32:28</nova:creationTime>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:user uuid="27daa263abb54d4d8e3ae34cd1c5ccf5">tempest-ServersTestJSON-1163535506-project-member</nova:user>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:project uuid="a4a7099974504a798e1607c8e6a1f570">tempest-ServersTestJSON-1163535506</nova:project>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:port uuid="a0f7121e-c583-4afe-887a-d9ceec5a6d1d">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <entry name="serial">1aa209df-0181-4837-8968-a256ec63b072</entry>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <entry name="uuid">1aa209df-0181-4837-8968-a256ec63b072</entry>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk.config"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:5b:13:a2"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <target dev="tapa0f7121e-c5"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/console.log" append="off"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:32:29 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:32:29 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.012 2 DEBUG nova.compute.manager [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Preparing to wait for external event network-vif-plugged-a0f7121e-c583-4afe-887a-d9ceec5a6d1d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.012 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "1aa209df-0181-4837-8968-a256ec63b072-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.012 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.012 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.013 2 DEBUG nova.virt.libvirt.vif [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1923059675',display_name='tempest-ServersTestJSON-server-1923059675',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1923059675',id=143,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDYy+9w7bvJVR5J7fWNcydO3VDnx5LcnjPGAgITqqsdwswVtis+rcBpRPey6+U2Rdm147agicWwcfmfB4o9tNv/0rkHSQ56vechz7NDqp4/fw8ZMQghoM4Bx5Kij0C4B3g==',key_name='tempest-key-684128559',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-gmt9rv0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:24Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=1aa209df-0181-4837-8968-a256ec63b072,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "address": "fa:16:3e:5b:13:a2", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f7121e-c5", "ovs_interfaceid": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.013 2 DEBUG nova.network.os_vif_util [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "address": "fa:16:3e:5b:13:a2", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f7121e-c5", "ovs_interfaceid": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.014 2 DEBUG nova.network.os_vif_util [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:13:a2,bridge_name='br-int',has_traffic_filtering=True,id=a0f7121e-c583-4afe-887a-d9ceec5a6d1d,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f7121e-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.014 2 DEBUG os_vif [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:13:a2,bridge_name='br-int',has_traffic_filtering=True,id=a0f7121e-c583-4afe-887a-d9ceec5a6d1d,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f7121e-c5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.015 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.015 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.019 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa0f7121e-c5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.019 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa0f7121e-c5, col_values=(('external_ids', {'iface-id': 'a0f7121e-c583-4afe-887a-d9ceec5a6d1d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:13:a2', 'vm-uuid': '1aa209df-0181-4837-8968-a256ec63b072'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005466012 NetworkManager[51207]: <info>  [1759408349.0230] manager: (tapa0f7121e-c5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.031 2 INFO os_vif [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:13:a2,bridge_name='br-int',has_traffic_filtering=True,id=a0f7121e-c583-4afe-887a-d9ceec5a6d1d,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f7121e-c5')#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.160 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.161 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.161 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No VIF found with MAC fa:16:3e:5b:13:a2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.162 2 INFO nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Using config drive#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.605 2 INFO nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Creating config drive at /var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk.config#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.610 2 DEBUG oslo_concurrency.processutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphou2sy_f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.726 2 DEBUG nova.network.neutron [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Updating instance_info_cache with network_info: [{"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.731 2 DEBUG oslo_concurrency.processutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphou2sy_f" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.784 2 DEBUG nova.compute.manager [req-e4a4a9f2-7534-4288-a9e3-440d0a996f31 req-aee6b7ba-f5ff-403d-bb80-3613c2660507 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-changed-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.785 2 DEBUG nova.compute.manager [req-e4a4a9f2-7534-4288-a9e3-440d0a996f31 req-aee6b7ba-f5ff-403d-bb80-3613c2660507 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Refreshing instance network info cache due to event network-changed-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.786 2 DEBUG oslo_concurrency.lockutils [req-e4a4a9f2-7534-4288-a9e3-440d0a996f31 req-aee6b7ba-f5ff-403d-bb80-3613c2660507 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.787 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Releasing lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.787 2 DEBUG nova.compute.manager [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Instance network_info: |[{"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.788 2 DEBUG oslo_concurrency.lockutils [req-e4a4a9f2-7534-4288-a9e3-440d0a996f31 req-aee6b7ba-f5ff-403d-bb80-3613c2660507 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.788 2 DEBUG nova.network.neutron [req-e4a4a9f2-7534-4288-a9e3-440d0a996f31 req-aee6b7ba-f5ff-403d-bb80-3613c2660507 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Refreshing network info cache for port d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.796 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Start _get_guest_xml network_info=[{"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:32:29 np0005466012 kernel: tapa0f7121e-c5: entered promiscuous mode
Oct  2 08:32:29 np0005466012 NetworkManager[51207]: <info>  [1759408349.8073] manager: (tapa0f7121e-c5): new Tun device (/org/freedesktop/NetworkManager/Devices/253)
Oct  2 08:32:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:29Z|00556|binding|INFO|Claiming lport a0f7121e-c583-4afe-887a-d9ceec5a6d1d for this chassis.
Oct  2 08:32:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:29Z|00557|binding|INFO|a0f7121e-c583-4afe-887a-d9ceec5a6d1d: Claiming fa:16:3e:5b:13:a2 10.100.0.14
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.810 2 WARNING nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:32:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:29Z|00558|binding|INFO|Setting lport a0f7121e-c583-4afe-887a-d9ceec5a6d1d ovn-installed in OVS
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:29Z|00559|binding|INFO|Setting lport a0f7121e-c583-4afe-887a-d9ceec5a6d1d up in Southbound
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.831 2 DEBUG nova.virt.libvirt.host [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.832 2 DEBUG nova.virt.libvirt.host [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.833 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:13:a2 10.100.0.14'], port_security=['fa:16:3e:5b:13:a2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1aa209df-0181-4837-8968-a256ec63b072', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a4a7099974504a798e1607c8e6a1f570', 'neutron:revision_number': '2', 'neutron:security_group_ids': '99e51855-93ef-45a8-a4a3-2b0a8aec1882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=498d5b4e-c711-4633-9705-7db30a0fb056, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=a0f7121e-c583-4afe-887a-d9ceec5a6d1d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.836 103246 INFO neutron.agent.ovn.metadata.agent [-] Port a0f7121e-c583-4afe-887a-d9ceec5a6d1d in datapath 1acf42c5-084c-4cc4-bdc5-910eec0249e3 bound to our chassis#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.836 2 DEBUG nova.virt.libvirt.host [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.837 2 DEBUG nova.virt.libvirt.host [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.838 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.838 2 DEBUG nova.virt.hardware [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.839 2 DEBUG nova.virt.hardware [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.839 2 DEBUG nova.virt.hardware [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.839 2 DEBUG nova.virt.hardware [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.839 2 DEBUG nova.virt.hardware [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.839 2 DEBUG nova.virt.hardware [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.840 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1acf42c5-084c-4cc4-bdc5-910eec0249e3#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.840 2 DEBUG nova.virt.hardware [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.840 2 DEBUG nova.virt.hardware [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.840 2 DEBUG nova.virt.hardware [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.840 2 DEBUG nova.virt.hardware [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.841 2 DEBUG nova.virt.hardware [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:32:29 np0005466012 systemd-udevd[243507]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.845 2 DEBUG nova.virt.libvirt.vif [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-812719307',display_name='tempest-TestNetworkBasicOps-server-812719307',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-812719307',id=144,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPzOvNpPXvTmQKsObLW2frDaB/UcqyWCcFsS1qvDRXz/cgV1ShZV9OTafu7O/eewVEDnWETTAhCAil29eW434+g1I5APLU43WWEGvJbWxhqswixggtI5hb5OpFrphy6Etg==',key_name='tempest-TestNetworkBasicOps-865732227',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-kot4hpaa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:25Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=4b2b0338-e64b-41eb-8902-3d7a95c6ffb1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.846 2 DEBUG nova.network.os_vif_util [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.847 2 DEBUG nova.network.os_vif_util [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:af:b2,bridge_name='br-int',has_traffic_filtering=True,id=d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef,network=Network(85403d18-6694-4dbd-a0e0-84ca3f268b89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8ac1c56-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.848 2 DEBUG nova.objects.instance [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.858 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c4c369d2-5811-4584-a2d6-5cd2f67bc718]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005466012 NetworkManager[51207]: <info>  [1759408349.8608] device (tapa0f7121e-c5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.860 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1acf42c5-01 in ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:32:29 np0005466012 NetworkManager[51207]: <info>  [1759408349.8632] device (tapa0f7121e-c5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.864 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1acf42c5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.864 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9482f3c6-2a6a-4876-8835-01e6a93276bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005466012 systemd-machined[152114]: New machine qemu-66-instance-0000008f.
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.866 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fa2426dd-5973-4aaa-b975-8e524d4c2051]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.872 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <uuid>4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</uuid>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <name>instance-00000090</name>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestNetworkBasicOps-server-812719307</nova:name>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:32:29</nova:creationTime>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        <nova:port uuid="d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <entry name="serial">4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</entry>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <entry name="uuid">4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</entry>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk.config"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:0e:af:b2"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <target dev="tapd8ac1c56-cb"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/console.log" append="off"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:32:29 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.878 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[f3177759-da7c-4394-a635-0840d1db92ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:32:29 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:32:29 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:32:29 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.872 2 DEBUG nova.compute.manager [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Preparing to wait for external event network-vif-plugged-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.872 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.872 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.873 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.873 2 DEBUG nova.virt.libvirt.vif [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-812719307',display_name='tempest-TestNetworkBasicOps-server-812719307',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-812719307',id=144,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPzOvNpPXvTmQKsObLW2frDaB/UcqyWCcFsS1qvDRXz/cgV1ShZV9OTafu7O/eewVEDnWETTAhCAil29eW434+g1I5APLU43WWEGvJbWxhqswixggtI5hb5OpFrphy6Etg==',key_name='tempest-TestNetworkBasicOps-865732227',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-kot4hpaa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:25Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=4b2b0338-e64b-41eb-8902-3d7a95c6ffb1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.874 2 DEBUG nova.network.os_vif_util [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.878 2 DEBUG nova.network.os_vif_util [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:af:b2,bridge_name='br-int',has_traffic_filtering=True,id=d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef,network=Network(85403d18-6694-4dbd-a0e0-84ca3f268b89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8ac1c56-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.879 2 DEBUG os_vif [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:af:b2,bridge_name='br-int',has_traffic_filtering=True,id=d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef,network=Network(85403d18-6694-4dbd-a0e0-84ca3f268b89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8ac1c56-cb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:32:29 np0005466012 systemd[1]: Started Virtual Machine qemu-66-instance-0000008f.
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.883 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.883 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.886 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8ac1c56-cb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.887 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd8ac1c56-cb, col_values=(('external_ids', {'iface-id': 'd8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0e:af:b2', 'vm-uuid': '4b2b0338-e64b-41eb-8902-3d7a95c6ffb1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005466012 NetworkManager[51207]: <info>  [1759408349.8914] manager: (tapd8ac1c56-cb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.894 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8498092b-462d-42ae-86d3-58b53591153e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.896 2 INFO os_vif [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:af:b2,bridge_name='br-int',has_traffic_filtering=True,id=d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef,network=Network(85403d18-6694-4dbd-a0e0-84ca3f268b89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8ac1c56-cb')#033[00m
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.923 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[3f9e4f07-1df7-4b5e-b223-c4ba8a1a502f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005466012 NetworkManager[51207]: <info>  [1759408349.9288] manager: (tap1acf42c5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/255)
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.928 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8f47512f-7fb4-4ca6-9617-a7d5ae586cb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.957 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[48f34f2e-41fb-4f22-a7cc-af5f2f1c876d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.962 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[3100e00d-0447-46a2-850a-78817fd99257]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005466012 NetworkManager[51207]: <info>  [1759408349.9840] device (tap1acf42c5-00): carrier: link connected
Oct  2 08:32:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:29.991 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[080d6673-afda-4fe3-a5d8-e535e4f69823]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.993 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.994 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.994 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No VIF found with MAC fa:16:3e:0e:af:b2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:32:29 np0005466012 nova_compute[192063]: 2025-10-02 12:32:29.994 2 INFO nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Using config drive#033[00m
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:30.012 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cf20973c-181a-499a-a0ec-75c91761a6db]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1acf42c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:5b:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634359, 'reachable_time': 43374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243547, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:30.030 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a9874f-877c-4fcd-9df4-9c4f8fa77636]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:5bcd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 634359, 'tstamp': 634359}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243548, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:30.050 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9550fa03-0405-47c0-b6d9-46238696680e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1acf42c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:5b:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634359, 'reachable_time': 43374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243549, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:30.081 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4adc43b7-6051-4127-b349-5d2c59c98b59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:30.158 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d4e57eec-626f-4ae6-93a0-26e14991f144]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:30.159 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1acf42c5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:30.159 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:30.159 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1acf42c5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:30 np0005466012 NetworkManager[51207]: <info>  [1759408350.1620] manager: (tap1acf42c5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Oct  2 08:32:30 np0005466012 kernel: tap1acf42c5-00: entered promiscuous mode
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:30.167 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1acf42c5-00, col_values=(('external_ids', {'iface-id': 'c198cb2e-a850-46e4-8295-a2f9c280ee53'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:30Z|00560|binding|INFO|Releasing lport c198cb2e-a850-46e4-8295-a2f9c280ee53 from this chassis (sb_readonly=0)
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:30.187 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:30.188 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a2de4112-d8be-4985-8776-335e0dd28ece]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:30.189 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-1acf42c5-084c-4cc4-bdc5-910eec0249e3
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 1acf42c5-084c-4cc4-bdc5-910eec0249e3
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:32:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:30.193 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'env', 'PROCESS_TAG=haproxy-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1acf42c5-084c-4cc4-bdc5-910eec0249e3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.587 2 DEBUG nova.network.neutron [req-e43823c8-2a34-42fc-89b8-ac3b1bfec134 req-bc9a7305-2403-483a-9f44-dbc97886e0b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Updated VIF entry in instance network info cache for port a0f7121e-c583-4afe-887a-d9ceec5a6d1d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.588 2 DEBUG nova.network.neutron [req-e43823c8-2a34-42fc-89b8-ac3b1bfec134 req-bc9a7305-2403-483a-9f44-dbc97886e0b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Updating instance_info_cache with network_info: [{"id": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "address": "fa:16:3e:5b:13:a2", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f7121e-c5", "ovs_interfaceid": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:30 np0005466012 podman[243593]: 2025-10-02 12:32:30.602160924 +0000 UTC m=+0.059611056 container create e86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:32:30 np0005466012 systemd[1]: Started libpod-conmon-e86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf.scope.
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.654 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408350.6534147, 1aa209df-0181-4837-8968-a256ec63b072 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.655 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1aa209df-0181-4837-8968-a256ec63b072] VM Started (Lifecycle Event)#033[00m
Oct  2 08:32:30 np0005466012 podman[243593]: 2025-10-02 12:32:30.571458192 +0000 UTC m=+0.028908334 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:32:30 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:32:30 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ff94ca9cb6a788bd860fbee35ce495d7e256ac2973b016e3fc5fc8bc935f662/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.684 2 DEBUG oslo_concurrency.lockutils [req-e43823c8-2a34-42fc-89b8-ac3b1bfec134 req-bc9a7305-2403-483a-9f44-dbc97886e0b0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-1aa209df-0181-4837-8968-a256ec63b072" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:30 np0005466012 podman[243593]: 2025-10-02 12:32:30.68989736 +0000 UTC m=+0.147347492 container init e86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:32:30 np0005466012 podman[243593]: 2025-10-02 12:32:30.69815995 +0000 UTC m=+0.155610082 container start e86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:32:30 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243608]: [NOTICE]   (243613) : New worker (243615) forked
Oct  2 08:32:30 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243608]: [NOTICE]   (243613) : Loading success.
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.791 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.796 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408350.6546786, 1aa209df-0181-4837-8968-a256ec63b072 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.797 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1aa209df-0181-4837-8968-a256ec63b072] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.888 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.891 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.907 2 INFO nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Creating config drive at /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk.config#033[00m
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.912 2 DEBUG oslo_concurrency.processutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplhh5zb_l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:30 np0005466012 nova_compute[192063]: 2025-10-02 12:32:30.932 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1aa209df-0181-4837-8968-a256ec63b072] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:32:31 np0005466012 nova_compute[192063]: 2025-10-02 12:32:31.038 2 DEBUG oslo_concurrency.processutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplhh5zb_l" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:31 np0005466012 NetworkManager[51207]: <info>  [1759408351.1172] manager: (tapd8ac1c56-cb): new Tun device (/org/freedesktop/NetworkManager/Devices/257)
Oct  2 08:32:31 np0005466012 kernel: tapd8ac1c56-cb: entered promiscuous mode
Oct  2 08:32:31 np0005466012 nova_compute[192063]: 2025-10-02 12:32:31.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:31Z|00561|binding|INFO|Claiming lport d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef for this chassis.
Oct  2 08:32:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:31Z|00562|binding|INFO|d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef: Claiming fa:16:3e:0e:af:b2 10.100.0.12
Oct  2 08:32:31 np0005466012 nova_compute[192063]: 2025-10-02 12:32:31.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.136 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:af:b2 10.100.0.12'], port_security=['fa:16:3e:0e:af:b2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4b2b0338-e64b-41eb-8902-3d7a95c6ffb1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85403d18-6694-4dbd-a0e0-84ca3f268b89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b5ef964e-9316-47d6-a3f7-a6731e9e6be2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd8169e5-966f-4f69-9e79-03a4ed7aea2e, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:32:31 np0005466012 NetworkManager[51207]: <info>  [1759408351.1382] device (tapd8ac1c56-cb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:32:31 np0005466012 NetworkManager[51207]: <info>  [1759408351.1389] device (tapd8ac1c56-cb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.141 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef in datapath 85403d18-6694-4dbd-a0e0-84ca3f268b89 bound to our chassis#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.145 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 85403d18-6694-4dbd-a0e0-84ca3f268b89#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.164 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[97028db5-d9c5-45a8-b129-63c645f6c959]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.166 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap85403d18-61 in ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.169 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap85403d18-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.170 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7800df4c-94c8-496f-8c10-1d168342e45d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.171 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2cf03eec-926d-40b5-b807-9adae65d618c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 systemd-machined[152114]: New machine qemu-67-instance-00000090.
Oct  2 08:32:31 np0005466012 nova_compute[192063]: 2025-10-02 12:32:31.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.185 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[094a09a0-5130-4138-b3f9-8bcc98716d82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:31Z|00563|binding|INFO|Setting lport d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef ovn-installed in OVS
Oct  2 08:32:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:31Z|00564|binding|INFO|Setting lport d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef up in Southbound
Oct  2 08:32:31 np0005466012 nova_compute[192063]: 2025-10-02 12:32:31.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:31 np0005466012 systemd[1]: Started Virtual Machine qemu-67-instance-00000090.
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.217 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cba462f9-2926-4ea5-917f-250ee46a4b18]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 nova_compute[192063]: 2025-10-02 12:32:31.245 2 DEBUG nova.network.neutron [req-e4a4a9f2-7534-4288-a9e3-440d0a996f31 req-aee6b7ba-f5ff-403d-bb80-3613c2660507 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Updated VIF entry in instance network info cache for port d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:32:31 np0005466012 nova_compute[192063]: 2025-10-02 12:32:31.245 2 DEBUG nova.network.neutron [req-e4a4a9f2-7534-4288-a9e3-440d0a996f31 req-aee6b7ba-f5ff-403d-bb80-3613c2660507 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Updating instance_info_cache with network_info: [{"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.259 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[c862a91c-6b3d-4cb4-8598-73d558ee3c3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 nova_compute[192063]: 2025-10-02 12:32:31.265 2 DEBUG oslo_concurrency.lockutils [req-e4a4a9f2-7534-4288-a9e3-440d0a996f31 req-aee6b7ba-f5ff-403d-bb80-3613c2660507 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:31 np0005466012 NetworkManager[51207]: <info>  [1759408351.2682] manager: (tap85403d18-60): new Veth device (/org/freedesktop/NetworkManager/Devices/258)
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.267 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5e5e9db7-8c66-43f5-b2b1-1a94b5d09bf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.309 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[28b57b46-8814-4e82-b83f-42140b47b7af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.314 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[5189b436-6a4f-4d6c-a528-e559b1f14a91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 NetworkManager[51207]: <info>  [1759408351.3382] device (tap85403d18-60): carrier: link connected
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.345 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[0180ca83-4405-41ed-959d-da4d2aee6810]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.365 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[215d08b4-eda6-4118-bc39-132e7db4149d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85403d18-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:8c:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 173], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634495, 'reachable_time': 34711, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243659, 'error': None, 'target': 'ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.386 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f1eda766-f5be-4a0c-928d-16be25d0553e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:8ccc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 634495, 'tstamp': 634495}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243660, 'error': None, 'target': 'ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.407 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a7b81678-a66f-4ce8-891d-71a4c61d67d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85403d18-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:8c:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 173], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634495, 'reachable_time': 34711, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243661, 'error': None, 'target': 'ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.447 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[08eae47b-fd2e-4890-924a-7e748cabd1cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.525 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[754edabb-c522-408e-b1c4-ce3a14be6ad3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.526 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85403d18-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.526 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.527 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85403d18-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:31 np0005466012 kernel: tap85403d18-60: entered promiscuous mode
Oct  2 08:32:31 np0005466012 nova_compute[192063]: 2025-10-02 12:32:31.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:31 np0005466012 NetworkManager[51207]: <info>  [1759408351.5295] manager: (tap85403d18-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/259)
Oct  2 08:32:31 np0005466012 nova_compute[192063]: 2025-10-02 12:32:31.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.533 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap85403d18-60, col_values=(('external_ids', {'iface-id': 'f289bd59-801e-4956-8d1d-588879a7fa08'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:31 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:31Z|00565|binding|INFO|Releasing lport f289bd59-801e-4956-8d1d-588879a7fa08 from this chassis (sb_readonly=0)
Oct  2 08:32:31 np0005466012 nova_compute[192063]: 2025-10-02 12:32:31.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:31 np0005466012 nova_compute[192063]: 2025-10-02 12:32:31.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.549 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/85403d18-6694-4dbd-a0e0-84ca3f268b89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/85403d18-6694-4dbd-a0e0-84ca3f268b89.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.551 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cc472dac-9ea7-4359-915f-bc8d30a2a7b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.553 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-85403d18-6694-4dbd-a0e0-84ca3f268b89
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/85403d18-6694-4dbd-a0e0-84ca3f268b89.pid.haproxy
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 85403d18-6694-4dbd-a0e0-84ca3f268b89
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:32:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:31.554 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89', 'env', 'PROCESS_TAG=haproxy-85403d18-6694-4dbd-a0e0-84ca3f268b89', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/85403d18-6694-4dbd-a0e0-84ca3f268b89.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:32:31 np0005466012 podman[243701]: 2025-10-02 12:32:31.949629691 +0000 UTC m=+0.058575387 container create cafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  2 08:32:31 np0005466012 systemd[1]: Started libpod-conmon-cafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03.scope.
Oct  2 08:32:32 np0005466012 podman[243701]: 2025-10-02 12:32:31.913601692 +0000 UTC m=+0.022547398 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:32:32 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:32:32 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a04e56565efa1d444148fb9906b1a4afa5bb3f749734ad3acab5bcfd7a546b8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:32:32 np0005466012 nova_compute[192063]: 2025-10-02 12:32:32.036 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408352.0360074, 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:32 np0005466012 nova_compute[192063]: 2025-10-02 12:32:32.038 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] VM Started (Lifecycle Event)#033[00m
Oct  2 08:32:32 np0005466012 podman[243701]: 2025-10-02 12:32:32.045519414 +0000 UTC m=+0.154465110 container init cafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:32:32 np0005466012 podman[243701]: 2025-10-02 12:32:32.052318292 +0000 UTC m=+0.161263978 container start cafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  2 08:32:32 np0005466012 neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89[243716]: [NOTICE]   (243720) : New worker (243722) forked
Oct  2 08:32:32 np0005466012 neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89[243716]: [NOTICE]   (243720) : Loading success.
Oct  2 08:32:32 np0005466012 nova_compute[192063]: 2025-10-02 12:32:32.082 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:32 np0005466012 nova_compute[192063]: 2025-10-02 12:32:32.086 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408352.0363858, 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:32 np0005466012 nova_compute[192063]: 2025-10-02 12:32:32.087 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:32:32 np0005466012 nova_compute[192063]: 2025-10-02 12:32:32.113 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:32 np0005466012 nova_compute[192063]: 2025-10-02 12:32:32.118 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:32:32 np0005466012 nova_compute[192063]: 2025-10-02 12:32:32.147 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:32:32 np0005466012 nova_compute[192063]: 2025-10-02 12:32:32.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:32 np0005466012 nova_compute[192063]: 2025-10-02 12:32:32.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:33 np0005466012 nova_compute[192063]: 2025-10-02 12:32:33.388 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408338.3868754, e2d02092-9a5f-4575-875b-f7eba9b563db => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:33 np0005466012 nova_compute[192063]: 2025-10-02 12:32:33.388 2 INFO nova.compute.manager [-] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:32:33 np0005466012 nova_compute[192063]: 2025-10-02 12:32:33.460 2 DEBUG nova.compute.manager [None req-20e2b773-74c0-41f3-ad0a-89fd4809aed8 - - - - - -] [instance: e2d02092-9a5f-4575-875b-f7eba9b563db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:33 np0005466012 nova_compute[192063]: 2025-10-02 12:32:33.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:33 np0005466012 nova_compute[192063]: 2025-10-02 12:32:33.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:34 np0005466012 nova_compute[192063]: 2025-10-02 12:32:34.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.524 2 DEBUG nova.compute.manager [req-c93a28c8-5582-4bbc-983a-4c4b7c4b3244 req-6f8956b9-6d09-4447-87b4-6399de03127c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Received event network-vif-plugged-a0f7121e-c583-4afe-887a-d9ceec5a6d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.525 2 DEBUG oslo_concurrency.lockutils [req-c93a28c8-5582-4bbc-983a-4c4b7c4b3244 req-6f8956b9-6d09-4447-87b4-6399de03127c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1aa209df-0181-4837-8968-a256ec63b072-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.525 2 DEBUG oslo_concurrency.lockutils [req-c93a28c8-5582-4bbc-983a-4c4b7c4b3244 req-6f8956b9-6d09-4447-87b4-6399de03127c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.525 2 DEBUG oslo_concurrency.lockutils [req-c93a28c8-5582-4bbc-983a-4c4b7c4b3244 req-6f8956b9-6d09-4447-87b4-6399de03127c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.525 2 DEBUG nova.compute.manager [req-c93a28c8-5582-4bbc-983a-4c4b7c4b3244 req-6f8956b9-6d09-4447-87b4-6399de03127c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Processing event network-vif-plugged-a0f7121e-c583-4afe-887a-d9ceec5a6d1d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.526 2 DEBUG nova.compute.manager [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.530 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408355.5296795, 1aa209df-0181-4837-8968-a256ec63b072 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.530 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1aa209df-0181-4837-8968-a256ec63b072] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.533 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.537 2 INFO nova.virt.libvirt.driver [-] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Instance spawned successfully.#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.537 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.891 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.899 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.902 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.903 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.903 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.904 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.904 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.905 2 DEBUG nova.virt.libvirt.driver [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:35 np0005466012 nova_compute[192063]: 2025-10-02 12:32:35.979 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1aa209df-0181-4837-8968-a256ec63b072] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:32:36 np0005466012 nova_compute[192063]: 2025-10-02 12:32:36.289 2 INFO nova.compute.manager [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Took 11.38 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:32:36 np0005466012 nova_compute[192063]: 2025-10-02 12:32:36.290 2 DEBUG nova.compute.manager [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:36 np0005466012 nova_compute[192063]: 2025-10-02 12:32:36.637 2 INFO nova.compute.manager [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Took 12.32 seconds to build instance.#033[00m
Oct  2 08:32:36 np0005466012 nova_compute[192063]: 2025-10-02 12:32:36.709 2 DEBUG oslo_concurrency.lockutils [None req-ae87e6c1-e2b8-4ecb-bb1c-6453943e702f 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.519s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.661 2 DEBUG nova.compute.manager [req-f840f7a7-ff47-4fba-ab55-2ff82d495f3c req-7f32eb4d-66ef-45ec-9afc-a62d1506d9ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Received event network-vif-plugged-a0f7121e-c583-4afe-887a-d9ceec5a6d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.662 2 DEBUG oslo_concurrency.lockutils [req-f840f7a7-ff47-4fba-ab55-2ff82d495f3c req-7f32eb4d-66ef-45ec-9afc-a62d1506d9ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1aa209df-0181-4837-8968-a256ec63b072-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.662 2 DEBUG oslo_concurrency.lockutils [req-f840f7a7-ff47-4fba-ab55-2ff82d495f3c req-7f32eb4d-66ef-45ec-9afc-a62d1506d9ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.662 2 DEBUG oslo_concurrency.lockutils [req-f840f7a7-ff47-4fba-ab55-2ff82d495f3c req-7f32eb4d-66ef-45ec-9afc-a62d1506d9ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.662 2 DEBUG nova.compute.manager [req-f840f7a7-ff47-4fba-ab55-2ff82d495f3c req-7f32eb4d-66ef-45ec-9afc-a62d1506d9ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] No waiting events found dispatching network-vif-plugged-a0f7121e-c583-4afe-887a-d9ceec5a6d1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.663 2 WARNING nova.compute.manager [req-f840f7a7-ff47-4fba-ab55-2ff82d495f3c req-7f32eb4d-66ef-45ec-9afc-a62d1506d9ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Received unexpected event network-vif-plugged-a0f7121e-c583-4afe-887a-d9ceec5a6d1d for instance with vm_state active and task_state None.#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.663 2 DEBUG nova.compute.manager [req-f840f7a7-ff47-4fba-ab55-2ff82d495f3c req-7f32eb4d-66ef-45ec-9afc-a62d1506d9ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-vif-plugged-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.663 2 DEBUG oslo_concurrency.lockutils [req-f840f7a7-ff47-4fba-ab55-2ff82d495f3c req-7f32eb4d-66ef-45ec-9afc-a62d1506d9ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.664 2 DEBUG oslo_concurrency.lockutils [req-f840f7a7-ff47-4fba-ab55-2ff82d495f3c req-7f32eb4d-66ef-45ec-9afc-a62d1506d9ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.664 2 DEBUG oslo_concurrency.lockutils [req-f840f7a7-ff47-4fba-ab55-2ff82d495f3c req-7f32eb4d-66ef-45ec-9afc-a62d1506d9ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.664 2 DEBUG nova.compute.manager [req-f840f7a7-ff47-4fba-ab55-2ff82d495f3c req-7f32eb4d-66ef-45ec-9afc-a62d1506d9ac 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Processing event network-vif-plugged-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.665 2 DEBUG nova.compute.manager [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.677 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.678 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408357.6775255, 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.678 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.684 2 INFO nova.virt.libvirt.driver [-] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Instance spawned successfully.#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.685 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.759 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.762 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.763 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.764 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.764 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.765 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.765 2 DEBUG nova.virt.libvirt.driver [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.769 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.834 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.967 2 INFO nova.compute.manager [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Took 12.74 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:32:37 np0005466012 nova_compute[192063]: 2025-10-02 12:32:37.968 2 DEBUG nova.compute.manager [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:38 np0005466012 nova_compute[192063]: 2025-10-02 12:32:38.182 2 INFO nova.compute.manager [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Took 13.78 seconds to build instance.#033[00m
Oct  2 08:32:38 np0005466012 nova_compute[192063]: 2025-10-02 12:32:38.235 2 DEBUG oslo_concurrency.lockutils [None req-563f3bda-509a-4a42-a474-ae5be22080d6 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.952s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:38 np0005466012 nova_compute[192063]: 2025-10-02 12:32:38.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:38 np0005466012 nova_compute[192063]: 2025-10-02 12:32:38.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:39 np0005466012 nova_compute[192063]: 2025-10-02 12:32:39.832 2 DEBUG nova.compute.manager [req-897006d1-129d-43e2-bcba-7532731c7689 req-97cc59ee-c84c-4b66-8674-9720811fa006 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-vif-plugged-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:39 np0005466012 nova_compute[192063]: 2025-10-02 12:32:39.833 2 DEBUG oslo_concurrency.lockutils [req-897006d1-129d-43e2-bcba-7532731c7689 req-97cc59ee-c84c-4b66-8674-9720811fa006 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:39 np0005466012 nova_compute[192063]: 2025-10-02 12:32:39.834 2 DEBUG oslo_concurrency.lockutils [req-897006d1-129d-43e2-bcba-7532731c7689 req-97cc59ee-c84c-4b66-8674-9720811fa006 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:39 np0005466012 nova_compute[192063]: 2025-10-02 12:32:39.834 2 DEBUG oslo_concurrency.lockutils [req-897006d1-129d-43e2-bcba-7532731c7689 req-97cc59ee-c84c-4b66-8674-9720811fa006 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:39 np0005466012 nova_compute[192063]: 2025-10-02 12:32:39.834 2 DEBUG nova.compute.manager [req-897006d1-129d-43e2-bcba-7532731c7689 req-97cc59ee-c84c-4b66-8674-9720811fa006 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] No waiting events found dispatching network-vif-plugged-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:32:39 np0005466012 nova_compute[192063]: 2025-10-02 12:32:39.834 2 WARNING nova.compute.manager [req-897006d1-129d-43e2-bcba-7532731c7689 req-97cc59ee-c84c-4b66-8674-9720811fa006 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received unexpected event network-vif-plugged-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef for instance with vm_state active and task_state None.#033[00m
Oct  2 08:32:39 np0005466012 nova_compute[192063]: 2025-10-02 12:32:39.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:40 np0005466012 nova_compute[192063]: 2025-10-02 12:32:40.741 2 DEBUG oslo_concurrency.lockutils [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "1aa209df-0181-4837-8968-a256ec63b072" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:40 np0005466012 nova_compute[192063]: 2025-10-02 12:32:40.742 2 DEBUG oslo_concurrency.lockutils [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:40 np0005466012 nova_compute[192063]: 2025-10-02 12:32:40.742 2 DEBUG oslo_concurrency.lockutils [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "1aa209df-0181-4837-8968-a256ec63b072-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:40 np0005466012 nova_compute[192063]: 2025-10-02 12:32:40.742 2 DEBUG oslo_concurrency.lockutils [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:40 np0005466012 nova_compute[192063]: 2025-10-02 12:32:40.743 2 DEBUG oslo_concurrency.lockutils [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:40 np0005466012 nova_compute[192063]: 2025-10-02 12:32:40.892 2 INFO nova.compute.manager [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Terminating instance#033[00m
Oct  2 08:32:40 np0005466012 nova_compute[192063]: 2025-10-02 12:32:40.958 2 DEBUG nova.compute.manager [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:32:40 np0005466012 kernel: tapa0f7121e-c5 (unregistering): left promiscuous mode
Oct  2 08:32:40 np0005466012 NetworkManager[51207]: <info>  [1759408360.9852] device (tapa0f7121e-c5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:32:41 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:41Z|00566|binding|INFO|Releasing lport a0f7121e-c583-4afe-887a-d9ceec5a6d1d from this chassis (sb_readonly=0)
Oct  2 08:32:41 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:41Z|00567|binding|INFO|Setting lport a0f7121e-c583-4afe-887a-d9ceec5a6d1d down in Southbound
Oct  2 08:32:41 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:41Z|00568|binding|INFO|Removing iface tapa0f7121e-c5 ovn-installed in OVS
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.033 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:13:a2 10.100.0.14'], port_security=['fa:16:3e:5b:13:a2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1aa209df-0181-4837-8968-a256ec63b072', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a4a7099974504a798e1607c8e6a1f570', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99e51855-93ef-45a8-a4a3-2b0a8aec1882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=498d5b4e-c711-4633-9705-7db30a0fb056, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=a0f7121e-c583-4afe-887a-d9ceec5a6d1d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.035 103246 INFO neutron.agent.ovn.metadata.agent [-] Port a0f7121e-c583-4afe-887a-d9ceec5a6d1d in datapath 1acf42c5-084c-4cc4-bdc5-910eec0249e3 unbound from our chassis#033[00m
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.037 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1acf42c5-084c-4cc4-bdc5-910eec0249e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.038 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[40bd2bfb-f21e-4b0a-9ad1-cc7be79a8bdd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.038 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 namespace which is not needed anymore#033[00m
Oct  2 08:32:41 np0005466012 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008f.scope: Deactivated successfully.
Oct  2 08:32:41 np0005466012 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008f.scope: Consumed 6.181s CPU time.
Oct  2 08:32:41 np0005466012 systemd-machined[152114]: Machine qemu-66-instance-0000008f terminated.
Oct  2 08:32:41 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243608]: [NOTICE]   (243613) : haproxy version is 2.8.14-c23fe91
Oct  2 08:32:41 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243608]: [NOTICE]   (243613) : path to executable is /usr/sbin/haproxy
Oct  2 08:32:41 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243608]: [WARNING]  (243613) : Exiting Master process...
Oct  2 08:32:41 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243608]: [ALERT]    (243613) : Current worker (243615) exited with code 143 (Terminated)
Oct  2 08:32:41 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[243608]: [WARNING]  (243613) : All workers exited. Exiting... (0)
Oct  2 08:32:41 np0005466012 systemd[1]: libpod-e86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf.scope: Deactivated successfully.
Oct  2 08:32:41 np0005466012 podman[243758]: 2025-10-02 12:32:41.223118474 +0000 UTC m=+0.060967794 container died e86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.230 2 INFO nova.virt.libvirt.driver [-] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Instance destroyed successfully.#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.231 2 DEBUG nova.objects.instance [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'resources' on Instance uuid 1aa209df-0181-4837-8968-a256ec63b072 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:41 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf-userdata-shm.mount: Deactivated successfully.
Oct  2 08:32:41 np0005466012 systemd[1]: var-lib-containers-storage-overlay-9ff94ca9cb6a788bd860fbee35ce495d7e256ac2973b016e3fc5fc8bc935f662-merged.mount: Deactivated successfully.
Oct  2 08:32:41 np0005466012 podman[243758]: 2025-10-02 12:32:41.278174132 +0000 UTC m=+0.116023432 container cleanup e86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.287 2 DEBUG nova.virt.libvirt.vif [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1923059675',display_name='tempest-ServersTestJSON-server-1923059675',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1923059675',id=143,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDYy+9w7bvJVR5J7fWNcydO3VDnx5LcnjPGAgITqqsdwswVtis+rcBpRPey6+U2Rdm147agicWwcfmfB4o9tNv/0rkHSQ56vechz7NDqp4/fw8ZMQghoM4Bx5Kij0C4B3g==',key_name='tempest-key-684128559',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-gmt9rv0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:32:36Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=1aa209df-0181-4837-8968-a256ec63b072,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "address": "fa:16:3e:5b:13:a2", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f7121e-c5", "ovs_interfaceid": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.288 2 DEBUG nova.network.os_vif_util [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "address": "fa:16:3e:5b:13:a2", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f7121e-c5", "ovs_interfaceid": "a0f7121e-c583-4afe-887a-d9ceec5a6d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.289 2 DEBUG nova.network.os_vif_util [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:13:a2,bridge_name='br-int',has_traffic_filtering=True,id=a0f7121e-c583-4afe-887a-d9ceec5a6d1d,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f7121e-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.289 2 DEBUG os_vif [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:13:a2,bridge_name='br-int',has_traffic_filtering=True,id=a0f7121e-c583-4afe-887a-d9ceec5a6d1d,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f7121e-c5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.291 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0f7121e-c5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:41 np0005466012 systemd[1]: libpod-conmon-e86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf.scope: Deactivated successfully.
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.298 2 INFO os_vif [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:13:a2,bridge_name='br-int',has_traffic_filtering=True,id=a0f7121e-c583-4afe-887a-d9ceec5a6d1d,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f7121e-c5')#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.299 2 INFO nova.virt.libvirt.driver [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Deleting instance files /var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072_del#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.300 2 INFO nova.virt.libvirt.driver [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Deletion of /var/lib/nova/instances/1aa209df-0181-4837-8968-a256ec63b072_del complete#033[00m
Oct  2 08:32:41 np0005466012 podman[243787]: 2025-10-02 12:32:41.319012676 +0000 UTC m=+0.068946235 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:32:41 np0005466012 podman[243824]: 2025-10-02 12:32:41.359826379 +0000 UTC m=+0.053022003 container remove e86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.364 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[276296f4-7083-487c-b795-295478d525f6]: (4, ('Thu Oct  2 12:32:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 (e86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf)\ne86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf\nThu Oct  2 12:32:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 (e86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf)\ne86f19f3def9ab950a518d841bccb1378469176631d90229c6284d0960082faf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.366 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[84e0d047-126f-4e12-bdaf-3cd01b7dd13b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.367 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1acf42c5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:41 np0005466012 kernel: tap1acf42c5-00: left promiscuous mode
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.381 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9ea69180-de3f-4bcb-a486-2a4aae3ffe02]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:41 np0005466012 podman[243794]: 2025-10-02 12:32:41.398806311 +0000 UTC m=+0.144015439 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.415 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c692b475-dae6-4c4b-9ae3-c3e4d6ced690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.416 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9184b90a-a5eb-4ff3-8f27-30fa9b093fa5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.434 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6d54b89f-7501-4c76-9de4-51269f74de8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634353, 'reachable_time': 22462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243861, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:41 np0005466012 systemd[1]: run-netns-ovnmeta\x2d1acf42c5\x2d084c\x2d4cc4\x2dbdc5\x2d910eec0249e3.mount: Deactivated successfully.
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.436 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:32:41 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:41.436 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[d6014ce0-6e85-4282-a50b-43d14b69d39d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.563 2 INFO nova.compute.manager [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Took 0.60 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.563 2 DEBUG oslo.service.loopingcall [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.564 2 DEBUG nova.compute.manager [-] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.564 2 DEBUG nova.network.neutron [-] [instance: 1aa209df-0181-4837-8968-a256ec63b072] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:32:41 np0005466012 nova_compute[192063]: 2025-10-02 12:32:41.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.049 2 DEBUG nova.compute.manager [req-3705c5a0-3965-480b-9ac2-9184a9c28271 req-e00dfbb7-16ce-490c-990a-5e6790a884e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Received event network-vif-unplugged-a0f7121e-c583-4afe-887a-d9ceec5a6d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.050 2 DEBUG oslo_concurrency.lockutils [req-3705c5a0-3965-480b-9ac2-9184a9c28271 req-e00dfbb7-16ce-490c-990a-5e6790a884e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1aa209df-0181-4837-8968-a256ec63b072-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.051 2 DEBUG oslo_concurrency.lockutils [req-3705c5a0-3965-480b-9ac2-9184a9c28271 req-e00dfbb7-16ce-490c-990a-5e6790a884e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.051 2 DEBUG oslo_concurrency.lockutils [req-3705c5a0-3965-480b-9ac2-9184a9c28271 req-e00dfbb7-16ce-490c-990a-5e6790a884e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.052 2 DEBUG nova.compute.manager [req-3705c5a0-3965-480b-9ac2-9184a9c28271 req-e00dfbb7-16ce-490c-990a-5e6790a884e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] No waiting events found dispatching network-vif-unplugged-a0f7121e-c583-4afe-887a-d9ceec5a6d1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.052 2 DEBUG nova.compute.manager [req-3705c5a0-3965-480b-9ac2-9184a9c28271 req-e00dfbb7-16ce-490c-990a-5e6790a884e8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Received event network-vif-unplugged-a0f7121e-c583-4afe-887a-d9ceec5a6d1d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:42 np0005466012 NetworkManager[51207]: <info>  [1759408362.4165] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/260)
Oct  2 08:32:42 np0005466012 NetworkManager[51207]: <info>  [1759408362.4176] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:42 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:42Z|00569|binding|INFO|Releasing lport f289bd59-801e-4956-8d1d-588879a7fa08 from this chassis (sb_readonly=0)
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.837 2 DEBUG nova.network.neutron [-] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.874 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.875 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.875 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.876 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:32:42 np0005466012 nova_compute[192063]: 2025-10-02 12:32:42.913 2 INFO nova.compute.manager [-] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Took 1.35 seconds to deallocate network for instance.#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.075 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.141 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.143 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.203 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.215 2 DEBUG oslo_concurrency.lockutils [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.216 2 DEBUG oslo_concurrency.lockutils [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.330 2 DEBUG nova.compute.provider_tree [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.348 2 DEBUG nova.scheduler.client.report [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.372 2 DEBUG oslo_concurrency.lockutils [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.406 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.408 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5550MB free_disk=73.24234771728516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.408 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.409 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.413 2 INFO nova.scheduler.client.report [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Deleted allocations for instance 1aa209df-0181-4837-8968-a256ec63b072#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.736 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.737 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.737 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.741 2 DEBUG oslo_concurrency.lockutils [None req-0f291407-32c5-401b-9814-40284605246d 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.999s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.749 2 DEBUG nova.compute.manager [req-87143fbb-6885-462f-a8ed-59d7f0bd3c17 req-26c2845f-4840-4f47-91e2-82ab7d4eb062 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Received event network-vif-deleted-a0f7121e-c583-4afe-887a-d9ceec5a6d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.810 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.834 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.858 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:32:43 np0005466012 nova_compute[192063]: 2025-10-02 12:32:43.859 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.450s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:44 np0005466012 podman[243872]: 2025-10-02 12:32:44.141578454 +0000 UTC m=+0.054019122 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:32:44 np0005466012 podman[243871]: 2025-10-02 12:32:44.157405412 +0000 UTC m=+0.069585903 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.189 2 DEBUG nova.compute.manager [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Received event network-vif-plugged-a0f7121e-c583-4afe-887a-d9ceec5a6d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.190 2 DEBUG oslo_concurrency.lockutils [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1aa209df-0181-4837-8968-a256ec63b072-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.190 2 DEBUG oslo_concurrency.lockutils [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.191 2 DEBUG oslo_concurrency.lockutils [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1aa209df-0181-4837-8968-a256ec63b072-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.191 2 DEBUG nova.compute.manager [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] No waiting events found dispatching network-vif-plugged-a0f7121e-c583-4afe-887a-d9ceec5a6d1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.192 2 WARNING nova.compute.manager [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Received unexpected event network-vif-plugged-a0f7121e-c583-4afe-887a-d9ceec5a6d1d for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.192 2 DEBUG nova.compute.manager [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-changed-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.192 2 DEBUG nova.compute.manager [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Refreshing instance network info cache due to event network-changed-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.193 2 DEBUG oslo_concurrency.lockutils [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.193 2 DEBUG oslo_concurrency.lockutils [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.193 2 DEBUG nova.network.neutron [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Refreshing network info cache for port d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.860 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.861 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.880 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:32:44 np0005466012 nova_compute[192063]: 2025-10-02 12:32:44.882 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:32:45 np0005466012 nova_compute[192063]: 2025-10-02 12:32:45.913 2 DEBUG nova.network.neutron [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Updated VIF entry in instance network info cache for port d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:32:45 np0005466012 nova_compute[192063]: 2025-10-02 12:32:45.914 2 DEBUG nova.network.neutron [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Updating instance_info_cache with network_info: [{"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:45 np0005466012 nova_compute[192063]: 2025-10-02 12:32:45.940 2 DEBUG oslo_concurrency.lockutils [req-0365d337-b3ac-4d2b-b19b-0c86a7ec92de req-24b5a3e3-c032-40c6-8fdd-2a1a033de32b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:46 np0005466012 nova_compute[192063]: 2025-10-02 12:32:46.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.456 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "9e9408e2-5973-4e12-b904-711ea96bfbb2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.458 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.481 2 DEBUG nova.compute.manager [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.598 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.599 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.607 2 DEBUG nova.virt.hardware [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.607 2 INFO nova.compute.claims [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.791 2 DEBUG nova.compute.provider_tree [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.807 2 DEBUG nova.scheduler.client.report [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.827 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.828 2 DEBUG nova.compute.manager [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.892 2 DEBUG nova.compute.manager [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.893 2 DEBUG nova.network.neutron [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.919 2 INFO nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:32:47 np0005466012 nova_compute[192063]: 2025-10-02 12:32:47.939 2 DEBUG nova.compute.manager [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.078 2 DEBUG nova.compute.manager [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.079 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.080 2 INFO nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Creating image(s)#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.081 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "/var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.081 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "/var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.082 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "/var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.103 2 DEBUG nova.policy [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.107 2 DEBUG oslo_concurrency.processutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.192 2 DEBUG oslo_concurrency.processutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.193 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.194 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.205 2 DEBUG oslo_concurrency.processutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.271 2 DEBUG oslo_concurrency.processutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.272 2 DEBUG oslo_concurrency.processutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.335 2 DEBUG oslo_concurrency.processutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk 1073741824" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.337 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.337 2 DEBUG oslo_concurrency.processutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.401 2 DEBUG oslo_concurrency.processutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.402 2 DEBUG nova.virt.disk.api [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Checking if we can resize image /var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.402 2 DEBUG oslo_concurrency.processutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.471 2 DEBUG oslo_concurrency.processutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.473 2 DEBUG nova.virt.disk.api [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Cannot resize image /var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.473 2 DEBUG nova.objects.instance [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'migration_context' on Instance uuid 9e9408e2-5973-4e12-b904-711ea96bfbb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.762 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.763 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Ensure instance console log exists: /var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.763 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.764 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:48 np0005466012 nova_compute[192063]: 2025-10-02 12:32:48.765 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:49 np0005466012 nova_compute[192063]: 2025-10-02 12:32:49.196 2 DEBUG nova.network.neutron [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Successfully created port: 16f2cb9f-304c-467e-aee4-6fb2ab771415 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:32:49 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:49Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0e:af:b2 10.100.0.12
Oct  2 08:32:49 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:49Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0e:af:b2 10.100.0.12
Oct  2 08:32:50 np0005466012 nova_compute[192063]: 2025-10-02 12:32:50.843 2 DEBUG nova.network.neutron [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Successfully updated port: 16f2cb9f-304c-467e-aee4-6fb2ab771415 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:32:50 np0005466012 nova_compute[192063]: 2025-10-02 12:32:50.876 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "refresh_cache-9e9408e2-5973-4e12-b904-711ea96bfbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:50 np0005466012 nova_compute[192063]: 2025-10-02 12:32:50.877 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquired lock "refresh_cache-9e9408e2-5973-4e12-b904-711ea96bfbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:50 np0005466012 nova_compute[192063]: 2025-10-02 12:32:50.877 2 DEBUG nova.network.neutron [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:32:50 np0005466012 nova_compute[192063]: 2025-10-02 12:32:50.947 2 DEBUG nova.compute.manager [req-3eabf721-9506-40d0-965c-06e38e142a8b req-5454e6b2-5563-42bf-a381-aef16b457bfe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Received event network-changed-16f2cb9f-304c-467e-aee4-6fb2ab771415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:50 np0005466012 nova_compute[192063]: 2025-10-02 12:32:50.948 2 DEBUG nova.compute.manager [req-3eabf721-9506-40d0-965c-06e38e142a8b req-5454e6b2-5563-42bf-a381-aef16b457bfe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Refreshing instance network info cache due to event network-changed-16f2cb9f-304c-467e-aee4-6fb2ab771415. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:32:50 np0005466012 nova_compute[192063]: 2025-10-02 12:32:50.948 2 DEBUG oslo_concurrency.lockutils [req-3eabf721-9506-40d0-965c-06e38e142a8b req-5454e6b2-5563-42bf-a381-aef16b457bfe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-9e9408e2-5973-4e12-b904-711ea96bfbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:32:51 np0005466012 nova_compute[192063]: 2025-10-02 12:32:51.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:52 np0005466012 nova_compute[192063]: 2025-10-02 12:32:52.056 2 DEBUG nova.network.neutron [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:32:52 np0005466012 nova_compute[192063]: 2025-10-02 12:32:52.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.904 2 DEBUG nova.network.neutron [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Updating instance_info_cache with network_info: [{"id": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "address": "fa:16:3e:2a:5a:b4", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16f2cb9f-30", "ovs_interfaceid": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.926 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Releasing lock "refresh_cache-9e9408e2-5973-4e12-b904-711ea96bfbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.927 2 DEBUG nova.compute.manager [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Instance network_info: |[{"id": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "address": "fa:16:3e:2a:5a:b4", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16f2cb9f-30", "ovs_interfaceid": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.928 2 DEBUG oslo_concurrency.lockutils [req-3eabf721-9506-40d0-965c-06e38e142a8b req-5454e6b2-5563-42bf-a381-aef16b457bfe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-9e9408e2-5973-4e12-b904-711ea96bfbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.928 2 DEBUG nova.network.neutron [req-3eabf721-9506-40d0-965c-06e38e142a8b req-5454e6b2-5563-42bf-a381-aef16b457bfe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Refreshing network info cache for port 16f2cb9f-304c-467e-aee4-6fb2ab771415 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.931 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Start _get_guest_xml network_info=[{"id": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "address": "fa:16:3e:2a:5a:b4", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16f2cb9f-30", "ovs_interfaceid": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.936 2 WARNING nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.941 2 DEBUG nova.virt.libvirt.host [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.942 2 DEBUG nova.virt.libvirt.host [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.945 2 DEBUG nova.virt.libvirt.host [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.946 2 DEBUG nova.virt.libvirt.host [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.947 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.947 2 DEBUG nova.virt.hardware [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.948 2 DEBUG nova.virt.hardware [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.948 2 DEBUG nova.virt.hardware [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.948 2 DEBUG nova.virt.hardware [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.949 2 DEBUG nova.virt.hardware [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.949 2 DEBUG nova.virt.hardware [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.949 2 DEBUG nova.virt.hardware [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.950 2 DEBUG nova.virt.hardware [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.950 2 DEBUG nova.virt.hardware [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.950 2 DEBUG nova.virt.hardware [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.950 2 DEBUG nova.virt.hardware [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.954 2 DEBUG nova.virt.libvirt.vif [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1562978230',display_name='tempest-ServersTestJSON-server-1562978230',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1562978230',id=146,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-zygia2a5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tag
s=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:47Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=9e9408e2-5973-4e12-b904-711ea96bfbb2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "address": "fa:16:3e:2a:5a:b4", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16f2cb9f-30", "ovs_interfaceid": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.955 2 DEBUG nova.network.os_vif_util [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "address": "fa:16:3e:2a:5a:b4", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16f2cb9f-30", "ovs_interfaceid": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.956 2 DEBUG nova.network.os_vif_util [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5a:b4,bridge_name='br-int',has_traffic_filtering=True,id=16f2cb9f-304c-467e-aee4-6fb2ab771415,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16f2cb9f-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.959 2 DEBUG nova.objects.instance [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9e9408e2-5973-4e12-b904-711ea96bfbb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.976 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  <uuid>9e9408e2-5973-4e12-b904-711ea96bfbb2</uuid>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  <name>instance-00000092</name>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServersTestJSON-server-1562978230</nova:name>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:32:53</nova:creationTime>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:32:53 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:        <nova:user uuid="27daa263abb54d4d8e3ae34cd1c5ccf5">tempest-ServersTestJSON-1163535506-project-member</nova:user>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:        <nova:project uuid="a4a7099974504a798e1607c8e6a1f570">tempest-ServersTestJSON-1163535506</nova:project>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:        <nova:port uuid="16f2cb9f-304c-467e-aee4-6fb2ab771415">
Oct  2 08:32:53 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <entry name="serial">9e9408e2-5973-4e12-b904-711ea96bfbb2</entry>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <entry name="uuid">9e9408e2-5973-4e12-b904-711ea96bfbb2</entry>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk.config"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:2a:5a:b4"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <target dev="tap16f2cb9f-30"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/console.log" append="off"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:32:53 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:32:53 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:32:53 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:32:53 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.977 2 DEBUG nova.compute.manager [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Preparing to wait for external event network-vif-plugged-16f2cb9f-304c-467e-aee4-6fb2ab771415 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.978 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.978 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.978 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.979 2 DEBUG nova.virt.libvirt.vif [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:32:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1562978230',display_name='tempest-ServersTestJSON-server-1562978230',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1562978230',id=146,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-zygia2a5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:32:47Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=9e9408e2-5973-4e12-b904-711ea96bfbb2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "address": "fa:16:3e:2a:5a:b4", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16f2cb9f-30", "ovs_interfaceid": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.979 2 DEBUG nova.network.os_vif_util [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "address": "fa:16:3e:2a:5a:b4", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16f2cb9f-30", "ovs_interfaceid": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.980 2 DEBUG nova.network.os_vif_util [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5a:b4,bridge_name='br-int',has_traffic_filtering=True,id=16f2cb9f-304c-467e-aee4-6fb2ab771415,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16f2cb9f-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.980 2 DEBUG os_vif [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5a:b4,bridge_name='br-int',has_traffic_filtering=True,id=16f2cb9f-304c-467e-aee4-6fb2ab771415,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16f2cb9f-30') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.981 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.982 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.985 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16f2cb9f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.986 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap16f2cb9f-30, col_values=(('external_ids', {'iface-id': '16f2cb9f-304c-467e-aee4-6fb2ab771415', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2a:5a:b4', 'vm-uuid': '9e9408e2-5973-4e12-b904-711ea96bfbb2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:53 np0005466012 NetworkManager[51207]: <info>  [1759408373.9889] manager: (tap16f2cb9f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:53 np0005466012 nova_compute[192063]: 2025-10-02 12:32:53.997 2 INFO os_vif [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5a:b4,bridge_name='br-int',has_traffic_filtering=True,id=16f2cb9f-304c-467e-aee4-6fb2ab771415,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16f2cb9f-30')#033[00m
Oct  2 08:32:54 np0005466012 nova_compute[192063]: 2025-10-02 12:32:54.046 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:54 np0005466012 nova_compute[192063]: 2025-10-02 12:32:54.046 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:32:54 np0005466012 nova_compute[192063]: 2025-10-02 12:32:54.047 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No VIF found with MAC fa:16:3e:2a:5a:b4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:32:54 np0005466012 nova_compute[192063]: 2025-10-02 12:32:54.047 2 INFO nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Using config drive#033[00m
Oct  2 08:32:54 np0005466012 nova_compute[192063]: 2025-10-02 12:32:54.806 2 INFO nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Creating config drive at /var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk.config#033[00m
Oct  2 08:32:54 np0005466012 nova_compute[192063]: 2025-10-02 12:32:54.815 2 DEBUG oslo_concurrency.processutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp99b81sq0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:32:54 np0005466012 nova_compute[192063]: 2025-10-02 12:32:54.948 2 DEBUG oslo_concurrency.processutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp99b81sq0" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:32:55 np0005466012 kernel: tap16f2cb9f-30: entered promiscuous mode
Oct  2 08:32:55 np0005466012 NetworkManager[51207]: <info>  [1759408375.0357] manager: (tap16f2cb9f-30): new Tun device (/org/freedesktop/NetworkManager/Devices/263)
Oct  2 08:32:55 np0005466012 nova_compute[192063]: 2025-10-02 12:32:55.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:55Z|00570|binding|INFO|Claiming lport 16f2cb9f-304c-467e-aee4-6fb2ab771415 for this chassis.
Oct  2 08:32:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:55Z|00571|binding|INFO|16f2cb9f-304c-467e-aee4-6fb2ab771415: Claiming fa:16:3e:2a:5a:b4 10.100.0.8
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.044 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:5a:b4 10.100.0.8'], port_security=['fa:16:3e:2a:5a:b4 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9e9408e2-5973-4e12-b904-711ea96bfbb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a4a7099974504a798e1607c8e6a1f570', 'neutron:revision_number': '2', 'neutron:security_group_ids': '99e51855-93ef-45a8-a4a3-2b0a8aec1882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=498d5b4e-c711-4633-9705-7db30a0fb056, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=16f2cb9f-304c-467e-aee4-6fb2ab771415) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.045 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 16f2cb9f-304c-467e-aee4-6fb2ab771415 in datapath 1acf42c5-084c-4cc4-bdc5-910eec0249e3 bound to our chassis#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.046 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1acf42c5-084c-4cc4-bdc5-910eec0249e3#033[00m
Oct  2 08:32:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:55Z|00572|binding|INFO|Setting lport 16f2cb9f-304c-467e-aee4-6fb2ab771415 ovn-installed in OVS
Oct  2 08:32:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:55Z|00573|binding|INFO|Setting lport 16f2cb9f-304c-467e-aee4-6fb2ab771415 up in Southbound
Oct  2 08:32:55 np0005466012 nova_compute[192063]: 2025-10-02 12:32:55.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.060 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[83aec72b-206e-4837-b00a-2eb4fd9aabcc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.061 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1acf42c5-01 in ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.062 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1acf42c5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.062 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e64cf1db-910f-4164-98d7-d85754b1b379]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.065 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[883802aa-9fa7-43dc-a89f-24998a12267c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.079 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[80bc3a08-6682-4f78-8816-67dc0f118245]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 systemd-udevd[243987]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:32:55 np0005466012 systemd-machined[152114]: New machine qemu-68-instance-00000092.
Oct  2 08:32:55 np0005466012 NetworkManager[51207]: <info>  [1759408375.0969] device (tap16f2cb9f-30): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:32:55 np0005466012 NetworkManager[51207]: <info>  [1759408375.0976] device (tap16f2cb9f-30): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:32:55 np0005466012 systemd[1]: Started Virtual Machine qemu-68-instance-00000092.
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.105 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bfc4e234-a214-456e-ba51-3b91d79a0e30]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 podman[243953]: 2025-10-02 12:32:55.127621526 +0000 UTC m=+0.099576614 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.132 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[fef89a53-f1fa-4af5-8ae8-b2374baa19cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.137 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[976894b4-2573-45b0-80ab-aff00f2274aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 NetworkManager[51207]: <info>  [1759408375.1386] manager: (tap1acf42c5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/264)
Oct  2 08:32:55 np0005466012 podman[243954]: 2025-10-02 12:32:55.146603754 +0000 UTC m=+0.107357302 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.166 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[9befae57-7cbd-4a45-bd2d-a3dcda1d4168]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.169 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[b35dcab7-48e3-483d-83fc-73c5dd9d6daf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 NetworkManager[51207]: <info>  [1759408375.1909] device (tap1acf42c5-00): carrier: link connected
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.194 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[9cd4196f-b573-4f31-869d-76bd98a1fc89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.209 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1999f473-9953-4542-95d4-84b350351337]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1acf42c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:5b:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 176], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 636880, 'reachable_time': 15990, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244031, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.222 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cb67b118-d79e-4fa8-b1f8-0d2531c0bd6b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:5bcd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 636880, 'tstamp': 636880}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244032, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.238 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[daa4c212-775d-4210-bdec-7b497cd39f30]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1acf42c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:5b:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 176], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 636880, 'reachable_time': 15990, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244033, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.273 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[30ff4b06-26f6-4d10-bef4-bb794becc108]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.348 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0963ae52-9b08-4f99-a612-6f4739d4b8a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 kernel: tap1acf42c5-00: entered promiscuous mode
Oct  2 08:32:55 np0005466012 nova_compute[192063]: 2025-10-02 12:32:55.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:55 np0005466012 NetworkManager[51207]: <info>  [1759408375.3525] manager: (tap1acf42c5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.349 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1acf42c5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.349 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.350 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1acf42c5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:55 np0005466012 nova_compute[192063]: 2025-10-02 12:32:55.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.355 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1acf42c5-00, col_values=(('external_ids', {'iface-id': 'c198cb2e-a850-46e4-8295-a2f9c280ee53'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:32:55 np0005466012 nova_compute[192063]: 2025-10-02 12:32:55.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:32:55Z|00574|binding|INFO|Releasing lport c198cb2e-a850-46e4-8295-a2f9c280ee53 from this chassis (sb_readonly=0)
Oct  2 08:32:55 np0005466012 nova_compute[192063]: 2025-10-02 12:32:55.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.358 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.359 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[22c80f94-61b3-4398-af0f-ca93fc5cb024]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.360 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-1acf42c5-084c-4cc4-bdc5-910eec0249e3
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 1acf42c5-084c-4cc4-bdc5-910eec0249e3
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:32:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:32:55.360 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'env', 'PROCESS_TAG=haproxy-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1acf42c5-084c-4cc4-bdc5-910eec0249e3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:32:55 np0005466012 nova_compute[192063]: 2025-10-02 12:32:55.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:55 np0005466012 nova_compute[192063]: 2025-10-02 12:32:55.633 2 INFO nova.compute.manager [None req-6c66786e-6398-49d6-b5e8-c587bb4868d8 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Get console output#033[00m
Oct  2 08:32:55 np0005466012 nova_compute[192063]: 2025-10-02 12:32:55.643 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:32:55 np0005466012 podman[244065]: 2025-10-02 12:32:55.767181621 +0000 UTC m=+0.094463643 container create f4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:32:55 np0005466012 podman[244065]: 2025-10-02 12:32:55.693987069 +0000 UTC m=+0.021269101 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:32:55 np0005466012 systemd[1]: Started libpod-conmon-f4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27.scope.
Oct  2 08:32:55 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:32:55 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89ce635e3935eff182bb0d19ef8ae53a4327ec765bece1fc0a32e3b9041a78df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:32:55 np0005466012 podman[244065]: 2025-10-02 12:32:55.861497549 +0000 UTC m=+0.188779581 container init f4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:32:55 np0005466012 podman[244065]: 2025-10-02 12:32:55.866047396 +0000 UTC m=+0.193329388 container start f4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:32:55 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244080]: [NOTICE]   (244084) : New worker (244086) forked
Oct  2 08:32:55 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244080]: [NOTICE]   (244084) : Loading success.
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.160 2 DEBUG nova.compute.manager [req-b478328e-5ecc-44e6-b1ea-04ebdff14296 req-9c4f90cf-2d32-456e-98f2-81f1416b9e64 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Received event network-vif-plugged-16f2cb9f-304c-467e-aee4-6fb2ab771415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.161 2 DEBUG oslo_concurrency.lockutils [req-b478328e-5ecc-44e6-b1ea-04ebdff14296 req-9c4f90cf-2d32-456e-98f2-81f1416b9e64 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.161 2 DEBUG oslo_concurrency.lockutils [req-b478328e-5ecc-44e6-b1ea-04ebdff14296 req-9c4f90cf-2d32-456e-98f2-81f1416b9e64 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.162 2 DEBUG oslo_concurrency.lockutils [req-b478328e-5ecc-44e6-b1ea-04ebdff14296 req-9c4f90cf-2d32-456e-98f2-81f1416b9e64 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.162 2 DEBUG nova.compute.manager [req-b478328e-5ecc-44e6-b1ea-04ebdff14296 req-9c4f90cf-2d32-456e-98f2-81f1416b9e64 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Processing event network-vif-plugged-16f2cb9f-304c-467e-aee4-6fb2ab771415 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.217 2 DEBUG nova.network.neutron [req-3eabf721-9506-40d0-965c-06e38e142a8b req-5454e6b2-5563-42bf-a381-aef16b457bfe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Updated VIF entry in instance network info cache for port 16f2cb9f-304c-467e-aee4-6fb2ab771415. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.218 2 DEBUG nova.network.neutron [req-3eabf721-9506-40d0-965c-06e38e142a8b req-5454e6b2-5563-42bf-a381-aef16b457bfe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Updating instance_info_cache with network_info: [{"id": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "address": "fa:16:3e:2a:5a:b4", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16f2cb9f-30", "ovs_interfaceid": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.226 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408361.2243822, 1aa209df-0181-4837-8968-a256ec63b072 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.226 2 INFO nova.compute.manager [-] [instance: 1aa209df-0181-4837-8968-a256ec63b072] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.232 2 DEBUG oslo_concurrency.lockutils [req-3eabf721-9506-40d0-965c-06e38e142a8b req-5454e6b2-5563-42bf-a381-aef16b457bfe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-9e9408e2-5973-4e12-b904-711ea96bfbb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.243 2 DEBUG nova.compute.manager [None req-b999a7fd-47b7-42c6-86f5-3cdfef146257 - - - - - -] [instance: 1aa209df-0181-4837-8968-a256ec63b072] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.382 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408376.3821082, 9e9408e2-5973-4e12-b904-711ea96bfbb2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.383 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] VM Started (Lifecycle Event)#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.385 2 DEBUG nova.compute.manager [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.388 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.391 2 INFO nova.virt.libvirt.driver [-] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Instance spawned successfully.#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.391 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.423 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.427 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.435 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.436 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.436 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.437 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.437 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.438 2 DEBUG nova.virt.libvirt.driver [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.447 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.447 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408376.3822143, 9e9408e2-5973-4e12-b904-711ea96bfbb2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.448 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.480 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.484 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408376.3878212, 9e9408e2-5973-4e12-b904-711ea96bfbb2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.484 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.524 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.528 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.545 2 INFO nova.compute.manager [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Took 8.47 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.546 2 DEBUG nova.compute.manager [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.553 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.642 2 INFO nova.compute.manager [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Took 9.09 seconds to build instance.#033[00m
Oct  2 08:32:56 np0005466012 nova_compute[192063]: 2025-10-02 12:32:56.661 2 DEBUG oslo_concurrency.lockutils [None req-298ff680-ab8c-489d-a4be-8bcec3e71f4c 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:57 np0005466012 podman[244103]: 2025-10-02 12:32:57.13379446 +0000 UTC m=+0.045048212 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:32:57 np0005466012 podman[244102]: 2025-10-02 12:32:57.150714889 +0000 UTC m=+0.060731996 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 08:32:57 np0005466012 nova_compute[192063]: 2025-10-02 12:32:57.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:32:58 np0005466012 nova_compute[192063]: 2025-10-02 12:32:58.248 2 DEBUG nova.compute.manager [req-f40f8e00-88e6-4bec-8ee3-5bb90501c720 req-884a6036-6be8-4d60-b118-e2a485bbcad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Received event network-vif-plugged-16f2cb9f-304c-467e-aee4-6fb2ab771415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:32:58 np0005466012 nova_compute[192063]: 2025-10-02 12:32:58.249 2 DEBUG oslo_concurrency.lockutils [req-f40f8e00-88e6-4bec-8ee3-5bb90501c720 req-884a6036-6be8-4d60-b118-e2a485bbcad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:32:58 np0005466012 nova_compute[192063]: 2025-10-02 12:32:58.250 2 DEBUG oslo_concurrency.lockutils [req-f40f8e00-88e6-4bec-8ee3-5bb90501c720 req-884a6036-6be8-4d60-b118-e2a485bbcad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:32:58 np0005466012 nova_compute[192063]: 2025-10-02 12:32:58.250 2 DEBUG oslo_concurrency.lockutils [req-f40f8e00-88e6-4bec-8ee3-5bb90501c720 req-884a6036-6be8-4d60-b118-e2a485bbcad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:32:58 np0005466012 nova_compute[192063]: 2025-10-02 12:32:58.250 2 DEBUG nova.compute.manager [req-f40f8e00-88e6-4bec-8ee3-5bb90501c720 req-884a6036-6be8-4d60-b118-e2a485bbcad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] No waiting events found dispatching network-vif-plugged-16f2cb9f-304c-467e-aee4-6fb2ab771415 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:32:58 np0005466012 nova_compute[192063]: 2025-10-02 12:32:58.250 2 WARNING nova.compute.manager [req-f40f8e00-88e6-4bec-8ee3-5bb90501c720 req-884a6036-6be8-4d60-b118-e2a485bbcad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Received unexpected event network-vif-plugged-16f2cb9f-304c-467e-aee4-6fb2ab771415 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:32:58 np0005466012 nova_compute[192063]: 2025-10-02 12:32:58.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:01 np0005466012 nova_compute[192063]: 2025-10-02 12:33:01.037 2 DEBUG oslo_concurrency.lockutils [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "interface-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:01 np0005466012 nova_compute[192063]: 2025-10-02 12:33:01.042 2 DEBUG oslo_concurrency.lockutils [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "interface-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:01 np0005466012 nova_compute[192063]: 2025-10-02 12:33:01.043 2 DEBUG nova.objects.instance [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'flavor' on Instance uuid 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:01 np0005466012 nova_compute[192063]: 2025-10-02 12:33:01.395 2 DEBUG nova.objects.instance [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'pci_requests' on Instance uuid 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:01 np0005466012 nova_compute[192063]: 2025-10-02 12:33:01.414 2 DEBUG nova.network.neutron [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:33:01 np0005466012 nova_compute[192063]: 2025-10-02 12:33:01.928 2 DEBUG nova.policy [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:33:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:02.147 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:02.148 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:02.149 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:02 np0005466012 nova_compute[192063]: 2025-10-02 12:33:02.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:02 np0005466012 nova_compute[192063]: 2025-10-02 12:33:02.939 2 DEBUG nova.network.neutron [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Successfully created port: 41116686-5286-4561-953c-ea09c7785a04 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:33:03 np0005466012 nova_compute[192063]: 2025-10-02 12:33:03.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:04 np0005466012 nova_compute[192063]: 2025-10-02 12:33:04.104 2 DEBUG nova.network.neutron [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Successfully updated port: 41116686-5286-4561-953c-ea09c7785a04 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:33:04 np0005466012 nova_compute[192063]: 2025-10-02 12:33:04.120 2 DEBUG oslo_concurrency.lockutils [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:04 np0005466012 nova_compute[192063]: 2025-10-02 12:33:04.120 2 DEBUG oslo_concurrency.lockutils [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquired lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:04 np0005466012 nova_compute[192063]: 2025-10-02 12:33:04.121 2 DEBUG nova.network.neutron [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:33:04 np0005466012 nova_compute[192063]: 2025-10-02 12:33:04.217 2 DEBUG nova.compute.manager [req-17267a1d-5c99-47a4-8ffa-f5531fff1777 req-d8639e20-1a6f-4707-bb7e-eaf1bf280e8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-changed-41116686-5286-4561-953c-ea09c7785a04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:04 np0005466012 nova_compute[192063]: 2025-10-02 12:33:04.218 2 DEBUG nova.compute.manager [req-17267a1d-5c99-47a4-8ffa-f5531fff1777 req-d8639e20-1a6f-4707-bb7e-eaf1bf280e8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Refreshing instance network info cache due to event network-changed-41116686-5286-4561-953c-ea09c7785a04. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:33:04 np0005466012 nova_compute[192063]: 2025-10-02 12:33:04.219 2 DEBUG oslo_concurrency.lockutils [req-17267a1d-5c99-47a4-8ffa-f5531fff1777 req-d8639e20-1a6f-4707-bb7e-eaf1bf280e8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.338 2 DEBUG nova.network.neutron [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Updating instance_info_cache with network_info: [{"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.377 2 DEBUG oslo_concurrency.lockutils [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Releasing lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.378 2 DEBUG oslo_concurrency.lockutils [req-17267a1d-5c99-47a4-8ffa-f5531fff1777 req-d8639e20-1a6f-4707-bb7e-eaf1bf280e8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.378 2 DEBUG nova.network.neutron [req-17267a1d-5c99-47a4-8ffa-f5531fff1777 req-d8639e20-1a6f-4707-bb7e-eaf1bf280e8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Refreshing network info cache for port 41116686-5286-4561-953c-ea09c7785a04 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.381 2 DEBUG nova.virt.libvirt.vif [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-812719307',display_name='tempest-TestNetworkBasicOps-server-812719307',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-812719307',id=144,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPzOvNpPXvTmQKsObLW2frDaB/UcqyWCcFsS1qvDRXz/cgV1ShZV9OTafu7O/eewVEDnWETTAhCAil29eW434+g1I5APLU43WWEGvJbWxhqswixggtI5hb5OpFrphy6Etg==',key_name='tempest-TestNetworkBasicOps-865732227',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-kot4hpaa',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:32:38Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=4b2b0338-e64b-41eb-8902-3d7a95c6ffb1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.382 2 DEBUG nova.network.os_vif_util [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.383 2 DEBUG nova.network.os_vif_util [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:ec:45,bridge_name='br-int',has_traffic_filtering=True,id=41116686-5286-4561-953c-ea09c7785a04,network=Network(44d3fec7-3557-4638-b491-fade5377689b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41116686-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.383 2 DEBUG os_vif [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:ec:45,bridge_name='br-int',has_traffic_filtering=True,id=41116686-5286-4561-953c-ea09c7785a04,network=Network(44d3fec7-3557-4638-b491-fade5377689b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41116686-52') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.384 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.385 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.388 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41116686-52, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.389 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap41116686-52, col_values=(('external_ids', {'iface-id': '41116686-5286-4561-953c-ea09c7785a04', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:ec:45', 'vm-uuid': '4b2b0338-e64b-41eb-8902-3d7a95c6ffb1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:06 np0005466012 NetworkManager[51207]: <info>  [1759408386.3924] manager: (tap41116686-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.404 2 INFO os_vif [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:ec:45,bridge_name='br-int',has_traffic_filtering=True,id=41116686-5286-4561-953c-ea09c7785a04,network=Network(44d3fec7-3557-4638-b491-fade5377689b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41116686-52')#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.405 2 DEBUG nova.virt.libvirt.vif [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-812719307',display_name='tempest-TestNetworkBasicOps-server-812719307',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-812719307',id=144,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPzOvNpPXvTmQKsObLW2frDaB/UcqyWCcFsS1qvDRXz/cgV1ShZV9OTafu7O/eewVEDnWETTAhCAil29eW434+g1I5APLU43WWEGvJbWxhqswixggtI5hb5OpFrphy6Etg==',key_name='tempest-TestNetworkBasicOps-865732227',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-kot4hpaa',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:32:38Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=4b2b0338-e64b-41eb-8902-3d7a95c6ffb1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.405 2 DEBUG nova.network.os_vif_util [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.406 2 DEBUG nova.network.os_vif_util [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:ec:45,bridge_name='br-int',has_traffic_filtering=True,id=41116686-5286-4561-953c-ea09c7785a04,network=Network(44d3fec7-3557-4638-b491-fade5377689b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41116686-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.410 2 DEBUG nova.virt.libvirt.guest [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] attach device xml: <interface type="ethernet">
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:45:ec:45"/>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  <target dev="tap41116686-52"/>
Oct  2 08:33:06 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:33:06 np0005466012 nova_compute[192063]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  2 08:33:06 np0005466012 kernel: tap41116686-52: entered promiscuous mode
Oct  2 08:33:06 np0005466012 NetworkManager[51207]: <info>  [1759408386.4245] manager: (tap41116686-52): new Tun device (/org/freedesktop/NetworkManager/Devices/267)
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:06 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:06Z|00575|binding|INFO|Claiming lport 41116686-5286-4561-953c-ea09c7785a04 for this chassis.
Oct  2 08:33:06 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:06Z|00576|binding|INFO|41116686-5286-4561-953c-ea09c7785a04: Claiming fa:16:3e:45:ec:45 10.100.0.28
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.438 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:ec:45 10.100.0.28'], port_security=['fa:16:3e:45:ec:45 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': '4b2b0338-e64b-41eb-8902-3d7a95c6ffb1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44d3fec7-3557-4638-b491-fade5377689b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1aab0b39-6daf-41d1-a7da-b7bb077ff5e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4daa1b5a-b32b-47bb-9692-e7597d2ee21f, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=41116686-5286-4561-953c-ea09c7785a04) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.439 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 41116686-5286-4561-953c-ea09c7785a04 in datapath 44d3fec7-3557-4638-b491-fade5377689b bound to our chassis#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.440 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44d3fec7-3557-4638-b491-fade5377689b#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.452 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[864fc436-ec12-489b-a1b7-60573a75df8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.453 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap44d3fec7-31 in ovnmeta-44d3fec7-3557-4638-b491-fade5377689b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:33:06 np0005466012 systemd-udevd[244150]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.455 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap44d3fec7-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.455 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd74f14-5207-4ad1-9908-94eb06aa35c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.457 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[67e3b1af-502c-4a29-9ddc-9fe611cfedfa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 NetworkManager[51207]: <info>  [1759408386.4715] device (tap41116686-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:33:06 np0005466012 NetworkManager[51207]: <info>  [1759408386.4730] device (tap41116686-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:33:06 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:06Z|00577|binding|INFO|Setting lport 41116686-5286-4561-953c-ea09c7785a04 up in Southbound
Oct  2 08:33:06 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:06Z|00578|binding|INFO|Setting lport 41116686-5286-4561-953c-ea09c7785a04 ovn-installed in OVS
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.470 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[56865539-0871-416e-9dc1-f81f2b7d355a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.490 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[39bf7e66-2577-4202-aa0d-c8527e938431]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.527 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8a0f20-4f2d-4774-be32-51a0dbbd711e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 NetworkManager[51207]: <info>  [1759408386.5322] manager: (tap44d3fec7-30): new Veth device (/org/freedesktop/NetworkManager/Devices/268)
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.531 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[440d2f84-4ca9-4633-a798-2e1136c20b2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.553 2 DEBUG nova.virt.libvirt.driver [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.553 2 DEBUG nova.virt.libvirt.driver [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.554 2 DEBUG nova.virt.libvirt.driver [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No VIF found with MAC fa:16:3e:0e:af:b2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.554 2 DEBUG nova.virt.libvirt.driver [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No VIF found with MAC fa:16:3e:45:ec:45, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.568 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[e25c3f12-73ef-4d07-9d93-46bcf5e7b287]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.574 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ca000793-71f0-4629-b0cb-4f09d80f1d37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 NetworkManager[51207]: <info>  [1759408386.5963] device (tap44d3fec7-30): carrier: link connected
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.599 2 DEBUG nova.virt.libvirt.guest [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  <nova:name>tempest-TestNetworkBasicOps-server-812719307</nova:name>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:33:06</nova:creationTime>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:33:06 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:    <nova:port uuid="d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef">
Oct  2 08:33:06 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:    <nova:port uuid="41116686-5286-4561-953c-ea09c7785a04">
Oct  2 08:33:06 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:33:06 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:33:06 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:33:06 np0005466012 nova_compute[192063]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.607 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[dd9432f8-e478-400f-bd24-4f2329281d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.624 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[73ac1354-41a8-4516-b3e0-62bb96dfad2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44d3fec7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:52:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638020, 'reachable_time': 25611, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244176, 'error': None, 'target': 'ovnmeta-44d3fec7-3557-4638-b491-fade5377689b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.640 2 DEBUG oslo_concurrency.lockutils [None req-0cc1c354-9b83-495c-bf6f-038f49520f19 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "interface-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.640 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a1b2cfbc-a5a4-4118-81fc-28f0d22082d8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:52d3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 638020, 'tstamp': 638020}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244177, 'error': None, 'target': 'ovnmeta-44d3fec7-3557-4638-b491-fade5377689b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.659 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[48b7c345-b9dd-4e14-bfde-77e1d04c336a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44d3fec7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:52:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638020, 'reachable_time': 25611, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244178, 'error': None, 'target': 'ovnmeta-44d3fec7-3557-4638-b491-fade5377689b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.692 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d994b78c-31a2-43d2-bbfe-66df27917685]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.753 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9ec4bbee-724c-4d47-9d4a-f36e339db383]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.755 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44d3fec7-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.755 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.755 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44d3fec7-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:06 np0005466012 kernel: tap44d3fec7-30: entered promiscuous mode
Oct  2 08:33:06 np0005466012 NetworkManager[51207]: <info>  [1759408386.7592] manager: (tap44d3fec7-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.761 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44d3fec7-30, col_values=(('external_ids', {'iface-id': '18aec6b6-c713-4d47-a438-c1bf0bf4d42a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:06 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:06Z|00579|binding|INFO|Releasing lport 18aec6b6-c713-4d47-a438-c1bf0bf4d42a from this chassis (sb_readonly=0)
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.779 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44d3fec7-3557-4638-b491-fade5377689b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44d3fec7-3557-4638-b491-fade5377689b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:33:06 np0005466012 nova_compute[192063]: 2025-10-02 12:33:06.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.780 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1299fd4a-00f8-4f32-ae74-37bc8b37d12e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.781 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-44d3fec7-3557-4638-b491-fade5377689b
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/44d3fec7-3557-4638-b491-fade5377689b.pid.haproxy
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 44d3fec7-3557-4638-b491-fade5377689b
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:33:06 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:06.782 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-44d3fec7-3557-4638-b491-fade5377689b', 'env', 'PROCESS_TAG=haproxy-44d3fec7-3557-4638-b491-fade5377689b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/44d3fec7-3557-4638-b491-fade5377689b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:33:07 np0005466012 podman[244210]: 2025-10-02 12:33:07.173119802 +0000 UTC m=+0.052893960 container create 8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:33:07 np0005466012 systemd[1]: Started libpod-conmon-8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce.scope.
Oct  2 08:33:07 np0005466012 podman[244210]: 2025-10-02 12:33:07.146484592 +0000 UTC m=+0.026258770 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:33:07 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:33:07 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/416fd636e733be72dd9965c364fc87b60ac934f55de8d842d4280acb53e43d42/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:33:07 np0005466012 podman[244210]: 2025-10-02 12:33:07.276728258 +0000 UTC m=+0.156502436 container init 8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:33:07 np0005466012 podman[244210]: 2025-10-02 12:33:07.288666529 +0000 UTC m=+0.168440687 container start 8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:33:07 np0005466012 neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b[244225]: [NOTICE]   (244229) : New worker (244231) forked
Oct  2 08:33:07 np0005466012 neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b[244225]: [NOTICE]   (244229) : Loading success.
Oct  2 08:33:07 np0005466012 nova_compute[192063]: 2025-10-02 12:33:07.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:07Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:ec:45 10.100.0.28
Oct  2 08:33:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:07Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:ec:45 10.100.0.28
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.089 2 DEBUG nova.network.neutron [req-17267a1d-5c99-47a4-8ffa-f5531fff1777 req-d8639e20-1a6f-4707-bb7e-eaf1bf280e8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Updated VIF entry in instance network info cache for port 41116686-5286-4561-953c-ea09c7785a04. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.090 2 DEBUG nova.network.neutron [req-17267a1d-5c99-47a4-8ffa-f5531fff1777 req-d8639e20-1a6f-4707-bb7e-eaf1bf280e8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Updating instance_info_cache with network_info: [{"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.117 2 DEBUG oslo_concurrency.lockutils [req-17267a1d-5c99-47a4-8ffa-f5531fff1777 req-d8639e20-1a6f-4707-bb7e-eaf1bf280e8b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.341 2 DEBUG oslo_concurrency.lockutils [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "interface-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-41116686-5286-4561-953c-ea09c7785a04" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.342 2 DEBUG oslo_concurrency.lockutils [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "interface-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-41116686-5286-4561-953c-ea09c7785a04" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.373 2 DEBUG nova.objects.instance [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'flavor' on Instance uuid 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.399 2 DEBUG nova.virt.libvirt.vif [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-812719307',display_name='tempest-TestNetworkBasicOps-server-812719307',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-812719307',id=144,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPzOvNpPXvTmQKsObLW2frDaB/UcqyWCcFsS1qvDRXz/cgV1ShZV9OTafu7O/eewVEDnWETTAhCAil29eW434+g1I5APLU43WWEGvJbWxhqswixggtI5hb5OpFrphy6Etg==',key_name='tempest-TestNetworkBasicOps-865732227',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-kot4hpaa',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:32:38Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=4b2b0338-e64b-41eb-8902-3d7a95c6ffb1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.400 2 DEBUG nova.network.os_vif_util [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.400 2 DEBUG nova.network.os_vif_util [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:ec:45,bridge_name='br-int',has_traffic_filtering=True,id=41116686-5286-4561-953c-ea09c7785a04,network=Network(44d3fec7-3557-4638-b491-fade5377689b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41116686-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.403 2 DEBUG nova.virt.libvirt.guest [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:45:ec:45"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap41116686-52"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.405 2 DEBUG nova.virt.libvirt.guest [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:45:ec:45"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap41116686-52"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.407 2 DEBUG nova.virt.libvirt.driver [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Attempting to detach device tap41116686-52 from instance 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.408 2 DEBUG nova.virt.libvirt.guest [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:45:ec:45"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <target dev="tap41116686-52"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:33:08 np0005466012 nova_compute[192063]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.417 2 DEBUG nova.virt.libvirt.guest [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:45:ec:45"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap41116686-52"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.420 2 DEBUG nova.virt.libvirt.guest [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:45:ec:45"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap41116686-52"/></interface>not found in domain: <domain type='kvm' id='67'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <name>instance-00000090</name>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <uuid>4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</uuid>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:name>tempest-TestNetworkBasicOps-server-812719307</nova:name>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:33:06</nova:creationTime>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:port uuid="d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef">
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:port uuid="41116686-5286-4561-953c-ea09c7785a04">
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:33:08 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <memory unit='KiB'>131072</memory>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <resource>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <partition>/machine</partition>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </resource>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <sysinfo type='smbios'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <entry name='serial'>4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</entry>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <entry name='uuid'>4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</entry>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <boot dev='hd'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <smbios mode='sysinfo'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <vmcoreinfo state='on'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <feature policy='require' name='x2apic'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <feature policy='require' name='vme'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <clock offset='utc'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <timer name='hpet' present='no'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <on_reboot>restart</on_reboot>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <on_crash>destroy</on_crash>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <disk type='file' device='disk'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk' index='2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <backingStore type='file' index='3'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:        <format type='raw'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:        <backingStore/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      </backingStore>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target dev='vda' bus='virtio'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='virtio-disk0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <disk type='file' device='cdrom'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk.config' index='1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <backingStore/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target dev='sda' bus='sata'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <readonly/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='sata0-0-0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pcie.0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='1' port='0x10'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='2' port='0x11'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='3' port='0x12'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.3'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='4' port='0x13'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.4'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='5' port='0x14'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.5'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='6' port='0x15'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.6'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='7' port='0x16'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.7'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='8' port='0x17'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.8'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='9' port='0x18'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.9'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='10' port='0x19'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.10'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='11' port='0x1a'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.11'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='12' port='0x1b'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.12'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='13' port='0x1c'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.13'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='14' port='0x1d'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.14'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='15' port='0x1e'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.15'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='16' port='0x1f'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.16'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='17' port='0x20'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.17'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='18' port='0x21'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.18'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='19' port='0x22'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.19'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='20' port='0x23'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.20'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='21' port='0x24'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.21'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='22' port='0x25'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.22'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='23' port='0x26'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.23'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='24' port='0x27'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.24'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='25' port='0x28'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.25'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-pci-bridge'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.26'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='usb'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='sata' index='0'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='ide'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:0e:af:b2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target dev='tapd8ac1c56-cb'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='net0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:45:ec:45'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target dev='tap41116686-52'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='net1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <serial type='pty'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <source path='/dev/pts/1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/console.log' append='off'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target type='isa-serial' port='0'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:        <model name='isa-serial'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      </target>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <console type='pty' tty='/dev/pts/1'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <source path='/dev/pts/1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/console.log' append='off'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target type='serial' port='0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </console>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <input type='tablet' bus='usb'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='input0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <input type='mouse' bus='ps2'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='input1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <input type='keyboard' bus='ps2'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='input2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <listen type='address' address='::0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <audio id='1' type='none'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='video0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <watchdog model='itco' action='reset'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='watchdog0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </watchdog>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <memballoon model='virtio'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <stats period='10'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='balloon0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <rng model='virtio'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='rng0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <label>system_u:system_r:svirt_t:s0:c488,c547</label>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c488,c547</imagelabel>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <label>+107:+107</label>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:33:08 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:33:08 np0005466012 nova_compute[192063]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.420 2 INFO nova.virt.libvirt.driver [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully detached device tap41116686-52 from instance 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 from the persistent domain config.
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.421 2 DEBUG nova.virt.libvirt.driver [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] (1/8): Attempting to detach device tap41116686-52 with device alias net1 from instance 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.421 2 DEBUG nova.virt.libvirt.guest [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] detach device xml: <interface type="ethernet">
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <mac address="fa:16:3e:45:ec:45"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <model type="virtio"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <mtu size="1442"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <target dev="tap41116686-52"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]: </interface>
Oct  2 08:33:08 np0005466012 nova_compute[192063]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct  2 08:33:08 np0005466012 kernel: tap41116686-52 (unregistering): left promiscuous mode
Oct  2 08:33:08 np0005466012 NetworkManager[51207]: <info>  [1759408388.5418] device (tap41116686-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:33:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:08Z|00580|binding|INFO|Releasing lport 41116686-5286-4561-953c-ea09c7785a04 from this chassis (sb_readonly=0)
Oct  2 08:33:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:08Z|00581|binding|INFO|Setting lport 41116686-5286-4561-953c-ea09c7785a04 down in Southbound
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:08Z|00582|binding|INFO|Removing iface tap41116686-52 ovn-installed in OVS
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.558 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:ec:45 10.100.0.28'], port_security=['fa:16:3e:45:ec:45 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': '4b2b0338-e64b-41eb-8902-3d7a95c6ffb1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44d3fec7-3557-4638-b491-fade5377689b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1aab0b39-6daf-41d1-a7da-b7bb077ff5e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4daa1b5a-b32b-47bb-9692-e7597d2ee21f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=41116686-5286-4561-953c-ea09c7785a04) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.559 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 41116686-5286-4561-953c-ea09c7785a04 in datapath 44d3fec7-3557-4638-b491-fade5377689b unbound from our chassis
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.560 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44d3fec7-3557-4638-b491-fade5377689b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.562 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b399074f-798b-44b4-9922-0433beac78a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.563 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-44d3fec7-3557-4638-b491-fade5377689b namespace which is not needed anymore
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.563 2 DEBUG nova.virt.libvirt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Received event <DeviceRemovedEvent: 1759408388.5633638, 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.564 2 DEBUG nova.virt.libvirt.driver [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Start waiting for the detach event from libvirt for device tap41116686-52 with device alias net1 for instance 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.565 2 DEBUG nova.virt.libvirt.guest [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:45:ec:45"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap41116686-52"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.567 2 DEBUG nova.virt.libvirt.guest [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:45:ec:45"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap41116686-52"/></interface>not found in domain: <domain type='kvm' id='67'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <name>instance-00000090</name>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <uuid>4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</uuid>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:name>tempest-TestNetworkBasicOps-server-812719307</nova:name>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:33:06</nova:creationTime>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:port uuid="d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef">
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:port uuid="41116686-5286-4561-953c-ea09c7785a04">
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:33:08 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <memory unit='KiB'>131072</memory>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <resource>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <partition>/machine</partition>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </resource>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <sysinfo type='smbios'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <entry name='serial'>4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</entry>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <entry name='uuid'>4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</entry>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <boot dev='hd'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <smbios mode='sysinfo'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <vmcoreinfo state='on'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <feature policy='require' name='x2apic'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <feature policy='require' name='vme'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <clock offset='utc'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <timer name='hpet' present='no'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <on_reboot>restart</on_reboot>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <on_crash>destroy</on_crash>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <disk type='file' device='disk'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk' index='2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <backingStore type='file' index='3'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:        <format type='raw'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:        <backingStore/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      </backingStore>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target dev='vda' bus='virtio'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='virtio-disk0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <disk type='file' device='cdrom'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk.config' index='1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <backingStore/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target dev='sda' bus='sata'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <readonly/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='sata0-0-0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pcie.0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='1' port='0x10'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='2' port='0x11'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='3' port='0x12'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.3'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='4' port='0x13'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.4'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='5' port='0x14'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.5'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='6' port='0x15'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.6'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='7' port='0x16'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.7'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='8' port='0x17'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.8'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='9' port='0x18'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.9'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='10' port='0x19'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.10'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='11' port='0x1a'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.11'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='12' port='0x1b'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.12'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='13' port='0x1c'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.13'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='14' port='0x1d'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.14'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='15' port='0x1e'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.15'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='16' port='0x1f'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.16'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='17' port='0x20'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.17'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='18' port='0x21'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.18'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='19' port='0x22'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.19'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='20' port='0x23'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.20'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='21' port='0x24'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.21'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='22' port='0x25'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.22'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='23' port='0x26'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.23'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='24' port='0x27'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.24'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target chassis='25' port='0x28'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.25'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model name='pcie-pci-bridge'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='pci.26'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='usb'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <controller type='sata' index='0'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='ide'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:0e:af:b2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target dev='tapd8ac1c56-cb'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='net0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <serial type='pty'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <source path='/dev/pts/1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/console.log' append='off'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target type='isa-serial' port='0'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:        <model name='isa-serial'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      </target>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <console type='pty' tty='/dev/pts/1'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <source path='/dev/pts/1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/console.log' append='off'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <target type='serial' port='0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </console>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <input type='tablet' bus='usb'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='input0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <input type='mouse' bus='ps2'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='input1'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <input type='keyboard' bus='ps2'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='input2'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <listen type='address' address='::0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <audio id='1' type='none'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='video0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <watchdog model='itco' action='reset'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='watchdog0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </watchdog>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <memballoon model='virtio'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <stats period='10'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='balloon0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <rng model='virtio'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <alias name='rng0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <label>system_u:system_r:svirt_t:s0:c488,c547</label>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c488,c547</imagelabel>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <label>+107:+107</label>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:33:08 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:33:08 np0005466012 nova_compute[192063]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.567 2 INFO nova.virt.libvirt.driver [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully detached device tap41116686-52 from instance 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 from the live domain config.#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.568 2 DEBUG nova.virt.libvirt.vif [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-812719307',display_name='tempest-TestNetworkBasicOps-server-812719307',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-812719307',id=144,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPzOvNpPXvTmQKsObLW2frDaB/UcqyWCcFsS1qvDRXz/cgV1ShZV9OTafu7O/eewVEDnWETTAhCAil29eW434+g1I5APLU43WWEGvJbWxhqswixggtI5hb5OpFrphy6Etg==',key_name='tempest-TestNetworkBasicOps-865732227',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-kot4hpaa',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:32:38Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=4b2b0338-e64b-41eb-8902-3d7a95c6ffb1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.569 2 DEBUG nova.network.os_vif_util [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.569 2 DEBUG nova.network.os_vif_util [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:ec:45,bridge_name='br-int',has_traffic_filtering=True,id=41116686-5286-4561-953c-ea09c7785a04,network=Network(44d3fec7-3557-4638-b491-fade5377689b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41116686-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.570 2 DEBUG os_vif [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:ec:45,bridge_name='br-int',has_traffic_filtering=True,id=41116686-5286-4561-953c-ea09c7785a04,network=Network(44d3fec7-3557-4638-b491-fade5377689b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41116686-52') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.572 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41116686-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.578 2 INFO os_vif [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:ec:45,bridge_name='br-int',has_traffic_filtering=True,id=41116686-5286-4561-953c-ea09c7785a04,network=Network(44d3fec7-3557-4638-b491-fade5377689b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41116686-52')#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.579 2 DEBUG nova.virt.libvirt.guest [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:name>tempest-TestNetworkBasicOps-server-812719307</nova:name>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:33:08</nova:creationTime>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    <nova:port uuid="d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef">
Oct  2 08:33:08 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:33:08 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:33:08 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:33:08 np0005466012 nova_compute[192063]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:33:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:08Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2a:5a:b4 10.100.0.8
Oct  2 08:33:08 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:08Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2a:5a:b4 10.100.0.8
Oct  2 08:33:08 np0005466012 neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b[244225]: [NOTICE]   (244229) : haproxy version is 2.8.14-c23fe91
Oct  2 08:33:08 np0005466012 neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b[244225]: [NOTICE]   (244229) : path to executable is /usr/sbin/haproxy
Oct  2 08:33:08 np0005466012 neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b[244225]: [WARNING]  (244229) : Exiting Master process...
Oct  2 08:33:08 np0005466012 neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b[244225]: [ALERT]    (244229) : Current worker (244231) exited with code 143 (Terminated)
Oct  2 08:33:08 np0005466012 neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b[244225]: [WARNING]  (244229) : All workers exited. Exiting... (0)
Oct  2 08:33:08 np0005466012 systemd[1]: libpod-8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce.scope: Deactivated successfully.
Oct  2 08:33:08 np0005466012 podman[244276]: 2025-10-02 12:33:08.68921061 +0000 UTC m=+0.043346504 container died 8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:33:08 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce-userdata-shm.mount: Deactivated successfully.
Oct  2 08:33:08 np0005466012 systemd[1]: var-lib-containers-storage-overlay-416fd636e733be72dd9965c364fc87b60ac934f55de8d842d4280acb53e43d42-merged.mount: Deactivated successfully.
Oct  2 08:33:08 np0005466012 podman[244276]: 2025-10-02 12:33:08.725428065 +0000 UTC m=+0.079563959 container cleanup 8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:33:08 np0005466012 systemd[1]: libpod-conmon-8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce.scope: Deactivated successfully.
Oct  2 08:33:08 np0005466012 podman[244307]: 2025-10-02 12:33:08.77708859 +0000 UTC m=+0.034026186 container remove 8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.781 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[900d4763-9231-48c8-ba2c-66a5cd3f296b]: (4, ('Thu Oct  2 12:33:08 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b (8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce)\n8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce\nThu Oct  2 12:33:08 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-44d3fec7-3557-4638-b491-fade5377689b (8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce)\n8f9648de474a2c6681ad2794f37b23ab95393985a21b9bf50e247cf7c55861ce\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.783 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1fdb33f9-a378-439c-a6ca-bef707837c71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.783 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44d3fec7-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:08 np0005466012 kernel: tap44d3fec7-30: left promiscuous mode
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.789 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6f939b47-2440-4ccc-ac05-392e5bb2ed07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.809 2 DEBUG nova.compute.manager [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-vif-plugged-41116686-5286-4561-953c-ea09c7785a04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.810 2 DEBUG oslo_concurrency.lockutils [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.810 2 DEBUG oslo_concurrency.lockutils [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.810 2 DEBUG oslo_concurrency.lockutils [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.810 2 DEBUG nova.compute.manager [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] No waiting events found dispatching network-vif-plugged-41116686-5286-4561-953c-ea09c7785a04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.811 2 WARNING nova.compute.manager [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received unexpected event network-vif-plugged-41116686-5286-4561-953c-ea09c7785a04 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.811 2 DEBUG nova.compute.manager [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-vif-plugged-41116686-5286-4561-953c-ea09c7785a04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.811 2 DEBUG oslo_concurrency.lockutils [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.811 2 DEBUG oslo_concurrency.lockutils [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.812 2 DEBUG oslo_concurrency.lockutils [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.812 2 DEBUG nova.compute.manager [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] No waiting events found dispatching network-vif-plugged-41116686-5286-4561-953c-ea09c7785a04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.812 2 WARNING nova.compute.manager [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received unexpected event network-vif-plugged-41116686-5286-4561-953c-ea09c7785a04 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.812 2 DEBUG nova.compute.manager [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-vif-unplugged-41116686-5286-4561-953c-ea09c7785a04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.812 2 DEBUG oslo_concurrency.lockutils [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.813 2 DEBUG oslo_concurrency.lockutils [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.813 2 DEBUG oslo_concurrency.lockutils [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.813 2 DEBUG nova.compute.manager [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] No waiting events found dispatching network-vif-unplugged-41116686-5286-4561-953c-ea09c7785a04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:08 np0005466012 nova_compute[192063]: 2025-10-02 12:33:08.813 2 WARNING nova.compute.manager [req-8daf494c-ab84-48c5-9e00-b878e595ece7 req-1eab3db3-c2bd-404b-8b62-a11efa6e8a4c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received unexpected event network-vif-unplugged-41116686-5286-4561-953c-ea09c7785a04 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.830 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2892da9a-3cc9-4327-92f9-69448e5de4d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.831 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb6377a-b182-4307-b4bd-03fa33b3f997]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.845 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ffb93ad7-d641-49bd-8179-88943d338157]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638013, 'reachable_time': 38850, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244322, 'error': None, 'target': 'ovnmeta-44d3fec7-3557-4638-b491-fade5377689b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.848 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-44d3fec7-3557-4638-b491-fade5377689b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:33:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:08.848 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[5f03abb0-d970-457c-8843-4f5262289367]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:08 np0005466012 systemd[1]: run-netns-ovnmeta\x2d44d3fec7\x2d3557\x2d4638\x2db491\x2dfade5377689b.mount: Deactivated successfully.
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.699 2 DEBUG oslo_concurrency.lockutils [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.700 2 DEBUG oslo_concurrency.lockutils [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquired lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.700 2 DEBUG nova.network.neutron [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.783 2 DEBUG nova.compute.manager [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-vif-deleted-41116686-5286-4561-953c-ea09c7785a04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.784 2 INFO nova.compute.manager [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Neutron deleted interface 41116686-5286-4561-953c-ea09c7785a04; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.784 2 DEBUG nova.network.neutron [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Updating instance_info_cache with network_info: [{"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.803 2 DEBUG nova.objects.instance [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lazy-loading 'system_metadata' on Instance uuid 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.830 2 DEBUG nova.objects.instance [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lazy-loading 'flavor' on Instance uuid 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.847 2 DEBUG nova.virt.libvirt.vif [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-812719307',display_name='tempest-TestNetworkBasicOps-server-812719307',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-812719307',id=144,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPzOvNpPXvTmQKsObLW2frDaB/UcqyWCcFsS1qvDRXz/cgV1ShZV9OTafu7O/eewVEDnWETTAhCAil29eW434+g1I5APLU43WWEGvJbWxhqswixggtI5hb5OpFrphy6Etg==',key_name='tempest-TestNetworkBasicOps-865732227',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-kot4hpaa',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:32:38Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=4b2b0338-e64b-41eb-8902-3d7a95c6ffb1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.847 2 DEBUG nova.network.os_vif_util [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converting VIF {"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.848 2 DEBUG nova.network.os_vif_util [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:ec:45,bridge_name='br-int',has_traffic_filtering=True,id=41116686-5286-4561-953c-ea09c7785a04,network=Network(44d3fec7-3557-4638-b491-fade5377689b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41116686-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.852 2 DEBUG nova.virt.libvirt.guest [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:45:ec:45"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap41116686-52"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.855 2 DEBUG nova.virt.libvirt.guest [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:45:ec:45"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap41116686-52"/></interface>not found in domain: <domain type='kvm' id='67'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <name>instance-00000090</name>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <uuid>4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</uuid>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:name>tempest-TestNetworkBasicOps-server-812719307</nova:name>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:33:08</nova:creationTime>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:port uuid="d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef">
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:33:09 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <memory unit='KiB'>131072</memory>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <resource>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <partition>/machine</partition>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </resource>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <sysinfo type='smbios'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <entry name='serial'>4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</entry>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <entry name='uuid'>4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</entry>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <boot dev='hd'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <smbios mode='sysinfo'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <vmcoreinfo state='on'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <feature policy='require' name='x2apic'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <feature policy='require' name='vme'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <clock offset='utc'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <timer name='hpet' present='no'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <on_reboot>restart</on_reboot>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <on_crash>destroy</on_crash>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <disk type='file' device='disk'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk' index='2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <backingStore type='file' index='3'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:        <format type='raw'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:        <backingStore/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      </backingStore>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target dev='vda' bus='virtio'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='virtio-disk0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <disk type='file' device='cdrom'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk.config' index='1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <backingStore/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target dev='sda' bus='sata'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <readonly/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='sata0-0-0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pcie.0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='1' port='0x10'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='2' port='0x11'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='3' port='0x12'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.3'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='4' port='0x13'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.4'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='5' port='0x14'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.5'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='6' port='0x15'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.6'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='7' port='0x16'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.7'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='8' port='0x17'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.8'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='9' port='0x18'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.9'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='10' port='0x19'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.10'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='11' port='0x1a'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.11'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='12' port='0x1b'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.12'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='13' port='0x1c'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.13'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='14' port='0x1d'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.14'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='15' port='0x1e'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.15'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='16' port='0x1f'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.16'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='17' port='0x20'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.17'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='18' port='0x21'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.18'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='19' port='0x22'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.19'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='20' port='0x23'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.20'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='21' port='0x24'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.21'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='22' port='0x25'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.22'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='23' port='0x26'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.23'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='24' port='0x27'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.24'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='25' port='0x28'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.25'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-pci-bridge'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.26'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='usb'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='sata' index='0'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='ide'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:0e:af:b2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target dev='tapd8ac1c56-cb'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='net0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <serial type='pty'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <source path='/dev/pts/1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/console.log' append='off'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target type='isa-serial' port='0'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:        <model name='isa-serial'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      </target>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <console type='pty' tty='/dev/pts/1'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <source path='/dev/pts/1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/console.log' append='off'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target type='serial' port='0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </console>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <input type='tablet' bus='usb'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='input0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <input type='mouse' bus='ps2'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='input1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <input type='keyboard' bus='ps2'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='input2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <listen type='address' address='::0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <audio id='1' type='none'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='video0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <watchdog model='itco' action='reset'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='watchdog0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </watchdog>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <memballoon model='virtio'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <stats period='10'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='balloon0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <rng model='virtio'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='rng0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <label>system_u:system_r:svirt_t:s0:c488,c547</label>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c488,c547</imagelabel>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <label>+107:+107</label>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:33:09 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:33:09 np0005466012 nova_compute[192063]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.855 2 DEBUG nova.virt.libvirt.guest [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:45:ec:45"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap41116686-52"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.860 2 DEBUG nova.virt.libvirt.guest [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:45:ec:45"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap41116686-52"/></interface> not found in domain: <domain type='kvm' id='67'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <name>instance-00000090</name>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <uuid>4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</uuid>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:name>tempest-TestNetworkBasicOps-server-812719307</nova:name>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:33:08</nova:creationTime>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:port uuid="d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef">
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:33:09 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <memory unit='KiB'>131072</memory>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <vcpu placement='static'>1</vcpu>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <resource>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <partition>/machine</partition>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </resource>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <sysinfo type='smbios'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <entry name='manufacturer'>RDO</entry>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <entry name='product'>OpenStack Compute</entry>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <entry name='serial'>4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</entry>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <entry name='uuid'>4b2b0338-e64b-41eb-8902-3d7a95c6ffb1</entry>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <entry name='family'>Virtual Machine</entry>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <boot dev='hd'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <smbios mode='sysinfo'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <vmcoreinfo state='on'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <cpu mode='custom' match='exact' check='full'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <model fallback='forbid'>Nehalem</model>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <feature policy='require' name='x2apic'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <feature policy='require' name='hypervisor'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <feature policy='require' name='vme'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <clock offset='utc'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <timer name='pit' tickpolicy='delay'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <timer name='hpet' present='no'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <on_poweroff>destroy</on_poweroff>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <on_reboot>restart</on_reboot>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <on_crash>destroy</on_crash>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <disk type='file' device='disk'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk' index='2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <backingStore type='file' index='3'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:        <format type='raw'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:        <source file='/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:        <backingStore/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      </backingStore>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target dev='vda' bus='virtio'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='virtio-disk0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <disk type='file' device='cdrom'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <driver name='qemu' type='raw' cache='none'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <source file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/disk.config' index='1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <backingStore/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target dev='sda' bus='sata'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <readonly/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='sata0-0-0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='0' model='pcie-root'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pcie.0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='1' port='0x10'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='2' port='0x11'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='3' port='0x12'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.3'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='4' port='0x13'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.4'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='5' port='0x14'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.5'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='6' port='0x15'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.6'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='7' port='0x16'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.7'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='8' port='0x17'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.8'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='9' port='0x18'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.9'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='10' port='0x19'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.10'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='11' port='0x1a'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.11'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='12' port='0x1b'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.12'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='13' port='0x1c'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.13'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='14' port='0x1d'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.14'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='15' port='0x1e'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.15'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='16' port='0x1f'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.16'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='17' port='0x20'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.17'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='18' port='0x21'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.18'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='19' port='0x22'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.19'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='20' port='0x23'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.20'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='21' port='0x24'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.21'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='22' port='0x25'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.22'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='23' port='0x26'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.23'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='24' port='0x27'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.24'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-root-port'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target chassis='25' port='0x28'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.25'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model name='pcie-pci-bridge'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='pci.26'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='usb'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <controller type='sata' index='0'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='ide'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </controller>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <interface type='ethernet'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <mac address='fa:16:3e:0e:af:b2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target dev='tapd8ac1c56-cb'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model type='virtio'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <driver name='vhost' rx_queue_size='512'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <mtu size='1442'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='net0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <serial type='pty'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <source path='/dev/pts/1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/console.log' append='off'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target type='isa-serial' port='0'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:        <model name='isa-serial'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      </target>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <console type='pty' tty='/dev/pts/1'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <source path='/dev/pts/1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <log file='/var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1/console.log' append='off'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <target type='serial' port='0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='serial0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </console>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <input type='tablet' bus='usb'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='input0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='usb' bus='0' port='1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <input type='mouse' bus='ps2'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='input1'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <input type='keyboard' bus='ps2'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='input2'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </input>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <listen type='address' address='::0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </graphics>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <audio id='1' type='none'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <model type='virtio' heads='1' primary='yes'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='video0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <watchdog model='itco' action='reset'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='watchdog0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </watchdog>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <memballoon model='virtio'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <stats period='10'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='balloon0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <rng model='virtio'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <backend model='random'>/dev/urandom</backend>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <alias name='rng0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <label>system_u:system_r:svirt_t:s0:c488,c547</label>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c488,c547</imagelabel>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <label>+107:+107</label>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <imagelabel>+107:+107</imagelabel>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </seclabel>
Oct  2 08:33:09 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:33:09 np0005466012 nova_compute[192063]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.860 2 WARNING nova.virt.libvirt.driver [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Detaching interface fa:16:3e:45:ec:45 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap41116686-52' not found.#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.861 2 DEBUG nova.virt.libvirt.vif [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-812719307',display_name='tempest-TestNetworkBasicOps-server-812719307',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-812719307',id=144,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPzOvNpPXvTmQKsObLW2frDaB/UcqyWCcFsS1qvDRXz/cgV1ShZV9OTafu7O/eewVEDnWETTAhCAil29eW434+g1I5APLU43WWEGvJbWxhqswixggtI5hb5OpFrphy6Etg==',key_name='tempest-TestNetworkBasicOps-865732227',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-kot4hpaa',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:32:38Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=4b2b0338-e64b-41eb-8902-3d7a95c6ffb1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.861 2 DEBUG nova.network.os_vif_util [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converting VIF {"id": "41116686-5286-4561-953c-ea09c7785a04", "address": "fa:16:3e:45:ec:45", "network": {"id": "44d3fec7-3557-4638-b491-fade5377689b", "bridge": "br-int", "label": "tempest-network-smoke--1750158549", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41116686-52", "ovs_interfaceid": "41116686-5286-4561-953c-ea09c7785a04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.862 2 DEBUG nova.network.os_vif_util [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:ec:45,bridge_name='br-int',has_traffic_filtering=True,id=41116686-5286-4561-953c-ea09c7785a04,network=Network(44d3fec7-3557-4638-b491-fade5377689b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41116686-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.863 2 DEBUG os_vif [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:ec:45,bridge_name='br-int',has_traffic_filtering=True,id=41116686-5286-4561-953c-ea09c7785a04,network=Network(44d3fec7-3557-4638-b491-fade5377689b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41116686-52') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.865 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41116686-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.865 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.867 2 INFO os_vif [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:ec:45,bridge_name='br-int',has_traffic_filtering=True,id=41116686-5286-4561-953c-ea09c7785a04,network=Network(44d3fec7-3557-4638-b491-fade5377689b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41116686-52')#033[00m
Oct  2 08:33:09 np0005466012 nova_compute[192063]: 2025-10-02 12:33:09.868 2 DEBUG nova.virt.libvirt.guest [req-205e6dee-36b5-43b1-bd57-514f8f3facd3 req-3444e31c-12c8-4bd1-aaca-62c4697e99d0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:name>tempest-TestNetworkBasicOps-server-812719307</nova:name>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:creationTime>2025-10-02 12:33:09</nova:creationTime>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:flavor name="m1.nano">
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:memory>128</nova:memory>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:disk>1</nova:disk>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:swap>0</nova:swap>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:vcpus>1</nova:vcpus>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </nova:flavor>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:owner>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </nova:owner>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  <nova:ports>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    <nova:port uuid="d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef">
Oct  2 08:33:09 np0005466012 nova_compute[192063]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:    </nova:port>
Oct  2 08:33:09 np0005466012 nova_compute[192063]:  </nova:ports>
Oct  2 08:33:09 np0005466012 nova_compute[192063]: </nova:instance>
Oct  2 08:33:09 np0005466012 nova_compute[192063]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  2 08:33:10 np0005466012 nova_compute[192063]: 2025-10-02 12:33:10.939 2 DEBUG nova.compute.manager [req-8519a6e4-6aaa-4157-9737-730d9f8b47f8 req-2a811a20-dade-46f5-bd2d-b18cc1711180 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-vif-plugged-41116686-5286-4561-953c-ea09c7785a04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:10 np0005466012 nova_compute[192063]: 2025-10-02 12:33:10.939 2 DEBUG oslo_concurrency.lockutils [req-8519a6e4-6aaa-4157-9737-730d9f8b47f8 req-2a811a20-dade-46f5-bd2d-b18cc1711180 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:10 np0005466012 nova_compute[192063]: 2025-10-02 12:33:10.940 2 DEBUG oslo_concurrency.lockutils [req-8519a6e4-6aaa-4157-9737-730d9f8b47f8 req-2a811a20-dade-46f5-bd2d-b18cc1711180 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:10 np0005466012 nova_compute[192063]: 2025-10-02 12:33:10.940 2 DEBUG oslo_concurrency.lockutils [req-8519a6e4-6aaa-4157-9737-730d9f8b47f8 req-2a811a20-dade-46f5-bd2d-b18cc1711180 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:10 np0005466012 nova_compute[192063]: 2025-10-02 12:33:10.940 2 DEBUG nova.compute.manager [req-8519a6e4-6aaa-4157-9737-730d9f8b47f8 req-2a811a20-dade-46f5-bd2d-b18cc1711180 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] No waiting events found dispatching network-vif-plugged-41116686-5286-4561-953c-ea09c7785a04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:10 np0005466012 nova_compute[192063]: 2025-10-02 12:33:10.941 2 WARNING nova.compute.manager [req-8519a6e4-6aaa-4157-9737-730d9f8b47f8 req-2a811a20-dade-46f5-bd2d-b18cc1711180 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received unexpected event network-vif-plugged-41116686-5286-4561-953c-ea09c7785a04 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:33:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:11.087 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:11.088 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:33:11 np0005466012 nova_compute[192063]: 2025-10-02 12:33:11.098 2 INFO nova.network.neutron [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Port 41116686-5286-4561-953c-ea09c7785a04 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  2 08:33:11 np0005466012 nova_compute[192063]: 2025-10-02 12:33:11.098 2 DEBUG nova.network.neutron [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Updating instance_info_cache with network_info: [{"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:11 np0005466012 nova_compute[192063]: 2025-10-02 12:33:11.124 2 DEBUG oslo_concurrency.lockutils [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Releasing lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:11 np0005466012 nova_compute[192063]: 2025-10-02 12:33:11.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:11 np0005466012 nova_compute[192063]: 2025-10-02 12:33:11.155 2 DEBUG oslo_concurrency.lockutils [None req-5c0e3dde-89d0-4546-8e91-4bdbf719a98c a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "interface-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-41116686-5286-4561-953c-ea09c7785a04" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.813s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:11Z|00583|binding|INFO|Releasing lport f289bd59-801e-4956-8d1d-588879a7fa08 from this chassis (sb_readonly=0)
Oct  2 08:33:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:11Z|00584|binding|INFO|Releasing lport c198cb2e-a850-46e4-8295-a2f9c280ee53 from this chassis (sb_readonly=0)
Oct  2 08:33:11 np0005466012 nova_compute[192063]: 2025-10-02 12:33:11.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 podman[244323]: 2025-10-02 12:33:12.155130357 +0000 UTC m=+0.065759866 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:33:12 np0005466012 podman[244324]: 2025-10-02 12:33:12.224904304 +0000 UTC m=+0.123852079 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.349 2 DEBUG oslo_concurrency.lockutils [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "9e9408e2-5973-4e12-b904-711ea96bfbb2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.350 2 DEBUG oslo_concurrency.lockutils [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.350 2 DEBUG oslo_concurrency.lockutils [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.350 2 DEBUG oslo_concurrency.lockutils [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.350 2 DEBUG oslo_concurrency.lockutils [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.362 2 INFO nova.compute.manager [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Terminating instance#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.376 2 DEBUG nova.compute.manager [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:33:12 np0005466012 kernel: tap16f2cb9f-30 (unregistering): left promiscuous mode
Oct  2 08:33:12 np0005466012 NetworkManager[51207]: <info>  [1759408392.4077] device (tap16f2cb9f-30): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:33:12 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:12Z|00585|binding|INFO|Releasing lport 16f2cb9f-304c-467e-aee4-6fb2ab771415 from this chassis (sb_readonly=0)
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:12Z|00586|binding|INFO|Setting lport 16f2cb9f-304c-467e-aee4-6fb2ab771415 down in Southbound
Oct  2 08:33:12 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:12Z|00587|binding|INFO|Removing iface tap16f2cb9f-30 ovn-installed in OVS
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.428 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:5a:b4 10.100.0.8'], port_security=['fa:16:3e:2a:5a:b4 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9e9408e2-5973-4e12-b904-711ea96bfbb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a4a7099974504a798e1607c8e6a1f570', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99e51855-93ef-45a8-a4a3-2b0a8aec1882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=498d5b4e-c711-4633-9705-7db30a0fb056, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=16f2cb9f-304c-467e-aee4-6fb2ab771415) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.429 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 16f2cb9f-304c-467e-aee4-6fb2ab771415 in datapath 1acf42c5-084c-4cc4-bdc5-910eec0249e3 unbound from our chassis#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.430 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1acf42c5-084c-4cc4-bdc5-910eec0249e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.431 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8a5e0bdb-d0b6-40cc-bc23-ae9f9380396b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.432 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 namespace which is not needed anymore#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.454 2 DEBUG nova.compute.manager [req-6a9e0b13-c007-494e-b193-1124f4f726cf req-b3bd21c4-2cb7-4923-bd43-ce6cdf972ba6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-changed-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.454 2 DEBUG nova.compute.manager [req-6a9e0b13-c007-494e-b193-1124f4f726cf req-b3bd21c4-2cb7-4923-bd43-ce6cdf972ba6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Refreshing instance network info cache due to event network-changed-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.455 2 DEBUG oslo_concurrency.lockutils [req-6a9e0b13-c007-494e-b193-1124f4f726cf req-b3bd21c4-2cb7-4923-bd43-ce6cdf972ba6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.455 2 DEBUG oslo_concurrency.lockutils [req-6a9e0b13-c007-494e-b193-1124f4f726cf req-b3bd21c4-2cb7-4923-bd43-ce6cdf972ba6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.456 2 DEBUG nova.network.neutron [req-6a9e0b13-c007-494e-b193-1124f4f726cf req-b3bd21c4-2cb7-4923-bd43-ce6cdf972ba6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Refreshing network info cache for port d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:33:12 np0005466012 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000092.scope: Deactivated successfully.
Oct  2 08:33:12 np0005466012 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000092.scope: Consumed 13.631s CPU time.
Oct  2 08:33:12 np0005466012 systemd-machined[152114]: Machine qemu-68-instance-00000092 terminated.
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244080]: [NOTICE]   (244084) : haproxy version is 2.8.14-c23fe91
Oct  2 08:33:12 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244080]: [NOTICE]   (244084) : path to executable is /usr/sbin/haproxy
Oct  2 08:33:12 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244080]: [WARNING]  (244084) : Exiting Master process...
Oct  2 08:33:12 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244080]: [ALERT]    (244084) : Current worker (244086) exited with code 143 (Terminated)
Oct  2 08:33:12 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244080]: [WARNING]  (244084) : All workers exited. Exiting... (0)
Oct  2 08:33:12 np0005466012 systemd[1]: libpod-f4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27.scope: Deactivated successfully.
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 podman[244398]: 2025-10-02 12:33:12.604247556 +0000 UTC m=+0.056042697 container died f4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27-userdata-shm.mount: Deactivated successfully.
Oct  2 08:33:12 np0005466012 systemd[1]: var-lib-containers-storage-overlay-89ce635e3935eff182bb0d19ef8ae53a4327ec765bece1fc0a32e3b9041a78df-merged.mount: Deactivated successfully.
Oct  2 08:33:12 np0005466012 podman[244398]: 2025-10-02 12:33:12.644876934 +0000 UTC m=+0.096672105 container cleanup f4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.654 2 INFO nova.virt.libvirt.driver [-] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Instance destroyed successfully.#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.654 2 DEBUG nova.objects.instance [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'resources' on Instance uuid 9e9408e2-5973-4e12-b904-711ea96bfbb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.669 2 DEBUG nova.virt.libvirt.vif [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1562978230',display_name='tempest-ServersTestJSON-server-1562978230',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1562978230',id=146,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-zygia2a5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:32:56Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=9e9408e2-5973-4e12-b904-711ea96bfbb2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "address": "fa:16:3e:2a:5a:b4", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16f2cb9f-30", "ovs_interfaceid": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.670 2 DEBUG nova.network.os_vif_util [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "address": "fa:16:3e:2a:5a:b4", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16f2cb9f-30", "ovs_interfaceid": "16f2cb9f-304c-467e-aee4-6fb2ab771415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.671 2 DEBUG nova.network.os_vif_util [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5a:b4,bridge_name='br-int',has_traffic_filtering=True,id=16f2cb9f-304c-467e-aee4-6fb2ab771415,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16f2cb9f-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.671 2 DEBUG os_vif [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5a:b4,bridge_name='br-int',has_traffic_filtering=True,id=16f2cb9f-304c-467e-aee4-6fb2ab771415,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16f2cb9f-30') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.673 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16f2cb9f-30, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.682 2 INFO os_vif [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5a:b4,bridge_name='br-int',has_traffic_filtering=True,id=16f2cb9f-304c-467e-aee4-6fb2ab771415,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16f2cb9f-30')#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.683 2 INFO nova.virt.libvirt.driver [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Deleting instance files /var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2_del#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.684 2 INFO nova.virt.libvirt.driver [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Deletion of /var/lib/nova/instances/9e9408e2-5973-4e12-b904-711ea96bfbb2_del complete#033[00m
Oct  2 08:33:12 np0005466012 systemd[1]: libpod-conmon-f4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27.scope: Deactivated successfully.
Oct  2 08:33:12 np0005466012 podman[244443]: 2025-10-02 12:33:12.73120352 +0000 UTC m=+0.044529617 container remove f4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.737 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[98ace291-2f12-4077-b872-5f89200b34d9]: (4, ('Thu Oct  2 12:33:12 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 (f4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27)\nf4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27\nThu Oct  2 12:33:12 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 (f4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27)\nf4d62410fe4759dcb8771ff3e0961c0cbe25be57ab465f284f5a1c038f9ffd27\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.739 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bc71a6be-1fac-4782-b91b-8575c76b92f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.739 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1acf42c5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 kernel: tap1acf42c5-00: left promiscuous mode
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.748 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[10124b6d-0ae0-40bd-af93-6f2d54f72005]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.758 2 INFO nova.compute.manager [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.760 2 DEBUG oslo.service.loopingcall [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.762 2 DEBUG nova.compute.manager [-] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.762 2 DEBUG nova.network.neutron [-] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.772 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8f6d3e6f-a514-4ebf-ac6d-21505018d9f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.773 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d397dffb-0573-4329-9148-44067237c674]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.793 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[42cbcfab-55f2-4502-a85f-9a2413e67d37]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 636873, 'reachable_time': 16163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244458, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:12 np0005466012 systemd[1]: run-netns-ovnmeta\x2d1acf42c5\x2d084c\x2d4cc4\x2dbdc5\x2d910eec0249e3.mount: Deactivated successfully.
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.795 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.796 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[8847334d-4c53-482e-a514-c733378d7af8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.806 2 DEBUG oslo_concurrency.lockutils [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.806 2 DEBUG oslo_concurrency.lockutils [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.806 2 DEBUG oslo_concurrency.lockutils [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.807 2 DEBUG oslo_concurrency.lockutils [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.807 2 DEBUG oslo_concurrency.lockutils [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.818 2 INFO nova.compute.manager [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Terminating instance#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.830 2 DEBUG nova.compute.manager [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:33:12 np0005466012 kernel: tapd8ac1c56-cb (unregistering): left promiscuous mode
Oct  2 08:33:12 np0005466012 NetworkManager[51207]: <info>  [1759408392.8601] device (tapd8ac1c56-cb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:12Z|00588|binding|INFO|Releasing lport d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef from this chassis (sb_readonly=0)
Oct  2 08:33:12 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:12Z|00589|binding|INFO|Setting lport d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef down in Southbound
Oct  2 08:33:12 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:12Z|00590|binding|INFO|Removing iface tapd8ac1c56-cb ovn-installed in OVS
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:12Z|00591|binding|INFO|Releasing lport f289bd59-801e-4956-8d1d-588879a7fa08 from this chassis (sb_readonly=0)
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.880 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:af:b2 10.100.0.12'], port_security=['fa:16:3e:0e:af:b2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4b2b0338-e64b-41eb-8902-3d7a95c6ffb1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85403d18-6694-4dbd-a0e0-84ca3f268b89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b5ef964e-9316-47d6-a3f7-a6731e9e6be2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd8169e5-966f-4f69-9e79-03a4ed7aea2e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.881 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef in datapath 85403d18-6694-4dbd-a0e0-84ca3f268b89 unbound from our chassis#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.882 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 85403d18-6694-4dbd-a0e0-84ca3f268b89, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.883 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[607444f5-dae4-4e80-8244-82d2fb9a20ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:12.884 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89 namespace which is not needed anymore#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 nova_compute[192063]: 2025-10-02 12:33:12.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:12 np0005466012 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000090.scope: Deactivated successfully.
Oct  2 08:33:12 np0005466012 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000090.scope: Consumed 14.267s CPU time.
Oct  2 08:33:12 np0005466012 systemd-machined[152114]: Machine qemu-67-instance-00000090 terminated.
Oct  2 08:33:13 np0005466012 neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89[243716]: [NOTICE]   (243720) : haproxy version is 2.8.14-c23fe91
Oct  2 08:33:13 np0005466012 neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89[243716]: [NOTICE]   (243720) : path to executable is /usr/sbin/haproxy
Oct  2 08:33:13 np0005466012 neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89[243716]: [WARNING]  (243720) : Exiting Master process...
Oct  2 08:33:13 np0005466012 neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89[243716]: [ALERT]    (243720) : Current worker (243722) exited with code 143 (Terminated)
Oct  2 08:33:13 np0005466012 neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89[243716]: [WARNING]  (243720) : All workers exited. Exiting... (0)
Oct  2 08:33:13 np0005466012 systemd[1]: libpod-cafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03.scope: Deactivated successfully.
Oct  2 08:33:13 np0005466012 NetworkManager[51207]: <info>  [1759408393.0537] manager: (tapd8ac1c56-cb): new Tun device (/org/freedesktop/NetworkManager/Devices/270)
Oct  2 08:33:13 np0005466012 podman[244481]: 2025-10-02 12:33:13.056320596 +0000 UTC m=+0.060750938 container died cafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:33:13 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:13Z|00592|binding|INFO|Releasing lport f289bd59-801e-4956-8d1d-588879a7fa08 from this chassis (sb_readonly=0)
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:13 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03-userdata-shm.mount: Deactivated successfully.
Oct  2 08:33:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:13.090 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:13 np0005466012 systemd[1]: var-lib-containers-storage-overlay-7a04e56565efa1d444148fb9906b1a4afa5bb3f749734ad3acab5bcfd7a546b8-merged.mount: Deactivated successfully.
Oct  2 08:33:13 np0005466012 podman[244481]: 2025-10-02 12:33:13.095941495 +0000 UTC m=+0.100371847 container cleanup cafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS)
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.103 2 INFO nova.virt.libvirt.driver [-] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Instance destroyed successfully.#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.103 2 DEBUG nova.objects.instance [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'resources' on Instance uuid 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:13 np0005466012 systemd[1]: libpod-conmon-cafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03.scope: Deactivated successfully.
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.120 2 DEBUG nova.virt.libvirt.vif [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:32:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-812719307',display_name='tempest-TestNetworkBasicOps-server-812719307',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-812719307',id=144,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPzOvNpPXvTmQKsObLW2frDaB/UcqyWCcFsS1qvDRXz/cgV1ShZV9OTafu7O/eewVEDnWETTAhCAil29eW434+g1I5APLU43WWEGvJbWxhqswixggtI5hb5OpFrphy6Etg==',key_name='tempest-TestNetworkBasicOps-865732227',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:32:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-kot4hpaa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:32:38Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=4b2b0338-e64b-41eb-8902-3d7a95c6ffb1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.120 2 DEBUG nova.network.os_vif_util [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.121 2 DEBUG nova.network.os_vif_util [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0e:af:b2,bridge_name='br-int',has_traffic_filtering=True,id=d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef,network=Network(85403d18-6694-4dbd-a0e0-84ca3f268b89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8ac1c56-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.121 2 DEBUG os_vif [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:af:b2,bridge_name='br-int',has_traffic_filtering=True,id=d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef,network=Network(85403d18-6694-4dbd-a0e0-84ca3f268b89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8ac1c56-cb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.122 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8ac1c56-cb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.128 2 INFO os_vif [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:af:b2,bridge_name='br-int',has_traffic_filtering=True,id=d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef,network=Network(85403d18-6694-4dbd-a0e0-84ca3f268b89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8ac1c56-cb')#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.128 2 INFO nova.virt.libvirt.driver [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Deleting instance files /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1_del#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.129 2 INFO nova.virt.libvirt.driver [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Deletion of /var/lib/nova/instances/4b2b0338-e64b-41eb-8902-3d7a95c6ffb1_del complete#033[00m
Oct  2 08:33:13 np0005466012 podman[244529]: 2025-10-02 12:33:13.175580247 +0000 UTC m=+0.053441325 container remove cafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:33:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:13.183 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a5ae4c9a-defb-4cd4-9b7e-22426732b993]: (4, ('Thu Oct  2 12:33:12 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89 (cafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03)\ncafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03\nThu Oct  2 12:33:13 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89 (cafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03)\ncafdde712d336d6edb65cdf91ea0da613c4293d1f755fef92f486f350aca3d03\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:13.185 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[41c15d28-4d4e-4a90-b7ac-51c77647cc32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:13.186 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85403d18-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:13 np0005466012 kernel: tap85403d18-60: left promiscuous mode
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:13.209 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[22b11bdf-6eea-4f50-8261-8578c6e24c05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.246 2 INFO nova.compute.manager [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.246 2 DEBUG oslo.service.loopingcall [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.247 2 DEBUG nova.compute.manager [-] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.247 2 DEBUG nova.network.neutron [-] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:33:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:13.257 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ea6da0b5-49f0-4f8b-90a3-4c3206f69678]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:13.258 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e1f769-7340-4405-9c4b-548055c45155]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:13.286 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[635f62a1-c6c2-4f06-a4c6-c59ac40eed48]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634486, 'reachable_time': 15777, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244544, 'error': None, 'target': 'ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:13.288 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-85403d18-6694-4dbd-a0e0-84ca3f268b89 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:33:13 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:13.288 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[824b0b40-4aca-4fa2-9def-7d6865bffd06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.594 2 DEBUG nova.network.neutron [-] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.626 2 INFO nova.compute.manager [-] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Took 0.86 seconds to deallocate network for instance.#033[00m
Oct  2 08:33:13 np0005466012 systemd[1]: run-netns-ovnmeta\x2d85403d18\x2d6694\x2d4dbd\x2da0e0\x2d84ca3f268b89.mount: Deactivated successfully.
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.690 2 DEBUG nova.compute.manager [req-12fb7cf8-04d7-4d79-adb4-7382d5a7f978 req-ed0b0452-647d-4a3b-8ffa-585823bc06cc 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Received event network-vif-deleted-16f2cb9f-304c-467e-aee4-6fb2ab771415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.729 2 DEBUG nova.network.neutron [req-6a9e0b13-c007-494e-b193-1124f4f726cf req-b3bd21c4-2cb7-4923-bd43-ce6cdf972ba6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Updated VIF entry in instance network info cache for port d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.729 2 DEBUG nova.network.neutron [req-6a9e0b13-c007-494e-b193-1124f4f726cf req-b3bd21c4-2cb7-4923-bd43-ce6cdf972ba6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Updating instance_info_cache with network_info: [{"id": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "address": "fa:16:3e:0e:af:b2", "network": {"id": "85403d18-6694-4dbd-a0e0-84ca3f268b89", "bridge": "br-int", "label": "tempest-network-smoke--1183012299", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8ac1c56-cb", "ovs_interfaceid": "d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.766 2 DEBUG oslo_concurrency.lockutils [req-6a9e0b13-c007-494e-b193-1124f4f726cf req-b3bd21c4-2cb7-4923-bd43-ce6cdf972ba6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.821 2 DEBUG nova.compute.manager [req-02b2223b-714f-4ff1-be35-adf51b7690fb req-5d3ed3e4-fbcf-44e1-9bc0-47563d37fdb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Received event network-vif-unplugged-16f2cb9f-304c-467e-aee4-6fb2ab771415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.821 2 DEBUG oslo_concurrency.lockutils [req-02b2223b-714f-4ff1-be35-adf51b7690fb req-5d3ed3e4-fbcf-44e1-9bc0-47563d37fdb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.822 2 DEBUG oslo_concurrency.lockutils [req-02b2223b-714f-4ff1-be35-adf51b7690fb req-5d3ed3e4-fbcf-44e1-9bc0-47563d37fdb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.822 2 DEBUG oslo_concurrency.lockutils [req-02b2223b-714f-4ff1-be35-adf51b7690fb req-5d3ed3e4-fbcf-44e1-9bc0-47563d37fdb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.822 2 DEBUG nova.compute.manager [req-02b2223b-714f-4ff1-be35-adf51b7690fb req-5d3ed3e4-fbcf-44e1-9bc0-47563d37fdb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] No waiting events found dispatching network-vif-unplugged-16f2cb9f-304c-467e-aee4-6fb2ab771415 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.822 2 WARNING nova.compute.manager [req-02b2223b-714f-4ff1-be35-adf51b7690fb req-5d3ed3e4-fbcf-44e1-9bc0-47563d37fdb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Received unexpected event network-vif-unplugged-16f2cb9f-304c-467e-aee4-6fb2ab771415 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.823 2 DEBUG nova.compute.manager [req-02b2223b-714f-4ff1-be35-adf51b7690fb req-5d3ed3e4-fbcf-44e1-9bc0-47563d37fdb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Received event network-vif-plugged-16f2cb9f-304c-467e-aee4-6fb2ab771415 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.823 2 DEBUG oslo_concurrency.lockutils [req-02b2223b-714f-4ff1-be35-adf51b7690fb req-5d3ed3e4-fbcf-44e1-9bc0-47563d37fdb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.823 2 DEBUG oslo_concurrency.lockutils [req-02b2223b-714f-4ff1-be35-adf51b7690fb req-5d3ed3e4-fbcf-44e1-9bc0-47563d37fdb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.823 2 DEBUG oslo_concurrency.lockutils [req-02b2223b-714f-4ff1-be35-adf51b7690fb req-5d3ed3e4-fbcf-44e1-9bc0-47563d37fdb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.824 2 DEBUG nova.compute.manager [req-02b2223b-714f-4ff1-be35-adf51b7690fb req-5d3ed3e4-fbcf-44e1-9bc0-47563d37fdb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] No waiting events found dispatching network-vif-plugged-16f2cb9f-304c-467e-aee4-6fb2ab771415 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.824 2 WARNING nova.compute.manager [req-02b2223b-714f-4ff1-be35-adf51b7690fb req-5d3ed3e4-fbcf-44e1-9bc0-47563d37fdb8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Received unexpected event network-vif-plugged-16f2cb9f-304c-467e-aee4-6fb2ab771415 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.826 2 DEBUG oslo_concurrency.lockutils [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.827 2 DEBUG oslo_concurrency.lockutils [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.888 2 DEBUG nova.compute.provider_tree [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.908 2 DEBUG nova.scheduler.client.report [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:33:13 np0005466012 nova_compute[192063]: 2025-10-02 12:33:13.946 2 DEBUG oslo_concurrency.lockutils [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:14 np0005466012 nova_compute[192063]: 2025-10-02 12:33:14.036 2 INFO nova.scheduler.client.report [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Deleted allocations for instance 9e9408e2-5973-4e12-b904-711ea96bfbb2#033[00m
Oct  2 08:33:14 np0005466012 nova_compute[192063]: 2025-10-02 12:33:14.120 2 DEBUG nova.network.neutron [-] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:14 np0005466012 nova_compute[192063]: 2025-10-02 12:33:14.179 2 INFO nova.compute.manager [-] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Took 0.93 seconds to deallocate network for instance.#033[00m
Oct  2 08:33:14 np0005466012 nova_compute[192063]: 2025-10-02 12:33:14.248 2 DEBUG oslo_concurrency.lockutils [None req-233333d0-1bb3-4e9d-908f-b7c0a45e21b9 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9e9408e2-5973-4e12-b904-711ea96bfbb2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.898s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:14 np0005466012 nova_compute[192063]: 2025-10-02 12:33:14.344 2 DEBUG oslo_concurrency.lockutils [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:14 np0005466012 nova_compute[192063]: 2025-10-02 12:33:14.344 2 DEBUG oslo_concurrency.lockutils [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:14 np0005466012 nova_compute[192063]: 2025-10-02 12:33:14.394 2 DEBUG nova.compute.provider_tree [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:33:14 np0005466012 nova_compute[192063]: 2025-10-02 12:33:14.410 2 DEBUG nova.scheduler.client.report [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:33:14 np0005466012 nova_compute[192063]: 2025-10-02 12:33:14.436 2 DEBUG oslo_concurrency.lockutils [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:14 np0005466012 nova_compute[192063]: 2025-10-02 12:33:14.471 2 INFO nova.scheduler.client.report [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Deleted allocations for instance 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1#033[00m
Oct  2 08:33:14 np0005466012 nova_compute[192063]: 2025-10-02 12:33:14.568 2 DEBUG oslo_concurrency.lockutils [None req-ebbc4f21-9942-4b04-acbf-a4a0c85b3fd3 a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.762s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:15 np0005466012 podman[244546]: 2025-10-02 12:33:15.158559476 +0000 UTC m=+0.067465365 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:33:15 np0005466012 podman[244545]: 2025-10-02 12:33:15.168698827 +0000 UTC m=+0.079398646 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:33:15 np0005466012 nova_compute[192063]: 2025-10-02 12:33:15.185 2 DEBUG nova.compute.manager [req-b30401f5-b5c9-4416-8c30-a16e720618bf req-6b1ea9f5-d622-4030-855e-e5fc08af1194 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-vif-unplugged-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:15 np0005466012 nova_compute[192063]: 2025-10-02 12:33:15.186 2 DEBUG oslo_concurrency.lockutils [req-b30401f5-b5c9-4416-8c30-a16e720618bf req-6b1ea9f5-d622-4030-855e-e5fc08af1194 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:15 np0005466012 nova_compute[192063]: 2025-10-02 12:33:15.186 2 DEBUG oslo_concurrency.lockutils [req-b30401f5-b5c9-4416-8c30-a16e720618bf req-6b1ea9f5-d622-4030-855e-e5fc08af1194 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:15 np0005466012 nova_compute[192063]: 2025-10-02 12:33:15.186 2 DEBUG oslo_concurrency.lockutils [req-b30401f5-b5c9-4416-8c30-a16e720618bf req-6b1ea9f5-d622-4030-855e-e5fc08af1194 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:15 np0005466012 nova_compute[192063]: 2025-10-02 12:33:15.186 2 DEBUG nova.compute.manager [req-b30401f5-b5c9-4416-8c30-a16e720618bf req-6b1ea9f5-d622-4030-855e-e5fc08af1194 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] No waiting events found dispatching network-vif-unplugged-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:15 np0005466012 nova_compute[192063]: 2025-10-02 12:33:15.187 2 WARNING nova.compute.manager [req-b30401f5-b5c9-4416-8c30-a16e720618bf req-6b1ea9f5-d622-4030-855e-e5fc08af1194 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received unexpected event network-vif-unplugged-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:33:15 np0005466012 nova_compute[192063]: 2025-10-02 12:33:15.187 2 DEBUG nova.compute.manager [req-b30401f5-b5c9-4416-8c30-a16e720618bf req-6b1ea9f5-d622-4030-855e-e5fc08af1194 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-vif-plugged-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:15 np0005466012 nova_compute[192063]: 2025-10-02 12:33:15.187 2 DEBUG oslo_concurrency.lockutils [req-b30401f5-b5c9-4416-8c30-a16e720618bf req-6b1ea9f5-d622-4030-855e-e5fc08af1194 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:15 np0005466012 nova_compute[192063]: 2025-10-02 12:33:15.187 2 DEBUG oslo_concurrency.lockutils [req-b30401f5-b5c9-4416-8c30-a16e720618bf req-6b1ea9f5-d622-4030-855e-e5fc08af1194 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:15 np0005466012 nova_compute[192063]: 2025-10-02 12:33:15.187 2 DEBUG oslo_concurrency.lockutils [req-b30401f5-b5c9-4416-8c30-a16e720618bf req-6b1ea9f5-d622-4030-855e-e5fc08af1194 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "4b2b0338-e64b-41eb-8902-3d7a95c6ffb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:15 np0005466012 nova_compute[192063]: 2025-10-02 12:33:15.187 2 DEBUG nova.compute.manager [req-b30401f5-b5c9-4416-8c30-a16e720618bf req-6b1ea9f5-d622-4030-855e-e5fc08af1194 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] No waiting events found dispatching network-vif-plugged-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:15 np0005466012 nova_compute[192063]: 2025-10-02 12:33:15.188 2 WARNING nova.compute.manager [req-b30401f5-b5c9-4416-8c30-a16e720618bf req-6b1ea9f5-d622-4030-855e-e5fc08af1194 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received unexpected event network-vif-plugged-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:33:15 np0005466012 nova_compute[192063]: 2025-10-02 12:33:15.188 2 DEBUG nova.compute.manager [req-b30401f5-b5c9-4416-8c30-a16e720618bf req-6b1ea9f5-d622-4030-855e-e5fc08af1194 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Received event network-vif-deleted-d8ac1c56-cba0-4c9f-a1b8-fcde20ac0aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.926 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.926 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.926 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.926 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.926 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:33:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:33:17 np0005466012 nova_compute[192063]: 2025-10-02 12:33:17.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:18 np0005466012 nova_compute[192063]: 2025-10-02 12:33:18.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.504 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.504 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.526 2 DEBUG nova.compute.manager [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.651 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.652 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.660 2 DEBUG nova.virt.hardware [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.661 2 INFO nova.compute.claims [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.793 2 DEBUG nova.compute.provider_tree [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.809 2 DEBUG nova.scheduler.client.report [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.834 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.835 2 DEBUG nova.compute.manager [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.939 2 DEBUG nova.compute.manager [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.939 2 DEBUG nova.network.neutron [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.955 2 INFO nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:33:20 np0005466012 nova_compute[192063]: 2025-10-02 12:33:20.973 2 DEBUG nova.compute.manager [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.134 2 DEBUG nova.compute.manager [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.136 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.136 2 INFO nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Creating image(s)#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.137 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "/var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.137 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "/var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.138 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "/var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.154 2 DEBUG oslo_concurrency.processutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.222 2 DEBUG oslo_concurrency.processutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.223 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.224 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.241 2 DEBUG oslo_concurrency.processutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.299 2 DEBUG oslo_concurrency.processutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.300 2 DEBUG oslo_concurrency.processutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.335 2 DEBUG oslo_concurrency.processutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.336 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.337 2 DEBUG oslo_concurrency.processutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.390 2 DEBUG oslo_concurrency.processutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.392 2 DEBUG nova.virt.disk.api [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Checking if we can resize image /var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.392 2 DEBUG oslo_concurrency.processutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.445 2 DEBUG oslo_concurrency.processutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.446 2 DEBUG nova.virt.disk.api [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Cannot resize image /var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.447 2 DEBUG nova.objects.instance [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'migration_context' on Instance uuid 9769ee2c-2b6d-451a-a99a-d5f5cb51d643 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.461 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.462 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Ensure instance console log exists: /var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.462 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.462 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.463 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:21 np0005466012 nova_compute[192063]: 2025-10-02 12:33:21.949 2 DEBUG nova.policy [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27daa263abb54d4d8e3ae34cd1c5ccf5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a4a7099974504a798e1607c8e6a1f570', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:33:22 np0005466012 nova_compute[192063]: 2025-10-02 12:33:22.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:23 np0005466012 nova_compute[192063]: 2025-10-02 12:33:23.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:23 np0005466012 nova_compute[192063]: 2025-10-02 12:33:23.143 2 DEBUG nova.network.neutron [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Successfully created port: 15f8dc0f-ac99-40ed-bab7-50fe9de13594 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:33:24 np0005466012 nova_compute[192063]: 2025-10-02 12:33:24.424 2 DEBUG nova.network.neutron [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Successfully updated port: 15f8dc0f-ac99-40ed-bab7-50fe9de13594 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:33:24 np0005466012 nova_compute[192063]: 2025-10-02 12:33:24.439 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "refresh_cache-9769ee2c-2b6d-451a-a99a-d5f5cb51d643" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:24 np0005466012 nova_compute[192063]: 2025-10-02 12:33:24.440 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquired lock "refresh_cache-9769ee2c-2b6d-451a-a99a-d5f5cb51d643" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:24 np0005466012 nova_compute[192063]: 2025-10-02 12:33:24.440 2 DEBUG nova.network.neutron [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:33:24 np0005466012 nova_compute[192063]: 2025-10-02 12:33:24.543 2 DEBUG nova.compute.manager [req-7a05492c-e543-44b7-ad6a-2d84531b1534 req-b40fef5c-8164-489f-9adb-338e290cf245 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Received event network-changed-15f8dc0f-ac99-40ed-bab7-50fe9de13594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:24 np0005466012 nova_compute[192063]: 2025-10-02 12:33:24.544 2 DEBUG nova.compute.manager [req-7a05492c-e543-44b7-ad6a-2d84531b1534 req-b40fef5c-8164-489f-9adb-338e290cf245 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Refreshing instance network info cache due to event network-changed-15f8dc0f-ac99-40ed-bab7-50fe9de13594. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:33:24 np0005466012 nova_compute[192063]: 2025-10-02 12:33:24.545 2 DEBUG oslo_concurrency.lockutils [req-7a05492c-e543-44b7-ad6a-2d84531b1534 req-b40fef5c-8164-489f-9adb-338e290cf245 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-9769ee2c-2b6d-451a-a99a-d5f5cb51d643" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:24 np0005466012 nova_compute[192063]: 2025-10-02 12:33:24.648 2 DEBUG nova.network.neutron [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.488 2 DEBUG nova.network.neutron [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Updating instance_info_cache with network_info: [{"id": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "address": "fa:16:3e:60:7f:2c", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15f8dc0f-ac", "ovs_interfaceid": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.608 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Releasing lock "refresh_cache-9769ee2c-2b6d-451a-a99a-d5f5cb51d643" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.608 2 DEBUG nova.compute.manager [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Instance network_info: |[{"id": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "address": "fa:16:3e:60:7f:2c", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15f8dc0f-ac", "ovs_interfaceid": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.609 2 DEBUG oslo_concurrency.lockutils [req-7a05492c-e543-44b7-ad6a-2d84531b1534 req-b40fef5c-8164-489f-9adb-338e290cf245 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-9769ee2c-2b6d-451a-a99a-d5f5cb51d643" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.609 2 DEBUG nova.network.neutron [req-7a05492c-e543-44b7-ad6a-2d84531b1534 req-b40fef5c-8164-489f-9adb-338e290cf245 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Refreshing network info cache for port 15f8dc0f-ac99-40ed-bab7-50fe9de13594 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.612 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Start _get_guest_xml network_info=[{"id": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "address": "fa:16:3e:60:7f:2c", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15f8dc0f-ac", "ovs_interfaceid": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.618 2 WARNING nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.622 2 DEBUG nova.virt.libvirt.host [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.623 2 DEBUG nova.virt.libvirt.host [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.627 2 DEBUG nova.virt.libvirt.host [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.628 2 DEBUG nova.virt.libvirt.host [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.629 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.629 2 DEBUG nova.virt.hardware [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.630 2 DEBUG nova.virt.hardware [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.630 2 DEBUG nova.virt.hardware [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.631 2 DEBUG nova.virt.hardware [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.631 2 DEBUG nova.virt.hardware [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.631 2 DEBUG nova.virt.hardware [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.632 2 DEBUG nova.virt.hardware [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.632 2 DEBUG nova.virt.hardware [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.633 2 DEBUG nova.virt.hardware [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.633 2 DEBUG nova.virt.hardware [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.635 2 DEBUG nova.virt.hardware [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.639 2 DEBUG nova.virt.libvirt.vif [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:33:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1761010569',display_name='tempest-ServersTestJSON-server-1761010569',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1761010569',id=148,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-19om0imj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:33:21Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=9769ee2c-2b6d-451a-a99a-d5f5cb51d643,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "address": "fa:16:3e:60:7f:2c", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15f8dc0f-ac", "ovs_interfaceid": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.640 2 DEBUG nova.network.os_vif_util [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "address": "fa:16:3e:60:7f:2c", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15f8dc0f-ac", "ovs_interfaceid": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.640 2 DEBUG nova.network.os_vif_util [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:7f:2c,bridge_name='br-int',has_traffic_filtering=True,id=15f8dc0f-ac99-40ed-bab7-50fe9de13594,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15f8dc0f-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.641 2 DEBUG nova.objects.instance [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9769ee2c-2b6d-451a-a99a-d5f5cb51d643 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.703 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  <uuid>9769ee2c-2b6d-451a-a99a-d5f5cb51d643</uuid>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  <name>instance-00000094</name>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <nova:name>tempest-ServersTestJSON-server-1761010569</nova:name>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:33:25</nova:creationTime>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:33:25 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:        <nova:user uuid="27daa263abb54d4d8e3ae34cd1c5ccf5">tempest-ServersTestJSON-1163535506-project-member</nova:user>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:        <nova:project uuid="a4a7099974504a798e1607c8e6a1f570">tempest-ServersTestJSON-1163535506</nova:project>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:        <nova:port uuid="15f8dc0f-ac99-40ed-bab7-50fe9de13594">
Oct  2 08:33:25 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <entry name="serial">9769ee2c-2b6d-451a-a99a-d5f5cb51d643</entry>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <entry name="uuid">9769ee2c-2b6d-451a-a99a-d5f5cb51d643</entry>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk.config"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:60:7f:2c"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <target dev="tap15f8dc0f-ac"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/console.log" append="off"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:33:25 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:33:25 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:33:25 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:33:25 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.704 2 DEBUG nova.compute.manager [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Preparing to wait for external event network-vif-plugged-15f8dc0f-ac99-40ed-bab7-50fe9de13594 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.704 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.704 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.704 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.705 2 DEBUG nova.virt.libvirt.vif [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:33:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1761010569',display_name='tempest-ServersTestJSON-server-1761010569',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1761010569',id=148,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-19om0imj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:33:21Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=9769ee2c-2b6d-451a-a99a-d5f5cb51d643,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "address": "fa:16:3e:60:7f:2c", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15f8dc0f-ac", "ovs_interfaceid": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.705 2 DEBUG nova.network.os_vif_util [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "address": "fa:16:3e:60:7f:2c", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15f8dc0f-ac", "ovs_interfaceid": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.706 2 DEBUG nova.network.os_vif_util [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:7f:2c,bridge_name='br-int',has_traffic_filtering=True,id=15f8dc0f-ac99-40ed-bab7-50fe9de13594,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15f8dc0f-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.706 2 DEBUG os_vif [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:7f:2c,bridge_name='br-int',has_traffic_filtering=True,id=15f8dc0f-ac99-40ed-bab7-50fe9de13594,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15f8dc0f-ac') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.707 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.707 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.710 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15f8dc0f-ac, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.710 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap15f8dc0f-ac, col_values=(('external_ids', {'iface-id': '15f8dc0f-ac99-40ed-bab7-50fe9de13594', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:7f:2c', 'vm-uuid': '9769ee2c-2b6d-451a-a99a-d5f5cb51d643'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:33:25 np0005466012 NetworkManager[51207]: <info>  [1759408405.7145] manager: (tap15f8dc0f-ac): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.718 2 INFO os_vif [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:7f:2c,bridge_name='br-int',has_traffic_filtering=True,id=15f8dc0f-ac99-40ed-bab7-50fe9de13594,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15f8dc0f-ac')#033[00m
Oct  2 08:33:25 np0005466012 podman[244601]: 2025-10-02 12:33:25.827662612 +0000 UTC m=+0.067673030 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Oct  2 08:33:25 np0005466012 podman[244602]: 2025-10-02 12:33:25.835637763 +0000 UTC m=+0.074981873 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter)
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.857 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.858 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.858 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] No VIF found with MAC fa:16:3e:60:7f:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:33:25 np0005466012 nova_compute[192063]: 2025-10-02 12:33:25.858 2 INFO nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Using config drive#033[00m
Oct  2 08:33:27 np0005466012 nova_compute[192063]: 2025-10-02 12:33:27.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:27 np0005466012 nova_compute[192063]: 2025-10-02 12:33:27.652 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408392.650676, 9e9408e2-5973-4e12-b904-711ea96bfbb2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:27 np0005466012 nova_compute[192063]: 2025-10-02 12:33:27.653 2 INFO nova.compute.manager [-] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:33:27 np0005466012 nova_compute[192063]: 2025-10-02 12:33:27.680 2 DEBUG nova.compute.manager [None req-e2814e8b-e4e7-453f-b7cd-8e515df9af1d - - - - - -] [instance: 9e9408e2-5973-4e12-b904-711ea96bfbb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:27 np0005466012 nova_compute[192063]: 2025-10-02 12:33:27.940 2 INFO nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Creating config drive at /var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk.config#033[00m
Oct  2 08:33:27 np0005466012 nova_compute[192063]: 2025-10-02 12:33:27.950 2 DEBUG oslo_concurrency.processutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph0cfgi2b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.094 2 DEBUG oslo_concurrency.processutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph0cfgi2b" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.102 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408393.0999653, 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.102 2 INFO nova.compute.manager [-] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.136 2 DEBUG nova.compute.manager [None req-37ecd1be-1dfb-42ae-8886-aa8fb290aca3 - - - - - -] [instance: 4b2b0338-e64b-41eb-8902-3d7a95c6ffb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:28 np0005466012 podman[244646]: 2025-10-02 12:33:28.152256045 +0000 UTC m=+0.063818504 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:33:28 np0005466012 kernel: tap15f8dc0f-ac: entered promiscuous mode
Oct  2 08:33:28 np0005466012 NetworkManager[51207]: <info>  [1759408408.1614] manager: (tap15f8dc0f-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/272)
Oct  2 08:33:28 np0005466012 podman[244647]: 2025-10-02 12:33:28.173621487 +0000 UTC m=+0.079665643 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:28Z|00593|binding|INFO|Claiming lport 15f8dc0f-ac99-40ed-bab7-50fe9de13594 for this chassis.
Oct  2 08:33:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:28Z|00594|binding|INFO|15f8dc0f-ac99-40ed-bab7-50fe9de13594: Claiming fa:16:3e:60:7f:2c 10.100.0.12
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.224 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:7f:2c 10.100.0.12'], port_security=['fa:16:3e:60:7f:2c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9769ee2c-2b6d-451a-a99a-d5f5cb51d643', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a4a7099974504a798e1607c8e6a1f570', 'neutron:revision_number': '2', 'neutron:security_group_ids': '99e51855-93ef-45a8-a4a3-2b0a8aec1882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=498d5b4e-c711-4633-9705-7db30a0fb056, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=15f8dc0f-ac99-40ed-bab7-50fe9de13594) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.225 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 15f8dc0f-ac99-40ed-bab7-50fe9de13594 in datapath 1acf42c5-084c-4cc4-bdc5-910eec0249e3 bound to our chassis#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.226 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1acf42c5-084c-4cc4-bdc5-910eec0249e3#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.237 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[92199201-f407-44e9-93d9-209a3744e767]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.237 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1acf42c5-01 in ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.239 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1acf42c5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.239 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c63c4163-92fd-4b49-9b08-088525b44953]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.239 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[87ee0eaa-6cf8-4340-b8db-c9c56c945fd8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 systemd-machined[152114]: New machine qemu-69-instance-00000094.
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.249 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[faaee317-8e3b-4049-9ef6-9110de645212]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.265 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aaf89894-6922-48dd-ac4c-6bf1ba07b7c7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:28Z|00595|binding|INFO|Setting lport 15f8dc0f-ac99-40ed-bab7-50fe9de13594 ovn-installed in OVS
Oct  2 08:33:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:28Z|00596|binding|INFO|Setting lport 15f8dc0f-ac99-40ed-bab7-50fe9de13594 up in Southbound
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:28 np0005466012 systemd[1]: Started Virtual Machine qemu-69-instance-00000094.
Oct  2 08:33:28 np0005466012 systemd-udevd[244710]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.293 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb95ead-af27-44d3-8400-cac650801c14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 NetworkManager[51207]: <info>  [1759408408.3004] manager: (tap1acf42c5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/273)
Oct  2 08:33:28 np0005466012 NetworkManager[51207]: <info>  [1759408408.3015] device (tap15f8dc0f-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.299 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[99608b11-dd34-4c82-a8bf-fbab02463111]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 systemd-udevd[244712]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:33:28 np0005466012 NetworkManager[51207]: <info>  [1759408408.3025] device (tap15f8dc0f-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.326 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[f9cf622d-5a86-4193-b3c0-c29b3ed9715f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.329 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[33bac401-87db-4823-8c20-baf234faea99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 NetworkManager[51207]: <info>  [1759408408.3515] device (tap1acf42c5-00): carrier: link connected
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.356 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[62966dc8-19a1-4677-884c-84a24ecd21d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.363 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.363 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.375 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0f10c691-f1a4-4070-829c-0fe8daa896a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1acf42c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:5b:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640196, 'reachable_time': 23056, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244738, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.385 2 DEBUG nova.compute.manager [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.396 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b72ab00c-b359-4607-a5df-395503e29568]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:5bcd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 640196, 'tstamp': 640196}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244739, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.416 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a1cd97ef-31c3-414f-96c3-d19e2b419112]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1acf42c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:5b:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640196, 'reachable_time': 23056, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244740, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.456 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[26c2d40c-983a-415b-bb30-2480edeaee82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.542 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ee875f75-9f4e-4288-8a25-f015b52c9f7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.545 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1acf42c5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.546 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.546 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1acf42c5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:28 np0005466012 kernel: tap1acf42c5-00: entered promiscuous mode
Oct  2 08:33:28 np0005466012 NetworkManager[51207]: <info>  [1759408408.5496] manager: (tap1acf42c5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.553 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1acf42c5-00, col_values=(('external_ids', {'iface-id': 'c198cb2e-a850-46e4-8295-a2f9c280ee53'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:28Z|00597|binding|INFO|Releasing lport c198cb2e-a850-46e4-8295-a2f9c280ee53 from this chassis (sb_readonly=0)
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.558 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.559 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.565 2 DEBUG nova.virt.hardware [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.566 2 INFO nova.compute.claims [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.580 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.581 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6e3a6cb3-1d3e-496c-9a2d-480ea1db2a01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.583 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-1acf42c5-084c-4cc4-bdc5-910eec0249e3
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/1acf42c5-084c-4cc4-bdc5-910eec0249e3.pid.haproxy
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 1acf42c5-084c-4cc4-bdc5-910eec0249e3
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:33:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:28.584 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'env', 'PROCESS_TAG=haproxy-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1acf42c5-084c-4cc4-bdc5-910eec0249e3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.639 2 DEBUG nova.compute.manager [req-2107bd6e-9f10-42d2-89f0-37efd15b4c2a req-2fb6e407-e07d-46cb-afa9-81f4e3f5a994 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Received event network-vif-plugged-15f8dc0f-ac99-40ed-bab7-50fe9de13594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.640 2 DEBUG oslo_concurrency.lockutils [req-2107bd6e-9f10-42d2-89f0-37efd15b4c2a req-2fb6e407-e07d-46cb-afa9-81f4e3f5a994 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.640 2 DEBUG oslo_concurrency.lockutils [req-2107bd6e-9f10-42d2-89f0-37efd15b4c2a req-2fb6e407-e07d-46cb-afa9-81f4e3f5a994 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.640 2 DEBUG oslo_concurrency.lockutils [req-2107bd6e-9f10-42d2-89f0-37efd15b4c2a req-2fb6e407-e07d-46cb-afa9-81f4e3f5a994 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.641 2 DEBUG nova.compute.manager [req-2107bd6e-9f10-42d2-89f0-37efd15b4c2a req-2fb6e407-e07d-46cb-afa9-81f4e3f5a994 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Processing event network-vif-plugged-15f8dc0f-ac99-40ed-bab7-50fe9de13594 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.701 2 DEBUG nova.compute.provider_tree [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.715 2 DEBUG nova.scheduler.client.report [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.733 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.733 2 DEBUG nova.compute.manager [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.787 2 DEBUG nova.compute.manager [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.787 2 DEBUG nova.network.neutron [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.806 2 INFO nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.830 2 DEBUG nova.compute.manager [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.956 2 DEBUG nova.compute.manager [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.959 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.960 2 INFO nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Creating image(s)#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.961 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "/var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.962 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.963 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.991 2 DEBUG nova.network.neutron [req-7a05492c-e543-44b7-ad6a-2d84531b1534 req-b40fef5c-8164-489f-9adb-338e290cf245 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Updated VIF entry in instance network info cache for port 15f8dc0f-ac99-40ed-bab7-50fe9de13594. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.992 2 DEBUG nova.network.neutron [req-7a05492c-e543-44b7-ad6a-2d84531b1534 req-b40fef5c-8164-489f-9adb-338e290cf245 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Updating instance_info_cache with network_info: [{"id": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "address": "fa:16:3e:60:7f:2c", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15f8dc0f-ac", "ovs_interfaceid": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:28 np0005466012 nova_compute[192063]: 2025-10-02 12:33:28.994 2 DEBUG oslo_concurrency.processutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:29 np0005466012 podman[244777]: 2025-10-02 12:33:29.017110934 +0000 UTC m=+0.092322254 container create 7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0)
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.042 2 DEBUG oslo_concurrency.lockutils [req-7a05492c-e543-44b7-ad6a-2d84531b1534 req-b40fef5c-8164-489f-9adb-338e290cf245 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-9769ee2c-2b6d-451a-a99a-d5f5cb51d643" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:29 np0005466012 podman[244777]: 2025-10-02 12:33:28.974293615 +0000 UTC m=+0.049504995 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:33:29 np0005466012 systemd[1]: Started libpod-conmon-7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142.scope.
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.087 2 DEBUG oslo_concurrency.processutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.088 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.089 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.100 2 DEBUG oslo_concurrency.processutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:29 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:33:29 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8b3f0e62924467ffaacf8c4db0c454d1364accf2e2bba5be851bbf5c0e70513/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:33:29 np0005466012 podman[244777]: 2025-10-02 12:33:29.126431859 +0000 UTC m=+0.201643149 container init 7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS)
Oct  2 08:33:29 np0005466012 podman[244777]: 2025-10-02 12:33:29.133530655 +0000 UTC m=+0.208741945 container start 7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.138 2 DEBUG nova.compute.manager [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.139 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408409.1375456, 9769ee2c-2b6d-451a-a99a-d5f5cb51d643 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.140 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] VM Started (Lifecycle Event)#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.143 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.151 2 INFO nova.virt.libvirt.driver [-] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Instance spawned successfully.#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.151 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.156 2 DEBUG oslo_concurrency.processutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:29 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244793]: [NOTICE]   (244800) : New worker (244804) forked
Oct  2 08:33:29 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244793]: [NOTICE]   (244800) : Loading success.
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.157 2 DEBUG oslo_concurrency.processutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.178 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.185 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.188 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.189 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.189 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.190 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.190 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.191 2 DEBUG nova.virt.libvirt.driver [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.194 2 DEBUG oslo_concurrency.processutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.195 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.195 2 DEBUG oslo_concurrency.processutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.232 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.233 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408409.1377146, 9769ee2c-2b6d-451a-a99a-d5f5cb51d643 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.233 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.259 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.263 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408409.142393, 9769ee2c-2b6d-451a-a99a-d5f5cb51d643 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.263 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.271 2 DEBUG oslo_concurrency.processutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.272 2 DEBUG nova.virt.disk.api [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.273 2 DEBUG oslo_concurrency.processutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.311 2 INFO nova.compute.manager [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Took 8.18 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.312 2 DEBUG nova.compute.manager [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.330 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.335 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.352 2 DEBUG oslo_concurrency.processutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.352 2 DEBUG nova.virt.disk.api [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Cannot resize image /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.353 2 DEBUG nova.objects.instance [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid 619d5560-c0d0-4c72-9778-96b5f71ac7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.369 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.380 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.380 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Ensure instance console log exists: /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.381 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.381 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.381 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.413 2 INFO nova.compute.manager [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Took 8.81 seconds to build instance.#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.426 2 DEBUG oslo_concurrency.lockutils [None req-ac25f1a5-c670-47c0-a3cc-8289fa164060 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.922s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:29 np0005466012 nova_compute[192063]: 2025-10-02 12:33:29.929 2 DEBUG nova.policy [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:33:30 np0005466012 nova_compute[192063]: 2025-10-02 12:33:30.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:30 np0005466012 nova_compute[192063]: 2025-10-02 12:33:30.758 2 DEBUG nova.compute.manager [req-97796233-6754-49de-8556-28f2cc5fc9ac req-c38068ad-8186-4cd2-a61e-6d2959db1a73 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Received event network-vif-plugged-15f8dc0f-ac99-40ed-bab7-50fe9de13594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:30 np0005466012 nova_compute[192063]: 2025-10-02 12:33:30.758 2 DEBUG oslo_concurrency.lockutils [req-97796233-6754-49de-8556-28f2cc5fc9ac req-c38068ad-8186-4cd2-a61e-6d2959db1a73 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:30 np0005466012 nova_compute[192063]: 2025-10-02 12:33:30.759 2 DEBUG oslo_concurrency.lockutils [req-97796233-6754-49de-8556-28f2cc5fc9ac req-c38068ad-8186-4cd2-a61e-6d2959db1a73 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:30 np0005466012 nova_compute[192063]: 2025-10-02 12:33:30.760 2 DEBUG oslo_concurrency.lockutils [req-97796233-6754-49de-8556-28f2cc5fc9ac req-c38068ad-8186-4cd2-a61e-6d2959db1a73 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:30 np0005466012 nova_compute[192063]: 2025-10-02 12:33:30.760 2 DEBUG nova.compute.manager [req-97796233-6754-49de-8556-28f2cc5fc9ac req-c38068ad-8186-4cd2-a61e-6d2959db1a73 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] No waiting events found dispatching network-vif-plugged-15f8dc0f-ac99-40ed-bab7-50fe9de13594 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:30 np0005466012 nova_compute[192063]: 2025-10-02 12:33:30.761 2 WARNING nova.compute.manager [req-97796233-6754-49de-8556-28f2cc5fc9ac req-c38068ad-8186-4cd2-a61e-6d2959db1a73 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Received unexpected event network-vif-plugged-15f8dc0f-ac99-40ed-bab7-50fe9de13594 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:33:32 np0005466012 nova_compute[192063]: 2025-10-02 12:33:32.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:32 np0005466012 nova_compute[192063]: 2025-10-02 12:33:32.610 2 DEBUG nova.network.neutron [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Successfully created port: eeff04b2-b580-4bba-b737-91de18ae78cc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:33:33 np0005466012 nova_compute[192063]: 2025-10-02 12:33:33.399 2 DEBUG nova.network.neutron [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Successfully created port: c0631492-3acc-4088-b2ff-bff23df03863 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:33:33 np0005466012 nova_compute[192063]: 2025-10-02 12:33:33.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.232 2 DEBUG nova.network.neutron [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Successfully updated port: eeff04b2-b580-4bba-b737-91de18ae78cc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.311 2 DEBUG nova.compute.manager [req-4e2c20ea-cab6-4571-9b57-19fd41ccff56 req-a873390e-79a5-4405-a57c-e80067858450 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-changed-eeff04b2-b580-4bba-b737-91de18ae78cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.312 2 DEBUG nova.compute.manager [req-4e2c20ea-cab6-4571-9b57-19fd41ccff56 req-a873390e-79a5-4405-a57c-e80067858450 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Refreshing instance network info cache due to event network-changed-eeff04b2-b580-4bba-b737-91de18ae78cc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.312 2 DEBUG oslo_concurrency.lockutils [req-4e2c20ea-cab6-4571-9b57-19fd41ccff56 req-a873390e-79a5-4405-a57c-e80067858450 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.312 2 DEBUG oslo_concurrency.lockutils [req-4e2c20ea-cab6-4571-9b57-19fd41ccff56 req-a873390e-79a5-4405-a57c-e80067858450 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.312 2 DEBUG nova.network.neutron [req-4e2c20ea-cab6-4571-9b57-19fd41ccff56 req-a873390e-79a5-4405-a57c-e80067858450 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Refreshing network info cache for port eeff04b2-b580-4bba-b737-91de18ae78cc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.538 2 DEBUG nova.network.neutron [req-4e2c20ea-cab6-4571-9b57-19fd41ccff56 req-a873390e-79a5-4405-a57c-e80067858450 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.541 2 DEBUG oslo_concurrency.lockutils [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.542 2 DEBUG oslo_concurrency.lockutils [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.542 2 DEBUG oslo_concurrency.lockutils [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.543 2 DEBUG oslo_concurrency.lockutils [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.543 2 DEBUG oslo_concurrency.lockutils [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.555 2 INFO nova.compute.manager [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Terminating instance#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.568 2 DEBUG nova.compute.manager [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:33:34 np0005466012 kernel: tap15f8dc0f-ac (unregistering): left promiscuous mode
Oct  2 08:33:34 np0005466012 NetworkManager[51207]: <info>  [1759408414.5919] device (tap15f8dc0f-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:33:34 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:34Z|00598|binding|INFO|Releasing lport 15f8dc0f-ac99-40ed-bab7-50fe9de13594 from this chassis (sb_readonly=0)
Oct  2 08:33:34 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:34Z|00599|binding|INFO|Setting lport 15f8dc0f-ac99-40ed-bab7-50fe9de13594 down in Southbound
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:34 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:34Z|00600|binding|INFO|Removing iface tap15f8dc0f-ac ovn-installed in OVS
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.615 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:7f:2c 10.100.0.12'], port_security=['fa:16:3e:60:7f:2c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9769ee2c-2b6d-451a-a99a-d5f5cb51d643', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a4a7099974504a798e1607c8e6a1f570', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99e51855-93ef-45a8-a4a3-2b0a8aec1882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=498d5b4e-c711-4633-9705-7db30a0fb056, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=15f8dc0f-ac99-40ed-bab7-50fe9de13594) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.616 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 15f8dc0f-ac99-40ed-bab7-50fe9de13594 in datapath 1acf42c5-084c-4cc4-bdc5-910eec0249e3 unbound from our chassis#033[00m
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.618 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1acf42c5-084c-4cc4-bdc5-910eec0249e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.619 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a62b7175-1c15-4866-99a3-455926382f0b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.620 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 namespace which is not needed anymore#033[00m
Oct  2 08:33:34 np0005466012 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000094.scope: Deactivated successfully.
Oct  2 08:33:34 np0005466012 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000094.scope: Consumed 6.230s CPU time.
Oct  2 08:33:34 np0005466012 systemd-machined[152114]: Machine qemu-69-instance-00000094 terminated.
Oct  2 08:33:34 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244793]: [NOTICE]   (244800) : haproxy version is 2.8.14-c23fe91
Oct  2 08:33:34 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244793]: [NOTICE]   (244800) : path to executable is /usr/sbin/haproxy
Oct  2 08:33:34 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244793]: [WARNING]  (244800) : Exiting Master process...
Oct  2 08:33:34 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244793]: [WARNING]  (244800) : Exiting Master process...
Oct  2 08:33:34 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244793]: [ALERT]    (244800) : Current worker (244804) exited with code 143 (Terminated)
Oct  2 08:33:34 np0005466012 neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3[244793]: [WARNING]  (244800) : All workers exited. Exiting... (0)
Oct  2 08:33:34 np0005466012 systemd[1]: libpod-7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142.scope: Deactivated successfully.
Oct  2 08:33:34 np0005466012 conmon[244793]: conmon 7f910206f704c171dce4 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142.scope/container/memory.events
Oct  2 08:33:34 np0005466012 podman[244846]: 2025-10-02 12:33:34.746189009 +0000 UTC m=+0.043348765 container died 7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:33:34 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142-userdata-shm.mount: Deactivated successfully.
Oct  2 08:33:34 np0005466012 systemd[1]: var-lib-containers-storage-overlay-f8b3f0e62924467ffaacf8c4db0c454d1364accf2e2bba5be851bbf5c0e70513-merged.mount: Deactivated successfully.
Oct  2 08:33:34 np0005466012 podman[244846]: 2025-10-02 12:33:34.794659134 +0000 UTC m=+0.091818890 container cleanup 7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:33:34 np0005466012 systemd[1]: libpod-conmon-7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142.scope: Deactivated successfully.
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.842 2 INFO nova.virt.libvirt.driver [-] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Instance destroyed successfully.#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.843 2 DEBUG nova.objects.instance [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lazy-loading 'resources' on Instance uuid 9769ee2c-2b6d-451a-a99a-d5f5cb51d643 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:34 np0005466012 podman[244882]: 2025-10-02 12:33:34.850021041 +0000 UTC m=+0.036999978 container remove 7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.856 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1d71e6fe-d21e-451f-bde5-60c0bc0f9508]: (4, ('Thu Oct  2 12:33:34 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 (7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142)\n7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142\nThu Oct  2 12:33:34 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 (7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142)\n7f910206f704c171dce485b9f6488c13a2229a680fa8cee11394a2d63a943142\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.857 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f65f9330-388d-493a-824f-ac54c839cd2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.858 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1acf42c5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.858 2 DEBUG nova.virt.libvirt.vif [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:202:202,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:33:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1761010569',display_name='tempest-ServersTestJSON-server-1761010569',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1761010569',id=148,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:33:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a4a7099974504a798e1607c8e6a1f570',ramdisk_id='',reservation_id='r-19om0imj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1163535506',owner_user_name='tempest-ServersTestJSON-1163535506-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:33:32Z,user_data=None,user_id='27daa263abb54d4d8e3ae34cd1c5ccf5',uuid=9769ee2c-2b6d-451a-a99a-d5f5cb51d643,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "address": "fa:16:3e:60:7f:2c", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15f8dc0f-ac", "ovs_interfaceid": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.859 2 DEBUG nova.network.os_vif_util [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converting VIF {"id": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "address": "fa:16:3e:60:7f:2c", "network": {"id": "1acf42c5-084c-4cc4-bdc5-910eec0249e3", "bridge": "br-int", "label": "tempest-ServersTestJSON-5464492-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a4a7099974504a798e1607c8e6a1f570", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15f8dc0f-ac", "ovs_interfaceid": "15f8dc0f-ac99-40ed-bab7-50fe9de13594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.859 2 DEBUG nova.network.os_vif_util [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:7f:2c,bridge_name='br-int',has_traffic_filtering=True,id=15f8dc0f-ac99-40ed-bab7-50fe9de13594,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15f8dc0f-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.860 2 DEBUG os_vif [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:7f:2c,bridge_name='br-int',has_traffic_filtering=True,id=15f8dc0f-ac99-40ed-bab7-50fe9de13594,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15f8dc0f-ac') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:34 np0005466012 kernel: tap1acf42c5-00: left promiscuous mode
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.862 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15f8dc0f-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.876 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6281b8ec-8059-4d74-b147-c9c2abf3a3c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.877 2 INFO os_vif [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:7f:2c,bridge_name='br-int',has_traffic_filtering=True,id=15f8dc0f-ac99-40ed-bab7-50fe9de13594,network=Network(1acf42c5-084c-4cc4-bdc5-910eec0249e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15f8dc0f-ac')#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.877 2 INFO nova.virt.libvirt.driver [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Deleting instance files /var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643_del#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.878 2 INFO nova.virt.libvirt.driver [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Deletion of /var/lib/nova/instances/9769ee2c-2b6d-451a-a99a-d5f5cb51d643_del complete#033[00m
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.909 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ff95b7-f682-4497-9b14-1a270b806efc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.910 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fd913fea-15cc-42ff-a24b-fe31be6dbeea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.925 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[799617b7-b9b5-4dc5-b502-512f3db76d81]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640190, 'reachable_time': 42414, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244908, 'error': None, 'target': 'ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.928 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1acf42c5-084c-4cc4-bdc5-910eec0249e3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:33:34 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:34.928 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ca1150-1b21-4c79-8ce4-d92d21896cb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:34 np0005466012 systemd[1]: run-netns-ovnmeta\x2d1acf42c5\x2d084c\x2d4cc4\x2dbdc5\x2d910eec0249e3.mount: Deactivated successfully.
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.934 2 INFO nova.compute.manager [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.934 2 DEBUG oslo.service.loopingcall [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.935 2 DEBUG nova.compute.manager [-] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:33:34 np0005466012 nova_compute[192063]: 2025-10-02 12:33:34.935 2 DEBUG nova.network.neutron [-] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.006 2 DEBUG nova.network.neutron [req-4e2c20ea-cab6-4571-9b57-19fd41ccff56 req-a873390e-79a5-4405-a57c-e80067858450 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.024 2 DEBUG oslo_concurrency.lockutils [req-4e2c20ea-cab6-4571-9b57-19fd41ccff56 req-a873390e-79a5-4405-a57c-e80067858450 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.553 2 DEBUG nova.network.neutron [-] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.575 2 DEBUG nova.network.neutron [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Successfully updated port: c0631492-3acc-4088-b2ff-bff23df03863 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.591 2 INFO nova.compute.manager [-] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Took 0.66 seconds to deallocate network for instance.#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.593 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.594 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquired lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.594 2 DEBUG nova.network.neutron [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.697 2 DEBUG nova.compute.manager [req-7224e414-38c5-401c-b9bc-2d7f30de9aff req-97601e8f-3d1d-4f72-9415-d4caf0793dc2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Received event network-vif-deleted-15f8dc0f-ac99-40ed-bab7-50fe9de13594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.777 2 DEBUG oslo_concurrency.lockutils [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.778 2 DEBUG oslo_concurrency.lockutils [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.813 2 DEBUG nova.network.neutron [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.872 2 DEBUG nova.compute.provider_tree [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.888 2 DEBUG nova.scheduler.client.report [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.911 2 DEBUG oslo_concurrency.lockutils [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:35 np0005466012 nova_compute[192063]: 2025-10-02 12:33:35.956 2 INFO nova.scheduler.client.report [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Deleted allocations for instance 9769ee2c-2b6d-451a-a99a-d5f5cb51d643#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.037 2 DEBUG oslo_concurrency.lockutils [None req-47868835-175b-4401-af27-7e2b540ca9ba 27daa263abb54d4d8e3ae34cd1c5ccf5 a4a7099974504a798e1607c8e6a1f570 - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.429 2 DEBUG nova.compute.manager [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Received event network-vif-unplugged-15f8dc0f-ac99-40ed-bab7-50fe9de13594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.431 2 DEBUG oslo_concurrency.lockutils [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.431 2 DEBUG oslo_concurrency.lockutils [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.432 2 DEBUG oslo_concurrency.lockutils [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.432 2 DEBUG nova.compute.manager [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] No waiting events found dispatching network-vif-unplugged-15f8dc0f-ac99-40ed-bab7-50fe9de13594 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.432 2 WARNING nova.compute.manager [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Received unexpected event network-vif-unplugged-15f8dc0f-ac99-40ed-bab7-50fe9de13594 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.432 2 DEBUG nova.compute.manager [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Received event network-vif-plugged-15f8dc0f-ac99-40ed-bab7-50fe9de13594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.433 2 DEBUG oslo_concurrency.lockutils [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.433 2 DEBUG oslo_concurrency.lockutils [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.433 2 DEBUG oslo_concurrency.lockutils [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9769ee2c-2b6d-451a-a99a-d5f5cb51d643-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.433 2 DEBUG nova.compute.manager [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] No waiting events found dispatching network-vif-plugged-15f8dc0f-ac99-40ed-bab7-50fe9de13594 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.434 2 WARNING nova.compute.manager [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Received unexpected event network-vif-plugged-15f8dc0f-ac99-40ed-bab7-50fe9de13594 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.434 2 DEBUG nova.compute.manager [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-changed-c0631492-3acc-4088-b2ff-bff23df03863 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.434 2 DEBUG nova.compute.manager [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Refreshing instance network info cache due to event network-changed-c0631492-3acc-4088-b2ff-bff23df03863. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:33:36 np0005466012 nova_compute[192063]: 2025-10-02 12:33:36.435 2 DEBUG oslo_concurrency.lockutils [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:37 np0005466012 nova_compute[192063]: 2025-10-02 12:33:37.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:39 np0005466012 nova_compute[192063]: 2025-10-02 12:33:39.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:39 np0005466012 nova_compute[192063]: 2025-10-02 12:33:39.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:39 np0005466012 nova_compute[192063]: 2025-10-02 12:33:39.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.375 2 DEBUG nova.network.neutron [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Updating instance_info_cache with network_info: [{"id": "eeff04b2-b580-4bba-b737-91de18ae78cc", "address": "fa:16:3e:6c:b3:03", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeff04b2-b5", "ovs_interfaceid": "eeff04b2-b580-4bba-b737-91de18ae78cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c0631492-3acc-4088-b2ff-bff23df03863", "address": "fa:16:3e:7e:20:2d", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0631492-3a", "ovs_interfaceid": "c0631492-3acc-4088-b2ff-bff23df03863", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.439 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Releasing lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.439 2 DEBUG nova.compute.manager [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Instance network_info: |[{"id": "eeff04b2-b580-4bba-b737-91de18ae78cc", "address": "fa:16:3e:6c:b3:03", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeff04b2-b5", "ovs_interfaceid": "eeff04b2-b580-4bba-b737-91de18ae78cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c0631492-3acc-4088-b2ff-bff23df03863", "address": "fa:16:3e:7e:20:2d", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0631492-3a", "ovs_interfaceid": "c0631492-3acc-4088-b2ff-bff23df03863", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.440 2 DEBUG oslo_concurrency.lockutils [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.440 2 DEBUG nova.network.neutron [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Refreshing network info cache for port c0631492-3acc-4088-b2ff-bff23df03863 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.445 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Start _get_guest_xml network_info=[{"id": "eeff04b2-b580-4bba-b737-91de18ae78cc", "address": "fa:16:3e:6c:b3:03", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeff04b2-b5", "ovs_interfaceid": "eeff04b2-b580-4bba-b737-91de18ae78cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c0631492-3acc-4088-b2ff-bff23df03863", "address": "fa:16:3e:7e:20:2d", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0631492-3a", "ovs_interfaceid": "c0631492-3acc-4088-b2ff-bff23df03863", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.452 2 WARNING nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.457 2 DEBUG nova.virt.libvirt.host [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.458 2 DEBUG nova.virt.libvirt.host [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.464 2 DEBUG nova.virt.libvirt.host [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.465 2 DEBUG nova.virt.libvirt.host [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.466 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.466 2 DEBUG nova.virt.hardware [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.466 2 DEBUG nova.virt.hardware [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.467 2 DEBUG nova.virt.hardware [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.467 2 DEBUG nova.virt.hardware [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.467 2 DEBUG nova.virt.hardware [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.467 2 DEBUG nova.virt.hardware [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.468 2 DEBUG nova.virt.hardware [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.468 2 DEBUG nova.virt.hardware [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.468 2 DEBUG nova.virt.hardware [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.468 2 DEBUG nova.virt.hardware [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.469 2 DEBUG nova.virt.hardware [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.472 2 DEBUG nova.virt.libvirt.vif [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:33:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1162098003',display_name='tempest-TestGettingAddress-server-1162098003',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1162098003',id=149,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOjMDXDlGh7XWskIogxPxg3/debwat727C6FiGGYuKMh4jN83iNAp0gEQFUOyWskdK5DuOPQyXrWxnq0VTv+25W2TLuxAMtNSrcXSqgdODflHSjkV04SZMqyvlJudVrow==',key_name='tempest-TestGettingAddress-1577770950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-58cfhsbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:33:28Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=619d5560-c0d0-4c72-9778-96b5f71ac7f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eeff04b2-b580-4bba-b737-91de18ae78cc", "address": "fa:16:3e:6c:b3:03", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeff04b2-b5", "ovs_interfaceid": "eeff04b2-b580-4bba-b737-91de18ae78cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.473 2 DEBUG nova.network.os_vif_util [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "eeff04b2-b580-4bba-b737-91de18ae78cc", "address": "fa:16:3e:6c:b3:03", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeff04b2-b5", "ovs_interfaceid": "eeff04b2-b580-4bba-b737-91de18ae78cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.473 2 DEBUG nova.network.os_vif_util [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:b3:03,bridge_name='br-int',has_traffic_filtering=True,id=eeff04b2-b580-4bba-b737-91de18ae78cc,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeff04b2-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.474 2 DEBUG nova.virt.libvirt.vif [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:33:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1162098003',display_name='tempest-TestGettingAddress-server-1162098003',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1162098003',id=149,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOjMDXDlGh7XWskIogxPxg3/debwat727C6FiGGYuKMh4jN83iNAp0gEQFUOyWskdK5DuOPQyXrWxnq0VTv+25W2TLuxAMtNSrcXSqgdODflHSjkV04SZMqyvlJudVrow==',key_name='tempest-TestGettingAddress-1577770950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-58cfhsbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:33:28Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=619d5560-c0d0-4c72-9778-96b5f71ac7f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0631492-3acc-4088-b2ff-bff23df03863", "address": "fa:16:3e:7e:20:2d", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0631492-3a", "ovs_interfaceid": "c0631492-3acc-4088-b2ff-bff23df03863", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.474 2 DEBUG nova.network.os_vif_util [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "c0631492-3acc-4088-b2ff-bff23df03863", "address": "fa:16:3e:7e:20:2d", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0631492-3a", "ovs_interfaceid": "c0631492-3acc-4088-b2ff-bff23df03863", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.475 2 DEBUG nova.network.os_vif_util [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:20:2d,bridge_name='br-int',has_traffic_filtering=True,id=c0631492-3acc-4088-b2ff-bff23df03863,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0631492-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.476 2 DEBUG nova.objects.instance [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 619d5560-c0d0-4c72-9778-96b5f71ac7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.512 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  <uuid>619d5560-c0d0-4c72-9778-96b5f71ac7f2</uuid>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  <name>instance-00000095</name>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestGettingAddress-server-1162098003</nova:name>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:33:41</nova:creationTime>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:        <nova:user uuid="97ce9f1898484e0e9a1f7c84a9f0dfe3">tempest-TestGettingAddress-1355720650-project-member</nova:user>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:        <nova:project uuid="fd801958556f4c8aab047ecdef6b5ee8">tempest-TestGettingAddress-1355720650</nova:project>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:        <nova:port uuid="eeff04b2-b580-4bba-b737-91de18ae78cc">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:        <nova:port uuid="c0631492-3acc-4088-b2ff-bff23df03863">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe7e:202d" ipVersion="6"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe7e:202d" ipVersion="6"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <entry name="serial">619d5560-c0d0-4c72-9778-96b5f71ac7f2</entry>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <entry name="uuid">619d5560-c0d0-4c72-9778-96b5f71ac7f2</entry>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk.config"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:6c:b3:03"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <target dev="tapeeff04b2-b5"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:7e:20:2d"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <target dev="tapc0631492-3a"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/console.log" append="off"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:33:41 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:33:41 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:33:41 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:33:41 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.514 2 DEBUG nova.compute.manager [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Preparing to wait for external event network-vif-plugged-eeff04b2-b580-4bba-b737-91de18ae78cc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.514 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.514 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.515 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.515 2 DEBUG nova.compute.manager [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Preparing to wait for external event network-vif-plugged-c0631492-3acc-4088-b2ff-bff23df03863 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.515 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.516 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.516 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.517 2 DEBUG nova.virt.libvirt.vif [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:33:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1162098003',display_name='tempest-TestGettingAddress-server-1162098003',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1162098003',id=149,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOjMDXDlGh7XWskIogxPxg3/debwat727C6FiGGYuKMh4jN83iNAp0gEQFUOyWskdK5DuOPQyXrWxnq0VTv+25W2TLuxAMtNSrcXSqgdODflHSjkV04SZMqyvlJudVrow==',key_name='tempest-TestGettingAddress-1577770950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-58cfhsbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:33:28Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=619d5560-c0d0-4c72-9778-96b5f71ac7f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eeff04b2-b580-4bba-b737-91de18ae78cc", "address": "fa:16:3e:6c:b3:03", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeff04b2-b5", "ovs_interfaceid": "eeff04b2-b580-4bba-b737-91de18ae78cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.517 2 DEBUG nova.network.os_vif_util [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "eeff04b2-b580-4bba-b737-91de18ae78cc", "address": "fa:16:3e:6c:b3:03", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeff04b2-b5", "ovs_interfaceid": "eeff04b2-b580-4bba-b737-91de18ae78cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.517 2 DEBUG nova.network.os_vif_util [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:b3:03,bridge_name='br-int',has_traffic_filtering=True,id=eeff04b2-b580-4bba-b737-91de18ae78cc,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeff04b2-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.518 2 DEBUG os_vif [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:b3:03,bridge_name='br-int',has_traffic_filtering=True,id=eeff04b2-b580-4bba-b737-91de18ae78cc,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeff04b2-b5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.519 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.519 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.521 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeeff04b2-b5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.522 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeeff04b2-b5, col_values=(('external_ids', {'iface-id': 'eeff04b2-b580-4bba-b737-91de18ae78cc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:b3:03', 'vm-uuid': '619d5560-c0d0-4c72-9778-96b5f71ac7f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:41 np0005466012 NetworkManager[51207]: <info>  [1759408421.5610] manager: (tapeeff04b2-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/275)
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.566 2 INFO os_vif [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:b3:03,bridge_name='br-int',has_traffic_filtering=True,id=eeff04b2-b580-4bba-b737-91de18ae78cc,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeff04b2-b5')#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.567 2 DEBUG nova.virt.libvirt.vif [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:33:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1162098003',display_name='tempest-TestGettingAddress-server-1162098003',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1162098003',id=149,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOjMDXDlGh7XWskIogxPxg3/debwat727C6FiGGYuKMh4jN83iNAp0gEQFUOyWskdK5DuOPQyXrWxnq0VTv+25W2TLuxAMtNSrcXSqgdODflHSjkV04SZMqyvlJudVrow==',key_name='tempest-TestGettingAddress-1577770950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-58cfhsbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:33:28Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=619d5560-c0d0-4c72-9778-96b5f71ac7f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0631492-3acc-4088-b2ff-bff23df03863", "address": "fa:16:3e:7e:20:2d", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0631492-3a", "ovs_interfaceid": "c0631492-3acc-4088-b2ff-bff23df03863", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.567 2 DEBUG nova.network.os_vif_util [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "c0631492-3acc-4088-b2ff-bff23df03863", "address": "fa:16:3e:7e:20:2d", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0631492-3a", "ovs_interfaceid": "c0631492-3acc-4088-b2ff-bff23df03863", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.568 2 DEBUG nova.network.os_vif_util [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:20:2d,bridge_name='br-int',has_traffic_filtering=True,id=c0631492-3acc-4088-b2ff-bff23df03863,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0631492-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.568 2 DEBUG os_vif [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:20:2d,bridge_name='br-int',has_traffic_filtering=True,id=c0631492-3acc-4088-b2ff-bff23df03863,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0631492-3a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.569 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.569 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.571 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc0631492-3a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.571 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc0631492-3a, col_values=(('external_ids', {'iface-id': 'c0631492-3acc-4088-b2ff-bff23df03863', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7e:20:2d', 'vm-uuid': '619d5560-c0d0-4c72-9778-96b5f71ac7f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:41 np0005466012 NetworkManager[51207]: <info>  [1759408421.5732] manager: (tapc0631492-3a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.581 2 INFO os_vif [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:20:2d,bridge_name='br-int',has_traffic_filtering=True,id=c0631492-3acc-4088-b2ff-bff23df03863,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0631492-3a')#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.713 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.713 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.713 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:6c:b3:03, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.714 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:7e:20:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:33:41 np0005466012 nova_compute[192063]: 2025-10-02 12:33:41.714 2 INFO nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Using config drive#033[00m
Oct  2 08:33:42 np0005466012 nova_compute[192063]: 2025-10-02 12:33:42.087 2 INFO nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Creating config drive at /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk.config#033[00m
Oct  2 08:33:42 np0005466012 nova_compute[192063]: 2025-10-02 12:33:42.091 2 DEBUG oslo_concurrency.processutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6q5y9r9z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:42 np0005466012 nova_compute[192063]: 2025-10-02 12:33:42.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:42 np0005466012 nova_compute[192063]: 2025-10-02 12:33:42.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:42 np0005466012 nova_compute[192063]: 2025-10-02 12:33:42.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:33:43 np0005466012 podman[244917]: 2025-10-02 12:33:43.165158658 +0000 UTC m=+0.076697480 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:33:43 np0005466012 podman[244918]: 2025-10-02 12:33:43.190231254 +0000 UTC m=+0.096916402 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:33:43 np0005466012 nova_compute[192063]: 2025-10-02 12:33:43.432 2 DEBUG oslo_concurrency.processutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6q5y9r9z" returned: 0 in 1.340s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:43 np0005466012 NetworkManager[51207]: <info>  [1759408423.5073] manager: (tapeeff04b2-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/277)
Oct  2 08:33:43 np0005466012 kernel: tapeeff04b2-b5: entered promiscuous mode
Oct  2 08:33:43 np0005466012 nova_compute[192063]: 2025-10-02 12:33:43.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:43Z|00601|binding|INFO|Claiming lport eeff04b2-b580-4bba-b737-91de18ae78cc for this chassis.
Oct  2 08:33:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:43Z|00602|binding|INFO|eeff04b2-b580-4bba-b737-91de18ae78cc: Claiming fa:16:3e:6c:b3:03 10.100.0.6
Oct  2 08:33:43 np0005466012 NetworkManager[51207]: <info>  [1759408423.5244] manager: (tapc0631492-3a): new Tun device (/org/freedesktop/NetworkManager/Devices/278)
Oct  2 08:33:43 np0005466012 kernel: tapc0631492-3a: entered promiscuous mode
Oct  2 08:33:43 np0005466012 nova_compute[192063]: 2025-10-02 12:33:43.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:43 np0005466012 nova_compute[192063]: 2025-10-02 12:33:43.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:43 np0005466012 systemd-udevd[244984]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:33:43 np0005466012 systemd-udevd[244985]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:33:43 np0005466012 NetworkManager[51207]: <info>  [1759408423.5513] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Oct  2 08:33:43 np0005466012 nova_compute[192063]: 2025-10-02 12:33:43.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:43 np0005466012 NetworkManager[51207]: <info>  [1759408423.5519] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Oct  2 08:33:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:43Z|00603|if_status|INFO|Not updating pb chassis for c0631492-3acc-4088-b2ff-bff23df03863 now as sb is readonly
Oct  2 08:33:43 np0005466012 NetworkManager[51207]: <info>  [1759408423.5544] device (tapeeff04b2-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:33:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:43Z|00604|binding|INFO|Claiming lport c0631492-3acc-4088-b2ff-bff23df03863 for this chassis.
Oct  2 08:33:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:43Z|00605|binding|INFO|c0631492-3acc-4088-b2ff-bff23df03863: Claiming fa:16:3e:7e:20:2d 2001:db8:0:1:f816:3eff:fe7e:202d 2001:db8::f816:3eff:fe7e:202d
Oct  2 08:33:43 np0005466012 NetworkManager[51207]: <info>  [1759408423.5552] device (tapeeff04b2-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:33:43 np0005466012 systemd-machined[152114]: New machine qemu-70-instance-00000095.
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.554 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:b3:03 10.100.0.6'], port_security=['fa:16:3e:6c:b3:03 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '619d5560-c0d0-4c72-9778-96b5f71ac7f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c3e897d9-b083-4f5e-aef4-0a4551c54806', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbb75d33-0be1-4472-abdd-63f2f4f59602, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=eeff04b2-b580-4bba-b737-91de18ae78cc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.556 103246 INFO neutron.agent.ovn.metadata.agent [-] Port eeff04b2-b580-4bba-b737-91de18ae78cc in datapath 48ae5e44-4c0f-44dd-b2b0-7bd3123da141 bound to our chassis#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.557 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48ae5e44-4c0f-44dd-b2b0-7bd3123da141#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.560 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:20:2d 2001:db8:0:1:f816:3eff:fe7e:202d 2001:db8::f816:3eff:fe7e:202d'], port_security=['fa:16:3e:7e:20:2d 2001:db8:0:1:f816:3eff:fe7e:202d 2001:db8::f816:3eff:fe7e:202d'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe7e:202d/64 2001:db8::f816:3eff:fe7e:202d/64', 'neutron:device_id': '619d5560-c0d0-4c72-9778-96b5f71ac7f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c3e897d9-b083-4f5e-aef4-0a4551c54806', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=512667a6-6958-4dd6-8891-fcda7d607ab5, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=c0631492-3acc-4088-b2ff-bff23df03863) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:33:43 np0005466012 NetworkManager[51207]: <info>  [1759408423.5612] device (tapc0631492-3a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:33:43 np0005466012 NetworkManager[51207]: <info>  [1759408423.5624] device (tapc0631492-3a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.567 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[11d3f555-0372-49a3-b1ef-8259b2ca8032]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.568 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap48ae5e44-41 in ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.571 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap48ae5e44-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.571 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5a39f76a-3baa-4a40-a844-90522600a17b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.571 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e07ba3e5-f398-4373-acfc-2ae45a302731]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.582 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[944bea1b-a344-4c34-87b3-b5c19fdaf76b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.607 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[26616fe9-83d4-4899-9cc2-b2fedb238c85]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.642 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[10766b31-6a3a-43bb-9407-e235eee344c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 systemd-udevd[244989]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:33:43 np0005466012 NetworkManager[51207]: <info>  [1759408423.6643] manager: (tap48ae5e44-40): new Veth device (/org/freedesktop/NetworkManager/Devices/281)
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.662 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ba435629-a371-4fdf-8efe-51064a62b96e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.697 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[328d88f8-ff28-4f1d-b370-5855cec50c22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.701 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[66eebca5-7ce0-4e34-ae79-2136ab4eaa2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 systemd[1]: Started Virtual Machine qemu-70-instance-00000095.
Oct  2 08:33:43 np0005466012 nova_compute[192063]: 2025-10-02 12:33:43.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:43 np0005466012 NetworkManager[51207]: <info>  [1759408423.7273] device (tap48ae5e44-40): carrier: link connected
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.739 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[aad0e72f-51ee-457d-91e9-18737a6ff832]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:43Z|00606|binding|INFO|Setting lport eeff04b2-b580-4bba-b737-91de18ae78cc ovn-installed in OVS
Oct  2 08:33:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:43Z|00607|binding|INFO|Setting lport eeff04b2-b580-4bba-b737-91de18ae78cc up in Southbound
Oct  2 08:33:43 np0005466012 nova_compute[192063]: 2025-10-02 12:33:43.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.758 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[87f3ed5d-565e-4fa1-82ed-6cf958f44c29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48ae5e44-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:62:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641733, 'reachable_time': 25562, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245013, 'error': None, 'target': 'ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:43Z|00608|binding|INFO|Setting lport c0631492-3acc-4088-b2ff-bff23df03863 ovn-installed in OVS
Oct  2 08:33:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:43Z|00609|binding|INFO|Setting lport c0631492-3acc-4088-b2ff-bff23df03863 up in Southbound
Oct  2 08:33:43 np0005466012 nova_compute[192063]: 2025-10-02 12:33:43.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.776 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d6da7a6f-36cd-4155-a88f-1cb291dc5c73]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:6233'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 641733, 'tstamp': 641733}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245014, 'error': None, 'target': 'ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.795 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[34c24d74-2a94-4142-9eaa-3313fdb12a6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48ae5e44-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:62:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641733, 'reachable_time': 25562, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245017, 'error': None, 'target': 'ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 nova_compute[192063]: 2025-10-02 12:33:43.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.823 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ad7e0097-f886-471b-a089-54f0f80d26c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.873 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[eb86f066-fab3-4ee8-83ed-a838f9cc9331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.874 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48ae5e44-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.875 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.875 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48ae5e44-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:43 np0005466012 NetworkManager[51207]: <info>  [1759408423.8772] manager: (tap48ae5e44-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Oct  2 08:33:43 np0005466012 kernel: tap48ae5e44-40: entered promiscuous mode
Oct  2 08:33:43 np0005466012 nova_compute[192063]: 2025-10-02 12:33:43.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.879 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48ae5e44-40, col_values=(('external_ids', {'iface-id': 'f8346990-e84e-49ae-958d-dc83725093d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:43Z|00610|binding|INFO|Releasing lport f8346990-e84e-49ae-958d-dc83725093d9 from this chassis (sb_readonly=0)
Oct  2 08:33:43 np0005466012 nova_compute[192063]: 2025-10-02 12:33:43.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.891 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48ae5e44-4c0f-44dd-b2b0-7bd3123da141.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48ae5e44-4c0f-44dd-b2b0-7bd3123da141.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:33:43 np0005466012 nova_compute[192063]: 2025-10-02 12:33:43.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.891 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[75b0a1de-c19f-4922-81b1-f3ee7fa4f1af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.892 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-48ae5e44-4c0f-44dd-b2b0-7bd3123da141
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/48ae5e44-4c0f-44dd-b2b0-7bd3123da141.pid.haproxy
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 48ae5e44-4c0f-44dd-b2b0-7bd3123da141
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:33:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:43.893 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'env', 'PROCESS_TAG=haproxy-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/48ae5e44-4c0f-44dd-b2b0-7bd3123da141.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.135 2 DEBUG nova.compute.manager [req-30f47d31-9971-47d3-a910-d641d7d1dba9 req-3a492419-cff4-4a24-8bb7-5b17a0666bdd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-vif-plugged-c0631492-3acc-4088-b2ff-bff23df03863 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.136 2 DEBUG oslo_concurrency.lockutils [req-30f47d31-9971-47d3-a910-d641d7d1dba9 req-3a492419-cff4-4a24-8bb7-5b17a0666bdd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.136 2 DEBUG oslo_concurrency.lockutils [req-30f47d31-9971-47d3-a910-d641d7d1dba9 req-3a492419-cff4-4a24-8bb7-5b17a0666bdd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.136 2 DEBUG oslo_concurrency.lockutils [req-30f47d31-9971-47d3-a910-d641d7d1dba9 req-3a492419-cff4-4a24-8bb7-5b17a0666bdd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.137 2 DEBUG nova.compute.manager [req-30f47d31-9971-47d3-a910-d641d7d1dba9 req-3a492419-cff4-4a24-8bb7-5b17a0666bdd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Processing event network-vif-plugged-c0631492-3acc-4088-b2ff-bff23df03863 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.217 2 DEBUG nova.compute.manager [req-d908ec41-0a7e-45f4-b406-6b1e5c902f3a req-9fdb918c-094a-48e2-9cbd-a7f4ae2108b4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-vif-plugged-eeff04b2-b580-4bba-b737-91de18ae78cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.218 2 DEBUG oslo_concurrency.lockutils [req-d908ec41-0a7e-45f4-b406-6b1e5c902f3a req-9fdb918c-094a-48e2-9cbd-a7f4ae2108b4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.218 2 DEBUG oslo_concurrency.lockutils [req-d908ec41-0a7e-45f4-b406-6b1e5c902f3a req-9fdb918c-094a-48e2-9cbd-a7f4ae2108b4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.218 2 DEBUG oslo_concurrency.lockutils [req-d908ec41-0a7e-45f4-b406-6b1e5c902f3a req-9fdb918c-094a-48e2-9cbd-a7f4ae2108b4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.219 2 DEBUG nova.compute.manager [req-d908ec41-0a7e-45f4-b406-6b1e5c902f3a req-9fdb918c-094a-48e2-9cbd-a7f4ae2108b4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Processing event network-vif-plugged-eeff04b2-b580-4bba-b737-91de18ae78cc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:33:44 np0005466012 podman[245061]: 2025-10-02 12:33:44.306427081 +0000 UTC m=+0.066306641 container create e10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:33:44 np0005466012 systemd[1]: Started libpod-conmon-e10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d.scope.
Oct  2 08:33:44 np0005466012 podman[245061]: 2025-10-02 12:33:44.270045371 +0000 UTC m=+0.029924971 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:33:44 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:33:44 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f086dfaeeb056c2b673f87b19136836ac7d6ff602b4855b0947518f3b89bbbe8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:33:44 np0005466012 podman[245061]: 2025-10-02 12:33:44.4375282 +0000 UTC m=+0.197407780 container init e10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:33:44 np0005466012 podman[245061]: 2025-10-02 12:33:44.443197708 +0000 UTC m=+0.203077268 container start e10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:33:44 np0005466012 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[245076]: [NOTICE]   (245080) : New worker (245082) forked
Oct  2 08:33:44 np0005466012 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[245076]: [NOTICE]   (245080) : Loading success.
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.503 103246 INFO neutron.agent.ovn.metadata.agent [-] Port c0631492-3acc-4088-b2ff-bff23df03863 in datapath f55e0845-fc62-481d-a70d-8546faf2b8fb unbound from our chassis#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.505 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f55e0845-fc62-481d-a70d-8546faf2b8fb#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.514 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fc78a330-97e9-4ad3-96c2-699cdfafdc48]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.515 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf55e0845-f1 in ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.518 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf55e0845-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.519 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[09894a10-ebb4-40cb-b93c-a25f4dca518d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.519 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0095764c-165d-419c-a49a-25dbd280c383]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.532 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[e62ef188-0c90-40ac-8ced-552105151462]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.545 2 DEBUG nova.compute.manager [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.547 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408424.5448122, 619d5560-c0d0-4c72-9778-96b5f71ac7f2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.547 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] VM Started (Lifecycle Event)#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.550 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.553 2 INFO nova.virt.libvirt.driver [-] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Instance spawned successfully.#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.553 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.553 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b01ea0dc-5041-4f64-aa16-22abd981ba70]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.584 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[56d5f124-84c0-4d97-a6cf-9a481cb00760]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.586 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.591 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:33:44 np0005466012 systemd-udevd[245005]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.593 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[61c414b7-b568-4664-a0b5-1b3d44a580b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 NetworkManager[51207]: <info>  [1759408424.5940] manager: (tapf55e0845-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/283)
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.596 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.596 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.597 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.597 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.597 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.598 2 DEBUG nova.virt.libvirt.driver [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.623 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[960e1843-c102-401c-badc-fecfe47101af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.626 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ef90815b-f791-41d5-a51b-93d6cf782cff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.647 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.647 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408424.5478249, 619d5560-c0d0-4c72-9778-96b5f71ac7f2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.647 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:33:44 np0005466012 NetworkManager[51207]: <info>  [1759408424.6544] device (tapf55e0845-f0): carrier: link connected
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.663 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad19886-f8d1-4ce7-9e57-53bcc57cc361]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.681 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[77b41dea-65e3-4f1e-bcfa-ed372c3873c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf55e0845-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:76:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641826, 'reachable_time': 37899, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245101, 'error': None, 'target': 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.688 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.690 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408424.550072, 619d5560-c0d0-4c72-9778-96b5f71ac7f2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.691 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.697 2 INFO nova.compute.manager [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Took 15.74 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.697 2 DEBUG nova.compute.manager [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.697 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a4db2ed4-ce0b-4332-b498-6baa9e0e39b8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed0:762a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 641826, 'tstamp': 641826}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245102, 'error': None, 'target': 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.705 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.708 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.718 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2265b0fa-f631-4e77-af32-8190eac8fade]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf55e0845-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:76:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641826, 'reachable_time': 37899, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245103, 'error': None, 'target': 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.742 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.750 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5a4fa35c-56e7-4056-99df-cae8926e2006]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.784 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9f2e13c9-6ad5-414f-b312-514875c4d826]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.786 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf55e0845-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.786 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.787 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf55e0845-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:44 np0005466012 kernel: tapf55e0845-f0: entered promiscuous mode
Oct  2 08:33:44 np0005466012 NetworkManager[51207]: <info>  [1759408424.7913] manager: (tapf55e0845-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/284)
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.793 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf55e0845-f0, col_values=(('external_ids', {'iface-id': '763e1f51-8560-461a-a2f3-3c284c8e5a17'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:33:44 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:44Z|00611|binding|INFO|Releasing lport 763e1f51-8560-461a-a2f3-3c284c8e5a17 from this chassis (sb_readonly=0)
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.796 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f55e0845-fc62-481d-a70d-8546faf2b8fb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f55e0845-fc62-481d-a70d-8546faf2b8fb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.797 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2255ba47-9dc2-45e9-8753-a71b3c8a0271]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.798 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-f55e0845-fc62-481d-a70d-8546faf2b8fb
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/f55e0845-fc62-481d-a70d-8546faf2b8fb.pid.haproxy
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID f55e0845-fc62-481d-a70d-8546faf2b8fb
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:33:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:33:44.798 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'env', 'PROCESS_TAG=haproxy-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f55e0845-fc62-481d-a70d-8546faf2b8fb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.807 2 INFO nova.compute.manager [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Took 16.28 seconds to build instance.#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.839 2 DEBUG nova.network.neutron [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Updated VIF entry in instance network info cache for port c0631492-3acc-4088-b2ff-bff23df03863. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.840 2 DEBUG nova.network.neutron [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Updating instance_info_cache with network_info: [{"id": "eeff04b2-b580-4bba-b737-91de18ae78cc", "address": "fa:16:3e:6c:b3:03", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeff04b2-b5", "ovs_interfaceid": "eeff04b2-b580-4bba-b737-91de18ae78cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c0631492-3acc-4088-b2ff-bff23df03863", "address": "fa:16:3e:7e:20:2d", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0631492-3a", "ovs_interfaceid": "c0631492-3acc-4088-b2ff-bff23df03863", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.843 2 DEBUG oslo_concurrency.lockutils [None req-e9a6ddfb-c342-4dc0-b043-6155c52cbdb6 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.480s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:44 np0005466012 nova_compute[192063]: 2025-10-02 12:33:44.856 2 DEBUG oslo_concurrency.lockutils [req-800a5e3d-7859-4a24-8f45-f1e730edc36b req-5e82cd89-e8d1-478c-827c-c24b03b0440b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:45 np0005466012 nova_compute[192063]: 2025-10-02 12:33:45.034 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:45 np0005466012 nova_compute[192063]: 2025-10-02 12:33:45.034 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:45 np0005466012 nova_compute[192063]: 2025-10-02 12:33:45.034 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:33:45 np0005466012 nova_compute[192063]: 2025-10-02 12:33:45.034 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 619d5560-c0d0-4c72-9778-96b5f71ac7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:33:45 np0005466012 podman[245133]: 2025-10-02 12:33:45.179867688 +0000 UTC m=+0.046493001 container create d5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:33:45 np0005466012 systemd[1]: Started libpod-conmon-d5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4.scope.
Oct  2 08:33:45 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:33:45 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/741f97396c0a7f7ecabdbe2d016558cbc1a45c9871a85760bd73529d1931506c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:33:45 np0005466012 podman[245133]: 2025-10-02 12:33:45.15542029 +0000 UTC m=+0.022045623 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:33:45 np0005466012 podman[245147]: 2025-10-02 12:33:45.255420986 +0000 UTC m=+0.047225782 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:33:45 np0005466012 podman[245133]: 2025-10-02 12:33:45.263349366 +0000 UTC m=+0.129974699 container init d5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:33:45 np0005466012 podman[245146]: 2025-10-02 12:33:45.266679818 +0000 UTC m=+0.060241452 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:33:45 np0005466012 podman[245133]: 2025-10-02 12:33:45.273962651 +0000 UTC m=+0.140587964 container start d5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:33:45 np0005466012 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[245169]: [NOTICE]   (245188) : New worker (245191) forked
Oct  2 08:33:45 np0005466012 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[245169]: [NOTICE]   (245188) : Loading success.
Oct  2 08:33:46 np0005466012 nova_compute[192063]: 2025-10-02 12:33:46.423 2 DEBUG nova.compute.manager [req-d5ec2075-8fc3-42fe-b093-866ce8d42adf req-e6c2ff6c-0141-4469-a8f2-7e114616e079 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-vif-plugged-c0631492-3acc-4088-b2ff-bff23df03863 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:46 np0005466012 nova_compute[192063]: 2025-10-02 12:33:46.423 2 DEBUG oslo_concurrency.lockutils [req-d5ec2075-8fc3-42fe-b093-866ce8d42adf req-e6c2ff6c-0141-4469-a8f2-7e114616e079 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:46 np0005466012 nova_compute[192063]: 2025-10-02 12:33:46.423 2 DEBUG oslo_concurrency.lockutils [req-d5ec2075-8fc3-42fe-b093-866ce8d42adf req-e6c2ff6c-0141-4469-a8f2-7e114616e079 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:46 np0005466012 nova_compute[192063]: 2025-10-02 12:33:46.423 2 DEBUG oslo_concurrency.lockutils [req-d5ec2075-8fc3-42fe-b093-866ce8d42adf req-e6c2ff6c-0141-4469-a8f2-7e114616e079 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:46 np0005466012 nova_compute[192063]: 2025-10-02 12:33:46.424 2 DEBUG nova.compute.manager [req-d5ec2075-8fc3-42fe-b093-866ce8d42adf req-e6c2ff6c-0141-4469-a8f2-7e114616e079 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] No waiting events found dispatching network-vif-plugged-c0631492-3acc-4088-b2ff-bff23df03863 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:46 np0005466012 nova_compute[192063]: 2025-10-02 12:33:46.424 2 WARNING nova.compute.manager [req-d5ec2075-8fc3-42fe-b093-866ce8d42adf req-e6c2ff6c-0141-4469-a8f2-7e114616e079 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received unexpected event network-vif-plugged-c0631492-3acc-4088-b2ff-bff23df03863 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:33:46 np0005466012 nova_compute[192063]: 2025-10-02 12:33:46.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:46 np0005466012 nova_compute[192063]: 2025-10-02 12:33:46.699 2 DEBUG nova.compute.manager [req-82b4352b-e414-4a3d-8e9b-cccdf931e070 req-2e73a7a7-0a77-400a-b9e2-0c01de9d46b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-vif-plugged-eeff04b2-b580-4bba-b737-91de18ae78cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:46 np0005466012 nova_compute[192063]: 2025-10-02 12:33:46.699 2 DEBUG oslo_concurrency.lockutils [req-82b4352b-e414-4a3d-8e9b-cccdf931e070 req-2e73a7a7-0a77-400a-b9e2-0c01de9d46b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:46 np0005466012 nova_compute[192063]: 2025-10-02 12:33:46.699 2 DEBUG oslo_concurrency.lockutils [req-82b4352b-e414-4a3d-8e9b-cccdf931e070 req-2e73a7a7-0a77-400a-b9e2-0c01de9d46b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:46 np0005466012 nova_compute[192063]: 2025-10-02 12:33:46.699 2 DEBUG oslo_concurrency.lockutils [req-82b4352b-e414-4a3d-8e9b-cccdf931e070 req-2e73a7a7-0a77-400a-b9e2-0c01de9d46b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:46 np0005466012 nova_compute[192063]: 2025-10-02 12:33:46.700 2 DEBUG nova.compute.manager [req-82b4352b-e414-4a3d-8e9b-cccdf931e070 req-2e73a7a7-0a77-400a-b9e2-0c01de9d46b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] No waiting events found dispatching network-vif-plugged-eeff04b2-b580-4bba-b737-91de18ae78cc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:33:46 np0005466012 nova_compute[192063]: 2025-10-02 12:33:46.700 2 WARNING nova.compute.manager [req-82b4352b-e414-4a3d-8e9b-cccdf931e070 req-2e73a7a7-0a77-400a-b9e2-0c01de9d46b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received unexpected event network-vif-plugged-eeff04b2-b580-4bba-b737-91de18ae78cc for instance with vm_state active and task_state None.#033[00m
Oct  2 08:33:47 np0005466012 nova_compute[192063]: 2025-10-02 12:33:47.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.449 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Updating instance_info_cache with network_info: [{"id": "eeff04b2-b580-4bba-b737-91de18ae78cc", "address": "fa:16:3e:6c:b3:03", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeff04b2-b5", "ovs_interfaceid": "eeff04b2-b580-4bba-b737-91de18ae78cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c0631492-3acc-4088-b2ff-bff23df03863", "address": "fa:16:3e:7e:20:2d", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": 
"2001:db8::f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0631492-3a", "ovs_interfaceid": "c0631492-3acc-4088-b2ff-bff23df03863", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.506 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.507 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.508 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.542 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.542 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.543 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.543 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.643 2 DEBUG nova.compute.manager [req-ec1cdb7c-a1e5-4e2d-977c-f09337e00a44 req-e3b0ea1c-10a6-4242-a5bf-41de989d7ce1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-changed-eeff04b2-b580-4bba-b737-91de18ae78cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.644 2 DEBUG nova.compute.manager [req-ec1cdb7c-a1e5-4e2d-977c-f09337e00a44 req-e3b0ea1c-10a6-4242-a5bf-41de989d7ce1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Refreshing instance network info cache due to event network-changed-eeff04b2-b580-4bba-b737-91de18ae78cc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.644 2 DEBUG oslo_concurrency.lockutils [req-ec1cdb7c-a1e5-4e2d-977c-f09337e00a44 req-e3b0ea1c-10a6-4242-a5bf-41de989d7ce1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.645 2 DEBUG oslo_concurrency.lockutils [req-ec1cdb7c-a1e5-4e2d-977c-f09337e00a44 req-e3b0ea1c-10a6-4242-a5bf-41de989d7ce1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.645 2 DEBUG nova.network.neutron [req-ec1cdb7c-a1e5-4e2d-977c-f09337e00a44 req-e3b0ea1c-10a6-4242-a5bf-41de989d7ce1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Refreshing network info cache for port eeff04b2-b580-4bba-b737-91de18ae78cc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.733 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.792 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.793 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.841 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408414.8410435, 9769ee2c-2b6d-451a-a99a-d5f5cb51d643 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.842 2 INFO nova.compute.manager [-] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.846 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:33:49 np0005466012 nova_compute[192063]: 2025-10-02 12:33:49.870 2 DEBUG nova.compute.manager [None req-924ccd6c-1ae0-4150-b87e-ed24cbc0bfa2 - - - - - -] [instance: 9769ee2c-2b6d-451a-a99a-d5f5cb51d643] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:33:50 np0005466012 nova_compute[192063]: 2025-10-02 12:33:50.019 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:33:50 np0005466012 nova_compute[192063]: 2025-10-02 12:33:50.021 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5493MB free_disk=73.24203491210938GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:33:50 np0005466012 nova_compute[192063]: 2025-10-02 12:33:50.021 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:33:50 np0005466012 nova_compute[192063]: 2025-10-02 12:33:50.022 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:33:50 np0005466012 nova_compute[192063]: 2025-10-02 12:33:50.133 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 619d5560-c0d0-4c72-9778-96b5f71ac7f2 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:33:50 np0005466012 nova_compute[192063]: 2025-10-02 12:33:50.134 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:33:50 np0005466012 nova_compute[192063]: 2025-10-02 12:33:50.134 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:33:50 np0005466012 nova_compute[192063]: 2025-10-02 12:33:50.183 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:33:50 np0005466012 nova_compute[192063]: 2025-10-02 12:33:50.218 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:33:50 np0005466012 nova_compute[192063]: 2025-10-02 12:33:50.313 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:33:50 np0005466012 nova_compute[192063]: 2025-10-02 12:33:50.314 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:33:51 np0005466012 nova_compute[192063]: 2025-10-02 12:33:51.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:51 np0005466012 nova_compute[192063]: 2025-10-02 12:33:51.964 2 DEBUG nova.network.neutron [req-ec1cdb7c-a1e5-4e2d-977c-f09337e00a44 req-e3b0ea1c-10a6-4242-a5bf-41de989d7ce1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Updated VIF entry in instance network info cache for port eeff04b2-b580-4bba-b737-91de18ae78cc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:33:51 np0005466012 nova_compute[192063]: 2025-10-02 12:33:51.965 2 DEBUG nova.network.neutron [req-ec1cdb7c-a1e5-4e2d-977c-f09337e00a44 req-e3b0ea1c-10a6-4242-a5bf-41de989d7ce1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Updating instance_info_cache with network_info: [{"id": "eeff04b2-b580-4bba-b737-91de18ae78cc", "address": "fa:16:3e:6c:b3:03", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeff04b2-b5", "ovs_interfaceid": "eeff04b2-b580-4bba-b737-91de18ae78cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c0631492-3acc-4088-b2ff-bff23df03863", "address": "fa:16:3e:7e:20:2d", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0631492-3a", "ovs_interfaceid": "c0631492-3acc-4088-b2ff-bff23df03863", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:33:52 np0005466012 nova_compute[192063]: 2025-10-02 12:33:52.194 2 DEBUG oslo_concurrency.lockutils [req-ec1cdb7c-a1e5-4e2d-977c-f09337e00a44 req-e3b0ea1c-10a6-4242-a5bf-41de989d7ce1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:33:52 np0005466012 nova_compute[192063]: 2025-10-02 12:33:52.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:56 np0005466012 podman[245209]: 2025-10-02 12:33:56.15395279 +0000 UTC m=+0.062970709 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, config_id=edpm, managed_by=edpm_ansible, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, architecture=x86_64)
Oct  2 08:33:56 np0005466012 podman[245208]: 2025-10-02 12:33:56.171543029 +0000 UTC m=+0.087671716 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd)
Oct  2 08:33:56 np0005466012 nova_compute[192063]: 2025-10-02 12:33:56.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:57 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:57Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6c:b3:03 10.100.0.6
Oct  2 08:33:57 np0005466012 ovn_controller[94284]: 2025-10-02T12:33:57Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6c:b3:03 10.100.0.6
Oct  2 08:33:57 np0005466012 nova_compute[192063]: 2025-10-02 12:33:57.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:33:59 np0005466012 podman[245265]: 2025-10-02 12:33:59.135627625 +0000 UTC m=+0.048609871 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:33:59 np0005466012 podman[245264]: 2025-10-02 12:33:59.143964356 +0000 UTC m=+0.059024059 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:34:01 np0005466012 nova_compute[192063]: 2025-10-02 12:34:01.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:02.149 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:02.150 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:02.151 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.823 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.823 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.824 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.824 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.824 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.825 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.857 2 DEBUG nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.857 2 DEBUG nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Image id cf60d86d-f1d5-4be4-976e-7488dbdcf0b2 yields fingerprint 068b233e8d7f49e215e2900dde7d25b776cad955 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.857 2 INFO nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] image cf60d86d-f1d5-4be4-976e-7488dbdcf0b2 at (/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955): checking#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.858 2 DEBUG nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] image cf60d86d-f1d5-4be4-976e-7488dbdcf0b2 at (/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.860 2 DEBUG nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.860 2 DEBUG nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] 619d5560-c0d0-4c72-9778-96b5f71ac7f2 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.861 2 DEBUG nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] 619d5560-c0d0-4c72-9778-96b5f71ac7f2 has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.861 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.951 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.952 2 DEBUG nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 619d5560-c0d0-4c72-9778-96b5f71ac7f2 is backed by 068b233e8d7f49e215e2900dde7d25b776cad955 _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.953 2 WARNING nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Unknown base file: /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.954 2 WARNING nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Unknown base file: /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.954 2 WARNING nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Unknown base file: /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.954 2 WARNING nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Unknown base file: /var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.955 2 INFO nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Active base files: /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.955 2 INFO nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Removable base files: /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2 /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25 /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d /var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.956 2 INFO nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/d7f074efa852dc950deac120296f6eecf48a40d2#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.957 2 INFO nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/aa8a26f21a89d4b2e2a08906454e4360ce404b25#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.957 2 INFO nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/cee2f6f92e6c0ffd6c5e505732de5a1a897a0f3d#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.958 2 INFO nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/1e3186a2e8f789d59bb3974b363889023224d3d8#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.958 2 DEBUG nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.958 2 DEBUG nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Oct  2 08:34:02 np0005466012 nova_compute[192063]: 2025-10-02 12:34:02.959 2 DEBUG nova.virt.libvirt.imagecache [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Oct  2 08:34:06 np0005466012 nova_compute[192063]: 2025-10-02 12:34:06.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:07 np0005466012 nova_compute[192063]: 2025-10-02 12:34:07.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.013 2 DEBUG nova.compute.manager [req-4cae42c9-9920-4f27-80c1-87f6fa7af1e3 req-b5606b46-85a2-4b91-b83c-f012dfb66c02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-changed-eeff04b2-b580-4bba-b737-91de18ae78cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.014 2 DEBUG nova.compute.manager [req-4cae42c9-9920-4f27-80c1-87f6fa7af1e3 req-b5606b46-85a2-4b91-b83c-f012dfb66c02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Refreshing instance network info cache due to event network-changed-eeff04b2-b580-4bba-b737-91de18ae78cc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.014 2 DEBUG oslo_concurrency.lockutils [req-4cae42c9-9920-4f27-80c1-87f6fa7af1e3 req-b5606b46-85a2-4b91-b83c-f012dfb66c02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.014 2 DEBUG oslo_concurrency.lockutils [req-4cae42c9-9920-4f27-80c1-87f6fa7af1e3 req-b5606b46-85a2-4b91-b83c-f012dfb66c02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.015 2 DEBUG nova.network.neutron [req-4cae42c9-9920-4f27-80c1-87f6fa7af1e3 req-b5606b46-85a2-4b91-b83c-f012dfb66c02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Refreshing network info cache for port eeff04b2-b580-4bba-b737-91de18ae78cc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.153 2 DEBUG oslo_concurrency.lockutils [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.154 2 DEBUG oslo_concurrency.lockutils [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.154 2 DEBUG oslo_concurrency.lockutils [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.154 2 DEBUG oslo_concurrency.lockutils [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.155 2 DEBUG oslo_concurrency.lockutils [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.166 2 INFO nova.compute.manager [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Terminating instance#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.176 2 DEBUG nova.compute.manager [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:34:10 np0005466012 kernel: tapeeff04b2-b5 (unregistering): left promiscuous mode
Oct  2 08:34:10 np0005466012 NetworkManager[51207]: <info>  [1759408450.2086] device (tapeeff04b2-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:34:10Z|00612|binding|INFO|Releasing lport eeff04b2-b580-4bba-b737-91de18ae78cc from this chassis (sb_readonly=0)
Oct  2 08:34:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:34:10Z|00613|binding|INFO|Setting lport eeff04b2-b580-4bba-b737-91de18ae78cc down in Southbound
Oct  2 08:34:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:34:10Z|00614|binding|INFO|Removing iface tapeeff04b2-b5 ovn-installed in OVS
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.235 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:b3:03 10.100.0.6'], port_security=['fa:16:3e:6c:b3:03 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '619d5560-c0d0-4c72-9778-96b5f71ac7f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c3e897d9-b083-4f5e-aef4-0a4551c54806', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbb75d33-0be1-4472-abdd-63f2f4f59602, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=eeff04b2-b580-4bba-b737-91de18ae78cc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.236 103246 INFO neutron.agent.ovn.metadata.agent [-] Port eeff04b2-b580-4bba-b737-91de18ae78cc in datapath 48ae5e44-4c0f-44dd-b2b0-7bd3123da141 unbound from our chassis#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.238 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48ae5e44-4c0f-44dd-b2b0-7bd3123da141, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.240 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b380a1ea-78b3-418d-a9b8-89a05d79d003]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.241 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141 namespace which is not needed anymore#033[00m
Oct  2 08:34:10 np0005466012 kernel: tapc0631492-3a (unregistering): left promiscuous mode
Oct  2 08:34:10 np0005466012 NetworkManager[51207]: <info>  [1759408450.2497] device (tapc0631492-3a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:34:10Z|00615|binding|INFO|Releasing lport c0631492-3acc-4088-b2ff-bff23df03863 from this chassis (sb_readonly=0)
Oct  2 08:34:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:34:10Z|00616|binding|INFO|Setting lport c0631492-3acc-4088-b2ff-bff23df03863 down in Southbound
Oct  2 08:34:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:34:10Z|00617|binding|INFO|Removing iface tapc0631492-3a ovn-installed in OVS
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.271 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:20:2d 2001:db8:0:1:f816:3eff:fe7e:202d 2001:db8::f816:3eff:fe7e:202d'], port_security=['fa:16:3e:7e:20:2d 2001:db8:0:1:f816:3eff:fe7e:202d 2001:db8::f816:3eff:fe7e:202d'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe7e:202d/64 2001:db8::f816:3eff:fe7e:202d/64', 'neutron:device_id': '619d5560-c0d0-4c72-9778-96b5f71ac7f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c3e897d9-b083-4f5e-aef4-0a4551c54806', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=512667a6-6958-4dd6-8891-fcda7d607ab5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=c0631492-3acc-4088-b2ff-bff23df03863) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000095.scope: Deactivated successfully.
Oct  2 08:34:10 np0005466012 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000095.scope: Consumed 14.190s CPU time.
Oct  2 08:34:10 np0005466012 systemd-machined[152114]: Machine qemu-70-instance-00000095 terminated.
Oct  2 08:34:10 np0005466012 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[245076]: [NOTICE]   (245080) : haproxy version is 2.8.14-c23fe91
Oct  2 08:34:10 np0005466012 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[245076]: [NOTICE]   (245080) : path to executable is /usr/sbin/haproxy
Oct  2 08:34:10 np0005466012 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[245076]: [WARNING]  (245080) : Exiting Master process...
Oct  2 08:34:10 np0005466012 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[245076]: [ALERT]    (245080) : Current worker (245082) exited with code 143 (Terminated)
Oct  2 08:34:10 np0005466012 neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141[245076]: [WARNING]  (245080) : All workers exited. Exiting... (0)
Oct  2 08:34:10 np0005466012 systemd[1]: libpod-e10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d.scope: Deactivated successfully.
Oct  2 08:34:10 np0005466012 podman[245340]: 2025-10-02 12:34:10.383343842 +0000 UTC m=+0.048497287 container died e10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:34:10 np0005466012 NetworkManager[51207]: <info>  [1759408450.4062] manager: (tapc0631492-3a): new Tun device (/org/freedesktop/NetworkManager/Devices/285)
Oct  2 08:34:10 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d-userdata-shm.mount: Deactivated successfully.
Oct  2 08:34:10 np0005466012 systemd[1]: var-lib-containers-storage-overlay-f086dfaeeb056c2b673f87b19136836ac7d6ff602b4855b0947518f3b89bbbe8-merged.mount: Deactivated successfully.
Oct  2 08:34:10 np0005466012 podman[245340]: 2025-10-02 12:34:10.429461742 +0000 UTC m=+0.094615187 container cleanup e10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  2 08:34:10 np0005466012 systemd[1]: libpod-conmon-e10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d.scope: Deactivated successfully.
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.449 2 INFO nova.virt.libvirt.driver [-] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Instance destroyed successfully.#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.449 2 DEBUG nova.objects.instance [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'resources' on Instance uuid 619d5560-c0d0-4c72-9778-96b5f71ac7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.461 2 DEBUG nova.virt.libvirt.vif [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:33:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1162098003',display_name='tempest-TestGettingAddress-server-1162098003',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1162098003',id=149,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOjMDXDlGh7XWskIogxPxg3/debwat727C6FiGGYuKMh4jN83iNAp0gEQFUOyWskdK5DuOPQyXrWxnq0VTv+25W2TLuxAMtNSrcXSqgdODflHSjkV04SZMqyvlJudVrow==',key_name='tempest-TestGettingAddress-1577770950',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:33:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-58cfhsbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:33:44Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=619d5560-c0d0-4c72-9778-96b5f71ac7f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eeff04b2-b580-4bba-b737-91de18ae78cc", "address": "fa:16:3e:6c:b3:03", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeff04b2-b5", "ovs_interfaceid": "eeff04b2-b580-4bba-b737-91de18ae78cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.461 2 DEBUG nova.network.os_vif_util [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "eeff04b2-b580-4bba-b737-91de18ae78cc", "address": "fa:16:3e:6c:b3:03", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeff04b2-b5", "ovs_interfaceid": "eeff04b2-b580-4bba-b737-91de18ae78cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.462 2 DEBUG nova.network.os_vif_util [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6c:b3:03,bridge_name='br-int',has_traffic_filtering=True,id=eeff04b2-b580-4bba-b737-91de18ae78cc,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeff04b2-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.462 2 DEBUG os_vif [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:b3:03,bridge_name='br-int',has_traffic_filtering=True,id=eeff04b2-b580-4bba-b737-91de18ae78cc,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeff04b2-b5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.464 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeeff04b2-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.473 2 INFO os_vif [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:b3:03,bridge_name='br-int',has_traffic_filtering=True,id=eeff04b2-b580-4bba-b737-91de18ae78cc,network=Network(48ae5e44-4c0f-44dd-b2b0-7bd3123da141),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeff04b2-b5')#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.474 2 DEBUG nova.virt.libvirt.vif [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:33:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1162098003',display_name='tempest-TestGettingAddress-server-1162098003',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1162098003',id=149,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOjMDXDlGh7XWskIogxPxg3/debwat727C6FiGGYuKMh4jN83iNAp0gEQFUOyWskdK5DuOPQyXrWxnq0VTv+25W2TLuxAMtNSrcXSqgdODflHSjkV04SZMqyvlJudVrow==',key_name='tempest-TestGettingAddress-1577770950',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:33:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-58cfhsbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:33:44Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=619d5560-c0d0-4c72-9778-96b5f71ac7f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c0631492-3acc-4088-b2ff-bff23df03863", "address": "fa:16:3e:7e:20:2d", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": 
"2001:db8:0:1:f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0631492-3a", "ovs_interfaceid": "c0631492-3acc-4088-b2ff-bff23df03863", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.474 2 DEBUG nova.network.os_vif_util [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "c0631492-3acc-4088-b2ff-bff23df03863", "address": "fa:16:3e:7e:20:2d", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0631492-3a", "ovs_interfaceid": "c0631492-3acc-4088-b2ff-bff23df03863", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.475 2 DEBUG nova.network.os_vif_util [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7e:20:2d,bridge_name='br-int',has_traffic_filtering=True,id=c0631492-3acc-4088-b2ff-bff23df03863,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0631492-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.476 2 DEBUG os_vif [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7e:20:2d,bridge_name='br-int',has_traffic_filtering=True,id=c0631492-3acc-4088-b2ff-bff23df03863,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0631492-3a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.477 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc0631492-3a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.484 2 INFO os_vif [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7e:20:2d,bridge_name='br-int',has_traffic_filtering=True,id=c0631492-3acc-4088-b2ff-bff23df03863,network=Network(f55e0845-fc62-481d-a70d-8546faf2b8fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0631492-3a')#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.485 2 INFO nova.virt.libvirt.driver [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Deleting instance files /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2_del#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.486 2 INFO nova.virt.libvirt.driver [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Deletion of /var/lib/nova/instances/619d5560-c0d0-4c72-9778-96b5f71ac7f2_del complete#033[00m
Oct  2 08:34:10 np0005466012 podman[245393]: 2025-10-02 12:34:10.499945749 +0000 UTC m=+0.046212754 container remove e10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.506 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c9b58522-58d5-407a-ad9b-886a2d00ce91]: (4, ('Thu Oct  2 12:34:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141 (e10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d)\ne10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d\nThu Oct  2 12:34:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141 (e10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d)\ne10008ec0791896045a26257427f9f8aadb82b9124e6360eec18e0187a928a7d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.508 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9147d6c5-249e-435c-a31f-ff9c9b7c85a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.509 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48ae5e44-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:10 np0005466012 kernel: tap48ae5e44-40: left promiscuous mode
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.531 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[33fbfd4b-d96a-4316-8ca7-e06c87d03995]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.554 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ed7b70ef-285d-4614-ad3e-1b5cf42122ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.555 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d20d9bf9-20bc-4531-b199-b47cfb41927d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.574 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf7ac14-00cd-4090-b309-0a4e250379d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641724, 'reachable_time': 17909, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245410, 'error': None, 'target': 'ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.576 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-48ae5e44-4c0f-44dd-b2b0-7bd3123da141 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:34:10 np0005466012 systemd[1]: run-netns-ovnmeta\x2d48ae5e44\x2d4c0f\x2d44dd\x2db2b0\x2d7bd3123da141.mount: Deactivated successfully.
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.576 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[34765947-e5e1-40c8-9ec7-5db5b6f1e65f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.579 103246 INFO neutron.agent.ovn.metadata.agent [-] Port c0631492-3acc-4088-b2ff-bff23df03863 in datapath f55e0845-fc62-481d-a70d-8546faf2b8fb unbound from our chassis#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.580 2 INFO nova.compute.manager [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.582 2 DEBUG oslo.service.loopingcall [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.582 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f55e0845-fc62-481d-a70d-8546faf2b8fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.582 2 DEBUG nova.compute.manager [-] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.582 2 DEBUG nova.network.neutron [-] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.583 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[67ae4dc9-d615-4be3-b51f-e5a129ae6531]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.583 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb namespace which is not needed anymore#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.596 2 DEBUG nova.compute.manager [req-e1a53e3c-bafd-4c5b-bfa5-f6222d495b87 req-a27e95f3-3815-436d-9143-a3744343a662 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-vif-unplugged-eeff04b2-b580-4bba-b737-91de18ae78cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.597 2 DEBUG oslo_concurrency.lockutils [req-e1a53e3c-bafd-4c5b-bfa5-f6222d495b87 req-a27e95f3-3815-436d-9143-a3744343a662 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.597 2 DEBUG oslo_concurrency.lockutils [req-e1a53e3c-bafd-4c5b-bfa5-f6222d495b87 req-a27e95f3-3815-436d-9143-a3744343a662 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.598 2 DEBUG oslo_concurrency.lockutils [req-e1a53e3c-bafd-4c5b-bfa5-f6222d495b87 req-a27e95f3-3815-436d-9143-a3744343a662 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.598 2 DEBUG nova.compute.manager [req-e1a53e3c-bafd-4c5b-bfa5-f6222d495b87 req-a27e95f3-3815-436d-9143-a3744343a662 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] No waiting events found dispatching network-vif-unplugged-eeff04b2-b580-4bba-b737-91de18ae78cc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.599 2 DEBUG nova.compute.manager [req-e1a53e3c-bafd-4c5b-bfa5-f6222d495b87 req-a27e95f3-3815-436d-9143-a3744343a662 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-vif-unplugged-eeff04b2-b580-4bba-b737-91de18ae78cc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:34:10 np0005466012 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[245169]: [NOTICE]   (245188) : haproxy version is 2.8.14-c23fe91
Oct  2 08:34:10 np0005466012 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[245169]: [NOTICE]   (245188) : path to executable is /usr/sbin/haproxy
Oct  2 08:34:10 np0005466012 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[245169]: [WARNING]  (245188) : Exiting Master process...
Oct  2 08:34:10 np0005466012 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[245169]: [WARNING]  (245188) : Exiting Master process...
Oct  2 08:34:10 np0005466012 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[245169]: [ALERT]    (245188) : Current worker (245191) exited with code 143 (Terminated)
Oct  2 08:34:10 np0005466012 neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb[245169]: [WARNING]  (245188) : All workers exited. Exiting... (0)
Oct  2 08:34:10 np0005466012 systemd[1]: libpod-d5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4.scope: Deactivated successfully.
Oct  2 08:34:10 np0005466012 podman[245428]: 2025-10-02 12:34:10.714165106 +0000 UTC m=+0.046243175 container died d5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:34:10 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4-userdata-shm.mount: Deactivated successfully.
Oct  2 08:34:10 np0005466012 systemd[1]: var-lib-containers-storage-overlay-741f97396c0a7f7ecabdbe2d016558cbc1a45c9871a85760bd73529d1931506c-merged.mount: Deactivated successfully.
Oct  2 08:34:10 np0005466012 podman[245428]: 2025-10-02 12:34:10.754113485 +0000 UTC m=+0.086191554 container cleanup d5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:34:10 np0005466012 systemd[1]: libpod-conmon-d5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4.scope: Deactivated successfully.
Oct  2 08:34:10 np0005466012 podman[245459]: 2025-10-02 12:34:10.82020431 +0000 UTC m=+0.044279641 container remove d5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.826 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f97ff694-bdf0-407a-8976-933e84c959fe]: (4, ('Thu Oct  2 12:34:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb (d5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4)\nd5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4\nThu Oct  2 12:34:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb (d5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4)\nd5a22692d5581dfa56fda98b858463b6c29f1e4a3c2683f0c3a797ad80de3fb4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.828 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f946e5df-b9ad-425f-8502-ff2092e838c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.830 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf55e0845-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 kernel: tapf55e0845-f0: left promiscuous mode
Oct  2 08:34:10 np0005466012 nova_compute[192063]: 2025-10-02 12:34:10.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.851 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[59b1b98d-5883-43ab-9b03-167e1e6b4c15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.879 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5f7def45-b2af-4eba-a9de-5bd9e62bb5d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.880 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[633331db-7050-4471-83ca-3f21b0fb93fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.897 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae9e920-afff-41ee-ba56-5098fd3364ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641818, 'reachable_time': 40480, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245474, 'error': None, 'target': 'ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.899 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f55e0845-fc62-481d-a70d-8546faf2b8fb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:34:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:10.899 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[fecd6e58-53fa-4c61-8552-04edc4176a29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.353 2 DEBUG nova.compute.manager [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-vif-unplugged-c0631492-3acc-4088-b2ff-bff23df03863 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.354 2 DEBUG oslo_concurrency.lockutils [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.355 2 DEBUG oslo_concurrency.lockutils [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.355 2 DEBUG oslo_concurrency.lockutils [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.356 2 DEBUG nova.compute.manager [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] No waiting events found dispatching network-vif-unplugged-c0631492-3acc-4088-b2ff-bff23df03863 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.356 2 DEBUG nova.compute.manager [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-vif-unplugged-c0631492-3acc-4088-b2ff-bff23df03863 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.357 2 DEBUG nova.compute.manager [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-vif-plugged-c0631492-3acc-4088-b2ff-bff23df03863 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.357 2 DEBUG oslo_concurrency.lockutils [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.358 2 DEBUG oslo_concurrency.lockutils [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.358 2 DEBUG oslo_concurrency.lockutils [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.359 2 DEBUG nova.compute.manager [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] No waiting events found dispatching network-vif-plugged-c0631492-3acc-4088-b2ff-bff23df03863 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.359 2 WARNING nova.compute.manager [req-e496957b-379f-4a95-91d7-c391dd819c62 req-c4f546f6-6fef-4e9c-ae60-97c3ab90f173 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received unexpected event network-vif-plugged-c0631492-3acc-4088-b2ff-bff23df03863 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:34:11 np0005466012 systemd[1]: run-netns-ovnmeta\x2df55e0845\x2dfc62\x2d481d\x2da70d\x2d8546faf2b8fb.mount: Deactivated successfully.
Oct  2 08:34:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:11.874 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:11 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:11.875 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.900 2 DEBUG nova.network.neutron [req-4cae42c9-9920-4f27-80c1-87f6fa7af1e3 req-b5606b46-85a2-4b91-b83c-f012dfb66c02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Updated VIF entry in instance network info cache for port eeff04b2-b580-4bba-b737-91de18ae78cc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.901 2 DEBUG nova.network.neutron [req-4cae42c9-9920-4f27-80c1-87f6fa7af1e3 req-b5606b46-85a2-4b91-b83c-f012dfb66c02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Updating instance_info_cache with network_info: [{"id": "eeff04b2-b580-4bba-b737-91de18ae78cc", "address": "fa:16:3e:6c:b3:03", "network": {"id": "48ae5e44-4c0f-44dd-b2b0-7bd3123da141", "bridge": "br-int", "label": "tempest-network-smoke--581239262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeff04b2-b5", "ovs_interfaceid": "eeff04b2-b580-4bba-b737-91de18ae78cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c0631492-3acc-4088-b2ff-bff23df03863", "address": "fa:16:3e:7e:20:2d", "network": {"id": "f55e0845-fc62-481d-a70d-8546faf2b8fb", "bridge": "br-int", "label": "tempest-network-smoke--2003085585", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7e:202d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0631492-3a", "ovs_interfaceid": "c0631492-3acc-4088-b2ff-bff23df03863", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:11 np0005466012 nova_compute[192063]: 2025-10-02 12:34:11.937 2 DEBUG oslo_concurrency.lockutils [req-4cae42c9-9920-4f27-80c1-87f6fa7af1e3 req-b5606b46-85a2-4b91-b83c-f012dfb66c02 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-619d5560-c0d0-4c72-9778-96b5f71ac7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.321 2 DEBUG nova.network.neutron [-] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.342 2 INFO nova.compute.manager [-] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Took 1.76 seconds to deallocate network for instance.#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.424 2 DEBUG oslo_concurrency.lockutils [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.424 2 DEBUG oslo_concurrency.lockutils [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.489 2 DEBUG nova.compute.provider_tree [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.506 2 DEBUG nova.scheduler.client.report [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.533 2 DEBUG oslo_concurrency.lockutils [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.566 2 INFO nova.scheduler.client.report [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Deleted allocations for instance 619d5560-c0d0-4c72-9778-96b5f71ac7f2#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.630 2 DEBUG oslo_concurrency.lockutils [None req-dc0846dc-01d7-416a-871d-c3a738e775c2 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.476s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.691 2 DEBUG nova.compute.manager [req-984927d5-9d9c-4102-9cbd-5d08df647606 req-942665dd-9886-4839-b7dc-4befc2b6f428 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-vif-plugged-eeff04b2-b580-4bba-b737-91de18ae78cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.692 2 DEBUG oslo_concurrency.lockutils [req-984927d5-9d9c-4102-9cbd-5d08df647606 req-942665dd-9886-4839-b7dc-4befc2b6f428 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.692 2 DEBUG oslo_concurrency.lockutils [req-984927d5-9d9c-4102-9cbd-5d08df647606 req-942665dd-9886-4839-b7dc-4befc2b6f428 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.692 2 DEBUG oslo_concurrency.lockutils [req-984927d5-9d9c-4102-9cbd-5d08df647606 req-942665dd-9886-4839-b7dc-4befc2b6f428 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "619d5560-c0d0-4c72-9778-96b5f71ac7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.693 2 DEBUG nova.compute.manager [req-984927d5-9d9c-4102-9cbd-5d08df647606 req-942665dd-9886-4839-b7dc-4befc2b6f428 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] No waiting events found dispatching network-vif-plugged-eeff04b2-b580-4bba-b737-91de18ae78cc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.693 2 WARNING nova.compute.manager [req-984927d5-9d9c-4102-9cbd-5d08df647606 req-942665dd-9886-4839-b7dc-4befc2b6f428 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received unexpected event network-vif-plugged-eeff04b2-b580-4bba-b737-91de18ae78cc for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.693 2 DEBUG nova.compute.manager [req-984927d5-9d9c-4102-9cbd-5d08df647606 req-942665dd-9886-4839-b7dc-4befc2b6f428 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-vif-deleted-eeff04b2-b580-4bba-b737-91de18ae78cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.694 2 DEBUG nova.compute.manager [req-984927d5-9d9c-4102-9cbd-5d08df647606 req-942665dd-9886-4839-b7dc-4befc2b6f428 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Received event network-vif-deleted-c0631492-3acc-4088-b2ff-bff23df03863 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:12 np0005466012 nova_compute[192063]: 2025-10-02 12:34:12.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:14 np0005466012 podman[245475]: 2025-10-02 12:34:14.146599373 +0000 UTC m=+0.059718249 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:34:14 np0005466012 podman[245476]: 2025-10-02 12:34:14.180213576 +0000 UTC m=+0.091873882 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:34:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:14.876 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:15 np0005466012 nova_compute[192063]: 2025-10-02 12:34:15.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:16 np0005466012 podman[245524]: 2025-10-02 12:34:16.14928474 +0000 UTC m=+0.057926580 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:34:16 np0005466012 podman[245525]: 2025-10-02 12:34:16.154509915 +0000 UTC m=+0.059770940 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:34:17 np0005466012 nova_compute[192063]: 2025-10-02 12:34:17.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:20 np0005466012 nova_compute[192063]: 2025-10-02 12:34:20.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:22 np0005466012 nova_compute[192063]: 2025-10-02 12:34:22.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:22 np0005466012 nova_compute[192063]: 2025-10-02 12:34:22.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:23 np0005466012 nova_compute[192063]: 2025-10-02 12:34:23.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:25 np0005466012 nova_compute[192063]: 2025-10-02 12:34:25.448 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408450.4464283, 619d5560-c0d0-4c72-9778-96b5f71ac7f2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:25 np0005466012 nova_compute[192063]: 2025-10-02 12:34:25.449 2 INFO nova.compute.manager [-] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:34:25 np0005466012 nova_compute[192063]: 2025-10-02 12:34:25.464 2 DEBUG nova.compute.manager [None req-4959c799-2d08-4b10-a5cc-72c32729bb91 - - - - - -] [instance: 619d5560-c0d0-4c72-9778-96b5f71ac7f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:25 np0005466012 nova_compute[192063]: 2025-10-02 12:34:25.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:27 np0005466012 podman[245565]: 2025-10-02 12:34:27.143635284 +0000 UTC m=+0.054570147 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  2 08:34:27 np0005466012 podman[245566]: 2025-10-02 12:34:27.151568894 +0000 UTC m=+0.058042253 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible)
Oct  2 08:34:27 np0005466012 nova_compute[192063]: 2025-10-02 12:34:27.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:30 np0005466012 podman[245604]: 2025-10-02 12:34:30.158952963 +0000 UTC m=+0.066354903 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid)
Oct  2 08:34:30 np0005466012 podman[245605]: 2025-10-02 12:34:30.159633472 +0000 UTC m=+0.065887950 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:34:30 np0005466012 nova_compute[192063]: 2025-10-02 12:34:30.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:32 np0005466012 nova_compute[192063]: 2025-10-02 12:34:32.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:32 np0005466012 nova_compute[192063]: 2025-10-02 12:34:32.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:32 np0005466012 nova_compute[192063]: 2025-10-02 12:34:32.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:34:33 np0005466012 nova_compute[192063]: 2025-10-02 12:34:33.842 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:34 np0005466012 nova_compute[192063]: 2025-10-02 12:34:34.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:35 np0005466012 nova_compute[192063]: 2025-10-02 12:34:35.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:36 np0005466012 nova_compute[192063]: 2025-10-02 12:34:36.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:37 np0005466012 nova_compute[192063]: 2025-10-02 12:34:37.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:40 np0005466012 nova_compute[192063]: 2025-10-02 12:34:40.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:40 np0005466012 nova_compute[192063]: 2025-10-02 12:34:40.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:41 np0005466012 nova_compute[192063]: 2025-10-02 12:34:41.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:41 np0005466012 nova_compute[192063]: 2025-10-02 12:34:41.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:42 np0005466012 nova_compute[192063]: 2025-10-02 12:34:42.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:43 np0005466012 nova_compute[192063]: 2025-10-02 12:34:43.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:43 np0005466012 nova_compute[192063]: 2025-10-02 12:34:43.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:34:44 np0005466012 nova_compute[192063]: 2025-10-02 12:34:44.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:44 np0005466012 nova_compute[192063]: 2025-10-02 12:34:44.821 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:34:44 np0005466012 nova_compute[192063]: 2025-10-02 12:34:44.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:34:44 np0005466012 nova_compute[192063]: 2025-10-02 12:34:44.841 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:34:44 np0005466012 nova_compute[192063]: 2025-10-02 12:34:44.842 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:44 np0005466012 nova_compute[192063]: 2025-10-02 12:34:44.864 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:44 np0005466012 nova_compute[192063]: 2025-10-02 12:34:44.864 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:44 np0005466012 nova_compute[192063]: 2025-10-02 12:34:44.864 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:44 np0005466012 nova_compute[192063]: 2025-10-02 12:34:44.864 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:34:44 np0005466012 podman[245650]: 2025-10-02 12:34:44.959377948 +0000 UTC m=+0.052972992 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:34:44 np0005466012 podman[245652]: 2025-10-02 12:34:44.987584511 +0000 UTC m=+0.078080788 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:34:45 np0005466012 nova_compute[192063]: 2025-10-02 12:34:45.027 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:34:45 np0005466012 nova_compute[192063]: 2025-10-02 12:34:45.028 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5713MB free_disk=73.24303817749023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:34:45 np0005466012 nova_compute[192063]: 2025-10-02 12:34:45.028 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:45 np0005466012 nova_compute[192063]: 2025-10-02 12:34:45.028 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:45 np0005466012 nova_compute[192063]: 2025-10-02 12:34:45.105 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:34:45 np0005466012 nova_compute[192063]: 2025-10-02 12:34:45.105 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:34:45 np0005466012 nova_compute[192063]: 2025-10-02 12:34:45.139 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:45 np0005466012 nova_compute[192063]: 2025-10-02 12:34:45.162 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:45 np0005466012 nova_compute[192063]: 2025-10-02 12:34:45.192 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:34:45 np0005466012 nova_compute[192063]: 2025-10-02 12:34:45.192 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:45 np0005466012 nova_compute[192063]: 2025-10-02 12:34:45.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:46 np0005466012 nova_compute[192063]: 2025-10-02 12:34:46.171 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:34:47 np0005466012 podman[245701]: 2025-10-02 12:34:47.139867521 +0000 UTC m=+0.053048064 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  2 08:34:47 np0005466012 podman[245700]: 2025-10-02 12:34:47.140726654 +0000 UTC m=+0.060504311 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:34:47 np0005466012 nova_compute[192063]: 2025-10-02 12:34:47.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:50 np0005466012 nova_compute[192063]: 2025-10-02 12:34:50.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:51 np0005466012 nova_compute[192063]: 2025-10-02 12:34:51.880 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "58a588a8-3fb2-484e-82f6-7c72285d22de" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:51 np0005466012 nova_compute[192063]: 2025-10-02 12:34:51.880 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:51 np0005466012 nova_compute[192063]: 2025-10-02 12:34:51.904 2 DEBUG nova.compute.manager [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.067 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.068 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.074 2 DEBUG nova.virt.hardware [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.074 2 INFO nova.compute.claims [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.275 2 DEBUG nova.compute.provider_tree [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.291 2 DEBUG nova.scheduler.client.report [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.327 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.328 2 DEBUG nova.compute.manager [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.405 2 DEBUG nova.compute.manager [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.406 2 DEBUG nova.network.neutron [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.427 2 INFO nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.453 2 DEBUG nova.compute.manager [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.631 2 DEBUG nova.compute.manager [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.633 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.634 2 INFO nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Creating image(s)#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.635 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "/var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.635 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.637 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "/var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.663 2 DEBUG oslo_concurrency.processutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.714 2 DEBUG nova.policy [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.756 2 DEBUG oslo_concurrency.processutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.757 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.758 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.784 2 DEBUG oslo_concurrency.processutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.866 2 DEBUG oslo_concurrency.processutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.867 2 DEBUG oslo_concurrency.processutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.920 2 DEBUG oslo_concurrency.processutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.921 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.922 2 DEBUG oslo_concurrency.processutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.993 2 DEBUG oslo_concurrency.processutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.995 2 DEBUG nova.virt.disk.api [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Checking if we can resize image /var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:34:52 np0005466012 nova_compute[192063]: 2025-10-02 12:34:52.996 2 DEBUG oslo_concurrency.processutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:53 np0005466012 nova_compute[192063]: 2025-10-02 12:34:53.094 2 DEBUG oslo_concurrency.processutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:53 np0005466012 nova_compute[192063]: 2025-10-02 12:34:53.096 2 DEBUG nova.virt.disk.api [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Cannot resize image /var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:34:53 np0005466012 nova_compute[192063]: 2025-10-02 12:34:53.097 2 DEBUG nova.objects.instance [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'migration_context' on Instance uuid 58a588a8-3fb2-484e-82f6-7c72285d22de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:53 np0005466012 nova_compute[192063]: 2025-10-02 12:34:53.113 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:34:53 np0005466012 nova_compute[192063]: 2025-10-02 12:34:53.113 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Ensure instance console log exists: /var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:34:53 np0005466012 nova_compute[192063]: 2025-10-02 12:34:53.114 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:53 np0005466012 nova_compute[192063]: 2025-10-02 12:34:53.115 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:53 np0005466012 nova_compute[192063]: 2025-10-02 12:34:53.115 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:54 np0005466012 nova_compute[192063]: 2025-10-02 12:34:54.095 2 DEBUG nova.network.neutron [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Successfully created port: 82858700-6e07-4f6e-b7ae-45a35721505d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:34:55 np0005466012 nova_compute[192063]: 2025-10-02 12:34:55.445 2 DEBUG nova.network.neutron [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Successfully updated port: 82858700-6e07-4f6e-b7ae-45a35721505d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:34:55 np0005466012 nova_compute[192063]: 2025-10-02 12:34:55.462 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:55 np0005466012 nova_compute[192063]: 2025-10-02 12:34:55.463 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquired lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:55 np0005466012 nova_compute[192063]: 2025-10-02 12:34:55.463 2 DEBUG nova.network.neutron [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:34:55 np0005466012 nova_compute[192063]: 2025-10-02 12:34:55.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:55 np0005466012 nova_compute[192063]: 2025-10-02 12:34:55.705 2 DEBUG nova.network.neutron [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:34:55 np0005466012 nova_compute[192063]: 2025-10-02 12:34:55.738 2 DEBUG nova.compute.manager [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received event network-changed-82858700-6e07-4f6e-b7ae-45a35721505d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:55 np0005466012 nova_compute[192063]: 2025-10-02 12:34:55.738 2 DEBUG nova.compute.manager [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Refreshing instance network info cache due to event network-changed-82858700-6e07-4f6e-b7ae-45a35721505d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:34:55 np0005466012 nova_compute[192063]: 2025-10-02 12:34:55.738 2 DEBUG oslo_concurrency.lockutils [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.599 2 DEBUG nova.network.neutron [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Updating instance_info_cache with network_info: [{"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.674 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Releasing lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.674 2 DEBUG nova.compute.manager [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Instance network_info: |[{"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.675 2 DEBUG oslo_concurrency.lockutils [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.675 2 DEBUG nova.network.neutron [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Refreshing network info cache for port 82858700-6e07-4f6e-b7ae-45a35721505d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.678 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Start _get_guest_xml network_info=[{"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.682 2 WARNING nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.687 2 DEBUG nova.virt.libvirt.host [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.688 2 DEBUG nova.virt.libvirt.host [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.690 2 DEBUG nova.virt.libvirt.host [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.691 2 DEBUG nova.virt.libvirt.host [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.692 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.692 2 DEBUG nova.virt.hardware [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.693 2 DEBUG nova.virt.hardware [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.693 2 DEBUG nova.virt.hardware [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.693 2 DEBUG nova.virt.hardware [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.693 2 DEBUG nova.virt.hardware [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.694 2 DEBUG nova.virt.hardware [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.694 2 DEBUG nova.virt.hardware [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.694 2 DEBUG nova.virt.hardware [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.694 2 DEBUG nova.virt.hardware [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.695 2 DEBUG nova.virt.hardware [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.695 2 DEBUG nova.virt.hardware [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.699 2 DEBUG nova.virt.libvirt.vif [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1508357503',display_name='tempest-TestNetworkAdvancedServerOps-server-1508357503',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1508357503',id=155,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMh5YVc11xSdLclfb/KTn15iGTsatAtqMZ7/UObWZZ5Nty3g3yyO/+DJDP7MGesQP/RWjM47g+iXThApVJzS5WKw8zhlW4lZ6XTzYMT3T509KEtu4Fz7GVvmbxDEgScjfw==',key_name='tempest-TestNetworkAdvancedServerOps-1292664354',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-yyzzomqd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:52Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=58a588a8-3fb2-484e-82f6-7c72285d22de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.699 2 DEBUG nova.network.os_vif_util [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.700 2 DEBUG nova.network.os_vif_util [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=82858700-6e07-4f6e-b7ae-45a35721505d,network=Network(d6186c2e-33cd-4f99-8140-bddbba9d07d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82858700-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.701 2 DEBUG nova.objects.instance [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 58a588a8-3fb2-484e-82f6-7c72285d22de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.835 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  <uuid>58a588a8-3fb2-484e-82f6-7c72285d22de</uuid>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  <name>instance-0000009b</name>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1508357503</nova:name>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:34:56</nova:creationTime>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:34:56 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:        <nova:user uuid="1faa7e121a0e43ad8cb4ae5b2cfcc6a2">tempest-TestNetworkAdvancedServerOps-597114071-project-member</nova:user>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:        <nova:project uuid="76c7dd40d83e4e3ca71abbebf57921b6">tempest-TestNetworkAdvancedServerOps-597114071</nova:project>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:        <nova:port uuid="82858700-6e07-4f6e-b7ae-45a35721505d">
Oct  2 08:34:56 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <entry name="serial">58a588a8-3fb2-484e-82f6-7c72285d22de</entry>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <entry name="uuid">58a588a8-3fb2-484e-82f6-7c72285d22de</entry>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk.config"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:77:ce:fa"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <target dev="tap82858700-6e"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/console.log" append="off"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:34:56 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:34:56 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:34:56 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:34:56 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.836 2 DEBUG nova.compute.manager [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Preparing to wait for external event network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.836 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.836 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.837 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.837 2 DEBUG nova.virt.libvirt.vif [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1508357503',display_name='tempest-TestNetworkAdvancedServerOps-server-1508357503',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1508357503',id=155,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMh5YVc11xSdLclfb/KTn15iGTsatAtqMZ7/UObWZZ5Nty3g3yyO/+DJDP7MGesQP/RWjM47g+iXThApVJzS5WKw8zhlW4lZ6XTzYMT3T509KEtu4Fz7GVvmbxDEgScjfw==',key_name='tempest-TestNetworkAdvancedServerOps-1292664354',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-yyzzomqd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:52Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=58a588a8-3fb2-484e-82f6-7c72285d22de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.838 2 DEBUG nova.network.os_vif_util [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.838 2 DEBUG nova.network.os_vif_util [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=82858700-6e07-4f6e-b7ae-45a35721505d,network=Network(d6186c2e-33cd-4f99-8140-bddbba9d07d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82858700-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.838 2 DEBUG os_vif [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=82858700-6e07-4f6e-b7ae-45a35721505d,network=Network(d6186c2e-33cd-4f99-8140-bddbba9d07d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82858700-6e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.839 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.840 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.842 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82858700-6e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.842 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82858700-6e, col_values=(('external_ids', {'iface-id': '82858700-6e07-4f6e-b7ae-45a35721505d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:ce:fa', 'vm-uuid': '58a588a8-3fb2-484e-82f6-7c72285d22de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:56 np0005466012 NetworkManager[51207]: <info>  [1759408496.8840] manager: (tap82858700-6e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/286)
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.892 2 INFO os_vif [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=82858700-6e07-4f6e-b7ae-45a35721505d,network=Network(d6186c2e-33cd-4f99-8140-bddbba9d07d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82858700-6e')#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.940 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.940 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.940 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] No VIF found with MAC fa:16:3e:77:ce:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:34:56 np0005466012 nova_compute[192063]: 2025-10-02 12:34:56.941 2 INFO nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Using config drive#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.332 2 INFO nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Creating config drive at /var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk.config#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.341 2 DEBUG oslo_concurrency.processutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbyiyoi1f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.473 2 DEBUG oslo_concurrency.processutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbyiyoi1f" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:57 np0005466012 kernel: tap82858700-6e: entered promiscuous mode
Oct  2 08:34:57 np0005466012 NetworkManager[51207]: <info>  [1759408497.5731] manager: (tap82858700-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/287)
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:57 np0005466012 ovn_controller[94284]: 2025-10-02T12:34:57Z|00618|binding|INFO|Claiming lport 82858700-6e07-4f6e-b7ae-45a35721505d for this chassis.
Oct  2 08:34:57 np0005466012 ovn_controller[94284]: 2025-10-02T12:34:57Z|00619|binding|INFO|82858700-6e07-4f6e-b7ae-45a35721505d: Claiming fa:16:3e:77:ce:fa 10.100.0.9
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.598 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:ce:fa 10.100.0.9'], port_security=['fa:16:3e:77:ce:fa 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cb51de21-4073-4a91-9994-dd0124090f6b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=485509cf-159a-4a14-9aa8-dc0cdb38eb16, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=82858700-6e07-4f6e-b7ae-45a35721505d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.600 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 82858700-6e07-4f6e-b7ae-45a35721505d in datapath d6186c2e-33cd-4f99-8140-bddbba9d07d0 bound to our chassis#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.602 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6186c2e-33cd-4f99-8140-bddbba9d07d0#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.620 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e43bb419-8b83-4942-a4a7-a7f9a05b952c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.621 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6186c2e-31 in ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.622 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6186c2e-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.622 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[062a553a-0713-4f52-b876-83ebcaf2233c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.623 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ded8b2b0-f803-47ce-a9ce-aa5c9af7a431]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.636 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[67957224-1847-47e4-b605-60407b1aff2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:57 np0005466012 systemd-udevd[245815]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:34:57 np0005466012 ovn_controller[94284]: 2025-10-02T12:34:57Z|00620|binding|INFO|Setting lport 82858700-6e07-4f6e-b7ae-45a35721505d ovn-installed in OVS
Oct  2 08:34:57 np0005466012 ovn_controller[94284]: 2025-10-02T12:34:57Z|00621|binding|INFO|Setting lport 82858700-6e07-4f6e-b7ae-45a35721505d up in Southbound
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.664 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cf550582-1bb7-48f7-9645-909d6a1a2a23]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 podman[245768]: 2025-10-02 12:34:57.664898115 +0000 UTC m=+0.110515849 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:34:57 np0005466012 podman[245769]: 2025-10-02 12:34:57.665588754 +0000 UTC m=+0.096058617 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 08:34:57 np0005466012 NetworkManager[51207]: <info>  [1759408497.6701] device (tap82858700-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:34:57 np0005466012 NetworkManager[51207]: <info>  [1759408497.6710] device (tap82858700-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:34:57 np0005466012 systemd-machined[152114]: New machine qemu-71-instance-0000009b.
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.698 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[eb220a37-8b78-4a56-a8cd-bbedb601d6cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 systemd[1]: Started Virtual Machine qemu-71-instance-0000009b.
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.702 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[247f3265-5763-4f59-a7ab-f67d45dd9269]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 NetworkManager[51207]: <info>  [1759408497.7038] manager: (tapd6186c2e-30): new Veth device (/org/freedesktop/NetworkManager/Devices/288)
Oct  2 08:34:57 np0005466012 systemd-udevd[245818]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.706 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.707 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.727 2 DEBUG nova.compute.manager [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.741 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[5b6f1bfc-ec94-4d27-aaf2-9eb33159b125]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.744 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[cf5dde78-3dce-4ed7-9de6-66816838ec22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:57 np0005466012 NetworkManager[51207]: <info>  [1759408497.7787] device (tapd6186c2e-30): carrier: link connected
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.785 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[17c195e3-e33b-4c7d-a47e-cd7971b3aed8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.805 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[682263d4-1949-4418-b061-d7ce303de995]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6186c2e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:84:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 191], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 649139, 'reachable_time': 38606, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245848, 'error': None, 'target': 'ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.826 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8ceef96b-59a4-4b12-8f17-92ba76f3b781]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:841b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 649139, 'tstamp': 649139}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245849, 'error': None, 'target': 'ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.847 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a370e7cf-110c-4060-a48f-8e94168ebb95]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6186c2e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:84:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 191], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 649139, 'reachable_time': 38606, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245850, 'error': None, 'target': 'ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.881 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3e2d232c-8133-46bf-9a0d-19737db4aa4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.911 2 DEBUG nova.compute.manager [req-9585b578-4c52-4386-b575-0bb496ab4c06 req-16fe6b31-89f8-4ce2-8c26-2280c88326bb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received event network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.911 2 DEBUG oslo_concurrency.lockutils [req-9585b578-4c52-4386-b575-0bb496ab4c06 req-16fe6b31-89f8-4ce2-8c26-2280c88326bb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.912 2 DEBUG oslo_concurrency.lockutils [req-9585b578-4c52-4386-b575-0bb496ab4c06 req-16fe6b31-89f8-4ce2-8c26-2280c88326bb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.913 2 DEBUG oslo_concurrency.lockutils [req-9585b578-4c52-4386-b575-0bb496ab4c06 req-16fe6b31-89f8-4ce2-8c26-2280c88326bb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.913 2 DEBUG nova.compute.manager [req-9585b578-4c52-4386-b575-0bb496ab4c06 req-16fe6b31-89f8-4ce2-8c26-2280c88326bb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Processing event network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.949 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6bb63e-5f48-4c5c-9ec1-956d2dc9858f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.951 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6186c2e-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.951 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:34:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:57.951 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6186c2e-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.972 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:57 np0005466012 nova_compute[192063]: 2025-10-02 12:34:57.973 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:58 np0005466012 NetworkManager[51207]: <info>  [1759408498.0450] manager: (tapd6186c2e-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/289)
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:58 np0005466012 kernel: tapd6186c2e-30: entered promiscuous mode
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:58.052 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6186c2e-30, col_values=(('external_ids', {'iface-id': '191808e4-38c4-4099-82ce-c40eaf416444'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:34:58 np0005466012 ovn_controller[94284]: 2025-10-02T12:34:58Z|00622|binding|INFO|Releasing lport 191808e4-38c4-4099-82ce-c40eaf416444 from this chassis (sb_readonly=0)
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.060 2 DEBUG nova.virt.hardware [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.061 2 INFO nova.compute.claims [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:58.074 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6186c2e-33cd-4f99-8140-bddbba9d07d0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6186c2e-33cd-4f99-8140-bddbba9d07d0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:58.075 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[48ede06a-ffe4-4732-a914-6f1c3077715e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:58.076 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-d6186c2e-33cd-4f99-8140-bddbba9d07d0
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/d6186c2e-33cd-4f99-8140-bddbba9d07d0.pid.haproxy
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID d6186c2e-33cd-4f99-8140-bddbba9d07d0
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:34:58 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:34:58.076 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'env', 'PROCESS_TAG=haproxy-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6186c2e-33cd-4f99-8140-bddbba9d07d0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.205 2 DEBUG nova.compute.provider_tree [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.223 2 DEBUG nova.scheduler.client.report [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.252 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.253 2 DEBUG nova.compute.manager [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.319 2 DEBUG nova.compute.manager [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.319 2 DEBUG nova.network.neutron [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.323 2 DEBUG nova.network.neutron [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Updated VIF entry in instance network info cache for port 82858700-6e07-4f6e-b7ae-45a35721505d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.323 2 DEBUG nova.network.neutron [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Updating instance_info_cache with network_info: [{"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.341 2 INFO nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.344 2 DEBUG oslo_concurrency.lockutils [req-eadcfb0a-42f2-48d6-b54a-a337ebae2f80 req-43e8ccda-ae1f-4579-9897-20eecb4007a3 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.359 2 DEBUG nova.compute.manager [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.517 2 DEBUG nova.policy [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:34:58 np0005466012 podman[245886]: 2025-10-02 12:34:58.520538709 +0000 UTC m=+0.103743881 container create a19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.524 2 DEBUG nova.compute.manager [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.525 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.526 2 INFO nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Creating image(s)#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.526 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "/var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.526 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "/var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.527 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "/var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:58 np0005466012 podman[245886]: 2025-10-02 12:34:58.439039546 +0000 UTC m=+0.022244778 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.538 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408498.5215423, 58a588a8-3fb2-484e-82f6-7c72285d22de => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.538 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] VM Started (Lifecycle Event)#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.540 2 DEBUG nova.compute.manager [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.541 2 DEBUG oslo_concurrency.processutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:58 np0005466012 systemd[1]: Started libpod-conmon-a19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27.scope.
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.561 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.564 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.567 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.571 2 INFO nova.virt.libvirt.driver [-] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Instance spawned successfully.#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.572 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:34:58 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.587 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.588 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408498.5266638, 58a588a8-3fb2-484e-82f6-7c72285d22de => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.588 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:34:58 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8429fd8cf6ceb90c34bd7d799adc9aa95b6067bc0253c0a5db529b102455c50c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.598 2 DEBUG oslo_concurrency.processutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.599 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.599 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:58 np0005466012 podman[245886]: 2025-10-02 12:34:58.60201236 +0000 UTC m=+0.185217552 container init a19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:34:58 np0005466012 podman[245886]: 2025-10-02 12:34:58.60846328 +0000 UTC m=+0.191668462 container start a19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.609 2 DEBUG oslo_concurrency.processutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.628 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.628 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.629 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.629 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.630 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.630 2 DEBUG nova.virt.libvirt.driver [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.633 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:58 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[245903]: [NOTICE]   (245909) : New worker (245912) forked
Oct  2 08:34:58 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[245903]: [NOTICE]   (245909) : Loading success.
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.639 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408498.5506625, 58a588a8-3fb2-484e-82f6-7c72285d22de => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.639 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.662 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.670 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.672 2 DEBUG oslo_concurrency.processutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.672 2 DEBUG oslo_concurrency.processutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.698 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.703 2 DEBUG oslo_concurrency.processutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.704 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.704 2 DEBUG oslo_concurrency.processutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.725 2 INFO nova.compute.manager [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Took 6.09 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.726 2 DEBUG nova.compute.manager [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.758 2 DEBUG oslo_concurrency.processutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.758 2 DEBUG nova.virt.disk.api [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Checking if we can resize image /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.759 2 DEBUG oslo_concurrency.processutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.803 2 INFO nova.compute.manager [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Took 6.78 seconds to build instance.#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.816 2 DEBUG oslo_concurrency.processutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.817 2 DEBUG nova.virt.disk.api [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Cannot resize image /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.817 2 DEBUG nova.objects.instance [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lazy-loading 'migration_context' on Instance uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.820 2 DEBUG oslo_concurrency.lockutils [None req-679bedfe-8b9c-4811-a226-64dfe7d7fe6f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.828 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.828 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Ensure instance console log exists: /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.828 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.828 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:34:58 np0005466012 nova_compute[192063]: 2025-10-02 12:34:58.829 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:34:59 np0005466012 nova_compute[192063]: 2025-10-02 12:34:59.366 2 DEBUG nova.network.neutron [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Successfully created port: 37bcb93a-8639-42b7-aafd-21f019307d66 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.037 2 DEBUG nova.compute.manager [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received event network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.038 2 DEBUG oslo_concurrency.lockutils [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.039 2 DEBUG oslo_concurrency.lockutils [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.039 2 DEBUG oslo_concurrency.lockutils [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.040 2 DEBUG nova.compute.manager [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] No waiting events found dispatching network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.040 2 WARNING nova.compute.manager [req-1b8f3de8-2e69-4e38-b3ac-5b5c28424a94 req-bd33c2ba-e793-4bd3-b55d-51dd5953f257 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received unexpected event network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d for instance with vm_state active and task_state None.#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.188 2 DEBUG nova.network.neutron [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Successfully updated port: 37bcb93a-8639-42b7-aafd-21f019307d66 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.205 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.205 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquired lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.206 2 DEBUG nova.network.neutron [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.393 2 DEBUG nova.compute.manager [req-2ca75b0e-766e-4e02-85a0-e7dfa70fac84 req-442ec3e3-3061-4dc8-a705-4a6506485866 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-changed-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.394 2 DEBUG nova.compute.manager [req-2ca75b0e-766e-4e02-85a0-e7dfa70fac84 req-442ec3e3-3061-4dc8-a705-4a6506485866 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Refreshing instance network info cache due to event network-changed-37bcb93a-8639-42b7-aafd-21f019307d66. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.394 2 DEBUG oslo_concurrency.lockutils [req-2ca75b0e-766e-4e02-85a0-e7dfa70fac84 req-442ec3e3-3061-4dc8-a705-4a6506485866 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:00 np0005466012 nova_compute[192063]: 2025-10-02 12:35:00.453 2 DEBUG nova.network.neutron [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:35:01 np0005466012 podman[245933]: 2025-10-02 12:35:01.188404832 +0000 UTC m=+0.083137059 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:35:01 np0005466012 podman[245932]: 2025-10-02 12:35:01.188053751 +0000 UTC m=+0.082689566 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible)
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.249 2 DEBUG nova.network.neutron [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updating instance_info_cache with network_info: [{"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.278 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Releasing lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.278 2 DEBUG nova.compute.manager [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Instance network_info: |[{"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.279 2 DEBUG oslo_concurrency.lockutils [req-2ca75b0e-766e-4e02-85a0-e7dfa70fac84 req-442ec3e3-3061-4dc8-a705-4a6506485866 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.279 2 DEBUG nova.network.neutron [req-2ca75b0e-766e-4e02-85a0-e7dfa70fac84 req-442ec3e3-3061-4dc8-a705-4a6506485866 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Refreshing network info cache for port 37bcb93a-8639-42b7-aafd-21f019307d66 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.282 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Start _get_guest_xml network_info=[{"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.287 2 WARNING nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.294 2 DEBUG nova.virt.libvirt.host [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.294 2 DEBUG nova.virt.libvirt.host [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.305 2 DEBUG nova.virt.libvirt.host [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.306 2 DEBUG nova.virt.libvirt.host [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.308 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.308 2 DEBUG nova.virt.hardware [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.309 2 DEBUG nova.virt.hardware [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.309 2 DEBUG nova.virt.hardware [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.309 2 DEBUG nova.virt.hardware [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.310 2 DEBUG nova.virt.hardware [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.310 2 DEBUG nova.virt.hardware [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.310 2 DEBUG nova.virt.hardware [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.310 2 DEBUG nova.virt.hardware [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.311 2 DEBUG nova.virt.hardware [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.311 2 DEBUG nova.virt.hardware [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.311 2 DEBUG nova.virt.hardware [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.316 2 DEBUG nova.virt.libvirt.vif [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1978368192',display_name='tempest-TestShelveInstance-server-1978368192',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1978368192',id=156,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE/XDGeJT9WLDU0HFLzjGkDPYXYzNnVTkBgasq3A2D12H9O5maW7G09qXMMNOwpxQcY9ezmdK5YuMVeh5Lmhul8cAhXsU4OmdH86TOpc/q67Xul+dL/ucyqS3TKQHf5rEA==',key_name='tempest-TestShelveInstance-537086882',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='086ee425cb0949ab836e1b3ae489ced0',ramdisk_id='',reservation_id='r-a0s1hayn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1329865483',owner_user_name='tempest-TestShelveInstance-1329865483-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:58Z,user_data=None,user_id='81e456ca7bee486181b9c11ddb1f3ffd',uuid=1d931a6f-0703-4e1f-acfc-b8402834c14d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.316 2 DEBUG nova.network.os_vif_util [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Converting VIF {"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.317 2 DEBUG nova.network.os_vif_util [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.318 2 DEBUG nova.objects.instance [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.333 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  <uuid>1d931a6f-0703-4e1f-acfc-b8402834c14d</uuid>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  <name>instance-0000009c</name>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestShelveInstance-server-1978368192</nova:name>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:35:01</nova:creationTime>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:35:01 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:        <nova:user uuid="81e456ca7bee486181b9c11ddb1f3ffd">tempest-TestShelveInstance-1329865483-project-member</nova:user>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:        <nova:project uuid="086ee425cb0949ab836e1b3ae489ced0">tempest-TestShelveInstance-1329865483</nova:project>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:        <nova:port uuid="37bcb93a-8639-42b7-aafd-21f019307d66">
Oct  2 08:35:01 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <entry name="serial">1d931a6f-0703-4e1f-acfc-b8402834c14d</entry>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <entry name="uuid">1d931a6f-0703-4e1f-acfc-b8402834c14d</entry>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.config"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:3b:d9:00"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <target dev="tap37bcb93a-86"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/console.log" append="off"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:35:01 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:35:01 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:35:01 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:35:01 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.334 2 DEBUG nova.compute.manager [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Preparing to wait for external event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.335 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.335 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.335 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.336 2 DEBUG nova.virt.libvirt.vif [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:34:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1978368192',display_name='tempest-TestShelveInstance-server-1978368192',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1978368192',id=156,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE/XDGeJT9WLDU0HFLzjGkDPYXYzNnVTkBgasq3A2D12H9O5maW7G09qXMMNOwpxQcY9ezmdK5YuMVeh5Lmhul8cAhXsU4OmdH86TOpc/q67Xul+dL/ucyqS3TKQHf5rEA==',key_name='tempest-TestShelveInstance-537086882',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='086ee425cb0949ab836e1b3ae489ced0',ramdisk_id='',reservation_id='r-a0s1hayn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1329865483',owner_user_name='tempest-TestShelveInstance-1329865483-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:34:58Z,user_data=None,user_id='81e456ca7bee486181b9c11ddb1f3ffd',uuid=1d931a6f-0703-4e1f-acfc-b8402834c14d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.336 2 DEBUG nova.network.os_vif_util [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Converting VIF {"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.337 2 DEBUG nova.network.os_vif_util [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.337 2 DEBUG os_vif [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.338 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.339 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.342 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37bcb93a-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.342 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap37bcb93a-86, col_values=(('external_ids', {'iface-id': '37bcb93a-8639-42b7-aafd-21f019307d66', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3b:d9:00', 'vm-uuid': '1d931a6f-0703-4e1f-acfc-b8402834c14d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:01 np0005466012 NetworkManager[51207]: <info>  [1759408501.3897] manager: (tap37bcb93a-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/290)
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.396 2 INFO os_vif [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86')#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.654 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.655 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.655 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] No VIF found with MAC fa:16:3e:3b:d9:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:35:01 np0005466012 nova_compute[192063]: 2025-10-02 12:35:01.655 2 INFO nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Using config drive#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.151 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.152 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.152 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.422 2 INFO nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Creating config drive at /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.config#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.432 2 DEBUG oslo_concurrency.processutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp42thd321 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:02 np0005466012 NetworkManager[51207]: <info>  [1759408502.4716] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Oct  2 08:35:02 np0005466012 NetworkManager[51207]: <info>  [1759408502.4731] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/292)
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.567 2 DEBUG oslo_concurrency.processutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp42thd321" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:02 np0005466012 NetworkManager[51207]: <info>  [1759408502.6326] manager: (tap37bcb93a-86): new Tun device (/org/freedesktop/NetworkManager/Devices/293)
Oct  2 08:35:02 np0005466012 kernel: tap37bcb93a-86: entered promiscuous mode
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:02 np0005466012 systemd-udevd[245996]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:35:02 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:02Z|00623|binding|INFO|Claiming lport 37bcb93a-8639-42b7-aafd-21f019307d66 for this chassis.
Oct  2 08:35:02 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:02Z|00624|binding|INFO|37bcb93a-8639-42b7-aafd-21f019307d66: Claiming fa:16:3e:3b:d9:00 10.100.0.9
Oct  2 08:35:02 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:02Z|00625|binding|INFO|Releasing lport 191808e4-38c4-4099-82ce-c40eaf416444 from this chassis (sb_readonly=0)
Oct  2 08:35:02 np0005466012 NetworkManager[51207]: <info>  [1759408502.6813] device (tap37bcb93a-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:35:02 np0005466012 NetworkManager[51207]: <info>  [1759408502.6823] device (tap37bcb93a-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:02 np0005466012 systemd-machined[152114]: New machine qemu-72-instance-0000009c.
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.692 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:d9:00 10.100.0.9'], port_security=['fa:16:3e:3b:d9:00 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1f1562c-389b-4488-b13e-0f3594ca916b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '086ee425cb0949ab836e1b3ae489ced0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e9c6c044-9bae-451d-9ac4-f29a1af96360', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23273790-9180-40d0-a3ca-fdfdfd7f3c59, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=37bcb93a-8639-42b7-aafd-21f019307d66) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.694 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 37bcb93a-8639-42b7-aafd-21f019307d66 in datapath a1f1562c-389b-4488-b13e-0f3594ca916b bound to our chassis#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.696 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a1f1562c-389b-4488-b13e-0f3594ca916b#033[00m
Oct  2 08:35:02 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:02Z|00626|binding|INFO|Setting lport 37bcb93a-8639-42b7-aafd-21f019307d66 ovn-installed in OVS
Oct  2 08:35:02 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:02Z|00627|binding|INFO|Setting lport 37bcb93a-8639-42b7-aafd-21f019307d66 up in Southbound
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:02 np0005466012 systemd[1]: Started Virtual Machine qemu-72-instance-0000009c.
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.708 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8acfc9f2-f77a-4b48-93f2-cb7c8a652df9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.709 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa1f1562c-31 in ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.711 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa1f1562c-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.711 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e9eec957-cd99-4103-997e-43b3b0118ad6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.712 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9cca1f2d-3e69-4fd0-aad5-aa0db38fc7d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.726 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[b5df188e-460f-4f9a-a4e1-d678d94aeb83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.751 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0058872a-8e65-4345-b2b1-4c1086b51e78]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.805 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5a3458-83d9-44be-8ac1-52fe51d88340]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 NetworkManager[51207]: <info>  [1759408502.8147] manager: (tapa1f1562c-30): new Veth device (/org/freedesktop/NetworkManager/Devices/294)
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.818 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[39d26562-6dd1-4e00-b0ed-f44b210883af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.854 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[e020e9aa-3d09-4d5e-b1a0-b3a589afd5d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.860 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[743fd69a-fe88-4378-b084-ae0b0dc66f05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 NetworkManager[51207]: <info>  [1759408502.8853] device (tapa1f1562c-30): carrier: link connected
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.893 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[3c20dd5f-8798-4dbe-83a6-a2810721efcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.911 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4cfbe4ac-15c7-4f78-9b63-beeb4dbdf057]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1f1562c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e4:7a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 649649, 'reachable_time': 42842, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246030, 'error': None, 'target': 'ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.925 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3f09f086-d640-49e9-a8bd-fa9a4ddd84e5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:e47a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 649649, 'tstamp': 649649}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246031, 'error': None, 'target': 'ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.943 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[466ce570-b54b-41f4-b7b8-a1cce1a5da53]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1f1562c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e4:7a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 649649, 'reachable_time': 42842, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246032, 'error': None, 'target': 'ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:02.974 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[adcb8b9a-3b2b-40ac-87bf-cf4a41743cfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.986 2 DEBUG nova.compute.manager [req-6093a069-bbd2-4e0d-ad9e-1b9f48ad718d req-df5ded13-5b58-4380-9c0a-96f15db410b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received event network-changed-82858700-6e07-4f6e-b7ae-45a35721505d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.987 2 DEBUG nova.compute.manager [req-6093a069-bbd2-4e0d-ad9e-1b9f48ad718d req-df5ded13-5b58-4380-9c0a-96f15db410b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Refreshing instance network info cache due to event network-changed-82858700-6e07-4f6e-b7ae-45a35721505d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.988 2 DEBUG oslo_concurrency.lockutils [req-6093a069-bbd2-4e0d-ad9e-1b9f48ad718d req-df5ded13-5b58-4380-9c0a-96f15db410b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.989 2 DEBUG oslo_concurrency.lockutils [req-6093a069-bbd2-4e0d-ad9e-1b9f48ad718d req-df5ded13-5b58-4380-9c0a-96f15db410b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.989 2 DEBUG nova.network.neutron [req-6093a069-bbd2-4e0d-ad9e-1b9f48ad718d req-df5ded13-5b58-4380-9c0a-96f15db410b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Refreshing network info cache for port 82858700-6e07-4f6e-b7ae-45a35721505d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.991 2 DEBUG nova.network.neutron [req-2ca75b0e-766e-4e02-85a0-e7dfa70fac84 req-442ec3e3-3061-4dc8-a705-4a6506485866 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updated VIF entry in instance network info cache for port 37bcb93a-8639-42b7-aafd-21f019307d66. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.991 2 DEBUG nova.network.neutron [req-2ca75b0e-766e-4e02-85a0-e7dfa70fac84 req-442ec3e3-3061-4dc8-a705-4a6506485866 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updating instance_info_cache with network_info: [{"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.996 2 DEBUG nova.compute.manager [req-0648ecc7-2b41-466a-98c0-0b9769db649d req-9be4726b-fa2a-412f-bb24-95d0afec6588 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.996 2 DEBUG oslo_concurrency.lockutils [req-0648ecc7-2b41-466a-98c0-0b9769db649d req-9be4726b-fa2a-412f-bb24-95d0afec6588 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.997 2 DEBUG oslo_concurrency.lockutils [req-0648ecc7-2b41-466a-98c0-0b9769db649d req-9be4726b-fa2a-412f-bb24-95d0afec6588 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.997 2 DEBUG oslo_concurrency.lockutils [req-0648ecc7-2b41-466a-98c0-0b9769db649d req-9be4726b-fa2a-412f-bb24-95d0afec6588 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:02 np0005466012 nova_compute[192063]: 2025-10-02 12:35:02.997 2 DEBUG nova.compute.manager [req-0648ecc7-2b41-466a-98c0-0b9769db649d req-9be4726b-fa2a-412f-bb24-95d0afec6588 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Processing event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.023 2 DEBUG oslo_concurrency.lockutils [req-2ca75b0e-766e-4e02-85a0-e7dfa70fac84 req-442ec3e3-3061-4dc8-a705-4a6506485866 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:03.050 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[507924c7-382a-404b-97b1-242b5434e8f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:03.053 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1f1562c-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:03.054 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:03.055 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1f1562c-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:03 np0005466012 NetworkManager[51207]: <info>  [1759408503.0575] manager: (tapa1f1562c-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Oct  2 08:35:03 np0005466012 kernel: tapa1f1562c-30: entered promiscuous mode
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:03.067 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa1f1562c-30, col_values=(('external_ids', {'iface-id': '765813dd-4eb1-46b7-adc3-4b198fc4dbfb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:03Z|00628|binding|INFO|Releasing lport 765813dd-4eb1-46b7-adc3-4b198fc4dbfb from this chassis (sb_readonly=0)
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:03.070 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a1f1562c-389b-4488-b13e-0f3594ca916b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a1f1562c-389b-4488-b13e-0f3594ca916b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:03.070 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[70576551-6acb-46fe-8058-53dc46b77e10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:03.071 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-a1f1562c-389b-4488-b13e-0f3594ca916b
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/a1f1562c-389b-4488-b13e-0f3594ca916b.pid.haproxy
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID a1f1562c-389b-4488-b13e-0f3594ca916b
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:35:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:03.071 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b', 'env', 'PROCESS_TAG=haproxy-a1f1562c-389b-4488-b13e-0f3594ca916b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a1f1562c-389b-4488-b13e-0f3594ca916b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.368 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.389 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Triggering sync for uuid 58a588a8-3fb2-484e-82f6-7c72285d22de _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.390 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Triggering sync for uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.390 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "58a588a8-3fb2-484e-82f6-7c72285d22de" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.391 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.391 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.412 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.022s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:03 np0005466012 podman[246070]: 2025-10-02 12:35:03.511887624 +0000 UTC m=+0.054514774 container create 37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:35:03 np0005466012 systemd[1]: Started libpod-conmon-37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42.scope.
Oct  2 08:35:03 np0005466012 podman[246070]: 2025-10-02 12:35:03.489481272 +0000 UTC m=+0.032108452 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.584 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408503.5844486, 1d931a6f-0703-4e1f-acfc-b8402834c14d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.585 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] VM Started (Lifecycle Event)#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.588 2 DEBUG nova.compute.manager [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.591 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.595 2 INFO nova.virt.libvirt.driver [-] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Instance spawned successfully.#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.595 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:35:03 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.622 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:03 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3276e65bb91dcc9d2d4cddb8128fe787645d491b4eec5777b1ef9e89f989e456/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.629 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.630 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.631 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.631 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.631 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.632 2 DEBUG nova.virt.libvirt.driver [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.637 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:35:03 np0005466012 podman[246070]: 2025-10-02 12:35:03.648400224 +0000 UTC m=+0.191027434 container init 37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0)
Oct  2 08:35:03 np0005466012 podman[246070]: 2025-10-02 12:35:03.65979468 +0000 UTC m=+0.202421840 container start 37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.674 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.675 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408503.5845702, 1d931a6f-0703-4e1f-acfc-b8402834c14d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.676 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:35:03 np0005466012 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246084]: [NOTICE]   (246088) : New worker (246090) forked
Oct  2 08:35:03 np0005466012 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246084]: [NOTICE]   (246088) : Loading success.
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.696 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.700 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408503.592984, 1d931a6f-0703-4e1f-acfc-b8402834c14d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.700 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.721 2 INFO nova.compute.manager [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Took 5.20 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.722 2 DEBUG nova.compute.manager [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.728 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.737 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.784 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.838 2 INFO nova.compute.manager [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Took 5.91 seconds to build instance.#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.854 2 DEBUG oslo_concurrency.lockutils [None req-c1c1b6ed-4f75-4060-92ce-3f3166309981 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.855 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.464s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.855 2 INFO nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:35:03 np0005466012 nova_compute[192063]: 2025-10-02 12:35:03.855 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:05 np0005466012 nova_compute[192063]: 2025-10-02 12:35:05.115 2 DEBUG nova.compute.manager [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:05 np0005466012 nova_compute[192063]: 2025-10-02 12:35:05.116 2 DEBUG oslo_concurrency.lockutils [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:05 np0005466012 nova_compute[192063]: 2025-10-02 12:35:05.118 2 DEBUG oslo_concurrency.lockutils [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:05 np0005466012 nova_compute[192063]: 2025-10-02 12:35:05.118 2 DEBUG oslo_concurrency.lockutils [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:05 np0005466012 nova_compute[192063]: 2025-10-02 12:35:05.119 2 DEBUG nova.compute.manager [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] No waiting events found dispatching network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:05 np0005466012 nova_compute[192063]: 2025-10-02 12:35:05.120 2 WARNING nova.compute.manager [req-62615ab8-ee77-434d-a41c-c2f004d13017 req-dd1d8839-f9e4-4087-b3e3-5faa7369bc1c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received unexpected event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:35:05 np0005466012 nova_compute[192063]: 2025-10-02 12:35:05.969 2 DEBUG nova.network.neutron [req-6093a069-bbd2-4e0d-ad9e-1b9f48ad718d req-df5ded13-5b58-4380-9c0a-96f15db410b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Updated VIF entry in instance network info cache for port 82858700-6e07-4f6e-b7ae-45a35721505d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:35:05 np0005466012 nova_compute[192063]: 2025-10-02 12:35:05.971 2 DEBUG nova.network.neutron [req-6093a069-bbd2-4e0d-ad9e-1b9f48ad718d req-df5ded13-5b58-4380-9c0a-96f15db410b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Updating instance_info_cache with network_info: [{"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:05 np0005466012 nova_compute[192063]: 2025-10-02 12:35:05.995 2 DEBUG oslo_concurrency.lockutils [req-6093a069-bbd2-4e0d-ad9e-1b9f48ad718d req-df5ded13-5b58-4380-9c0a-96f15db410b8 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:06 np0005466012 nova_compute[192063]: 2025-10-02 12:35:06.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:07 np0005466012 nova_compute[192063]: 2025-10-02 12:35:07.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:11Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:77:ce:fa 10.100.0.9
Oct  2 08:35:11 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:11Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:77:ce:fa 10.100.0.9
Oct  2 08:35:11 np0005466012 nova_compute[192063]: 2025-10-02 12:35:11.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:12 np0005466012 nova_compute[192063]: 2025-10-02 12:35:12.470 2 DEBUG nova.compute.manager [req-0b609814-fe50-4097-8878-cdf1fce50b60 req-646cf0c4-263b-45bd-bc0b-31cece94d2a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-changed-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:12 np0005466012 nova_compute[192063]: 2025-10-02 12:35:12.471 2 DEBUG nova.compute.manager [req-0b609814-fe50-4097-8878-cdf1fce50b60 req-646cf0c4-263b-45bd-bc0b-31cece94d2a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Refreshing instance network info cache due to event network-changed-37bcb93a-8639-42b7-aafd-21f019307d66. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:35:12 np0005466012 nova_compute[192063]: 2025-10-02 12:35:12.472 2 DEBUG oslo_concurrency.lockutils [req-0b609814-fe50-4097-8878-cdf1fce50b60 req-646cf0c4-263b-45bd-bc0b-31cece94d2a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:12 np0005466012 nova_compute[192063]: 2025-10-02 12:35:12.472 2 DEBUG oslo_concurrency.lockutils [req-0b609814-fe50-4097-8878-cdf1fce50b60 req-646cf0c4-263b-45bd-bc0b-31cece94d2a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:12 np0005466012 nova_compute[192063]: 2025-10-02 12:35:12.472 2 DEBUG nova.network.neutron [req-0b609814-fe50-4097-8878-cdf1fce50b60 req-646cf0c4-263b-45bd-bc0b-31cece94d2a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Refreshing network info cache for port 37bcb93a-8639-42b7-aafd-21f019307d66 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:35:12 np0005466012 nova_compute[192063]: 2025-10-02 12:35:12.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:12 np0005466012 nova_compute[192063]: 2025-10-02 12:35:12.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:12 np0005466012 nova_compute[192063]: 2025-10-02 12:35:12.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:35:12 np0005466012 nova_compute[192063]: 2025-10-02 12:35:12.840 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:35:14 np0005466012 nova_compute[192063]: 2025-10-02 12:35:14.168 2 DEBUG nova.network.neutron [req-0b609814-fe50-4097-8878-cdf1fce50b60 req-646cf0c4-263b-45bd-bc0b-31cece94d2a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updated VIF entry in instance network info cache for port 37bcb93a-8639-42b7-aafd-21f019307d66. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:35:14 np0005466012 nova_compute[192063]: 2025-10-02 12:35:14.169 2 DEBUG nova.network.neutron [req-0b609814-fe50-4097-8878-cdf1fce50b60 req-646cf0c4-263b-45bd-bc0b-31cece94d2a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updating instance_info_cache with network_info: [{"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:14 np0005466012 nova_compute[192063]: 2025-10-02 12:35:14.188 2 DEBUG oslo_concurrency.lockutils [req-0b609814-fe50-4097-8878-cdf1fce50b60 req-646cf0c4-263b-45bd-bc0b-31cece94d2a6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:15 np0005466012 podman[246129]: 2025-10-02 12:35:15.152117127 +0000 UTC m=+0.063272867 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:35:15 np0005466012 podman[246130]: 2025-10-02 12:35:15.210977111 +0000 UTC m=+0.120253710 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  2 08:35:15 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:15Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3b:d9:00 10.100.0.9
Oct  2 08:35:15 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:15Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3b:d9:00 10.100.0.9
Oct  2 08:35:15 np0005466012 nova_compute[192063]: 2025-10-02 12:35:15.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:15.284 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:35:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:15.286 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:35:15 np0005466012 nova_compute[192063]: 2025-10-02 12:35:15.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:16 np0005466012 nova_compute[192063]: 2025-10-02 12:35:16.230 2 INFO nova.compute.manager [None req-53475a88-7db3-474b-a3f3-453767c0fe7f 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Get console output#033[00m
Oct  2 08:35:16 np0005466012 nova_compute[192063]: 2025-10-02 12:35:16.239 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:35:16 np0005466012 nova_compute[192063]: 2025-10-02 12:35:16.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:16 np0005466012 nova_compute[192063]: 2025-10-02 12:35:16.677 2 DEBUG nova.objects.instance [None req-7351cf35-59b9-4b50-8396-f10b7aa937eb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 58a588a8-3fb2-484e-82f6-7c72285d22de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:16 np0005466012 nova_compute[192063]: 2025-10-02 12:35:16.704 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408516.703829, 58a588a8-3fb2-484e-82f6-7c72285d22de => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:16 np0005466012 nova_compute[192063]: 2025-10-02 12:35:16.705 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:35:16 np0005466012 nova_compute[192063]: 2025-10-02 12:35:16.731 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:16 np0005466012 nova_compute[192063]: 2025-10-02 12:35:16.736 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:35:16 np0005466012 nova_compute[192063]: 2025-10-02 12:35:16.761 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.928 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'name': 'tempest-TestShelveInstance-server-1978368192', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000009c', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '086ee425cb0949ab836e1b3ae489ced0', 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'hostId': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.930 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000009b', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'hostId': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.953 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.read.bytes volume: 30521856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.954 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.975 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.read.bytes volume: 31025664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.975 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2ed2d0b-61a6-4380-bbaa-711491110cff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30521856, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-vda', 'timestamp': '2025-10-02T12:35:16.931243', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '402eea92-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.607604338, 'message_signature': 'e0cb77aa8368c408367a6c210fba98a55c4308bb42901e1aeb7bbb94895ec3e1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 
'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-sda', 'timestamp': '2025-10-02T12:35:16.931243', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '402ef532-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.607604338, 'message_signature': 'ae273afe45e87924b18de9bf923614a32b9fe1bfe30f96b85cba9bfac6514723'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31025664, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-vda', 'timestamp': '2025-10-02T12:35:16.931243', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4032335a-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.630690388, 'message_signature': '51949f9d8c2138e567bd3ded69896e3eb955722dbef4bec94303e639c0059879'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-sda', 'timestamp': '2025-10-02T12:35:16.931243', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '40324354-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.630690388, 'message_signature': '746bd46ca75e0e8c10a4058fec3cc37b6a6b6545ecdce771b6eaf1e096d0a4dd'}]}, 'timestamp': '2025-10-02 12:35:16.976061', '_unique_id': '40e944526014426d8b80188548a31660'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.977 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.978 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.980 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1d931a6f-0703-4e1f-acfc-b8402834c14d / tap37bcb93a-86 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.981 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.983 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 58a588a8-3fb2-484e-82f6-7c72285d22de / tap82858700-6e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.984 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95127976-eb1f-487a-bf71-c1e5c07a0a58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': 'instance-0000009c-1d931a6f-0703-4e1f-acfc-b8402834c14d-tap37bcb93a-86', 'timestamp': '2025-10-02T12:35:16.978632', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'tap37bcb93a-86', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:d9:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37bcb93a-86'}, 'message_id': '40331c5c-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.655051295, 'message_signature': 'b5b15fc1106c5963a3326c45ee4b3e4b8827e964aac42beb94d8b78786ba02a6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-0000009b-58a588a8-3fb2-484e-82f6-7c72285d22de-tap82858700-6e', 'timestamp': '2025-10-02T12:35:16.978632', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'tap82858700-6e', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82858700-6e'}, 'message_id': '40339178-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.658043798, 'message_signature': '3a8e31dca4f18179dc88c7d41fab6635167a811f5b246a30ce643cedd86c711e'}]}, 'timestamp': '2025-10-02 12:35:16.984632', '_unique_id': '002d5fbe6b284c11b6e9f671bf468981'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.985 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.986 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.986 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.986 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestShelveInstance-server-1978368192>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508357503>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestShelveInstance-server-1978368192>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508357503>]
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.987 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.987 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.987 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/network.incoming.packets volume: 31 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f43a42c6-d2a9-416d-88b1-9092f3a64a58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': 'instance-0000009c-1d931a6f-0703-4e1f-acfc-b8402834c14d-tap37bcb93a-86', 'timestamp': '2025-10-02T12:35:16.987293', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'tap37bcb93a-86', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:d9:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37bcb93a-86'}, 'message_id': '40340946-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.655051295, 'message_signature': '6bc1a32ac48165a248d141851549080e47f2fc7676ea1f351f75082f2df70ec5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 31, 
'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-0000009b-58a588a8-3fb2-484e-82f6-7c72285d22de-tap82858700-6e', 'timestamp': '2025-10-02T12:35:16.987293', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'tap82858700-6e', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82858700-6e'}, 'message_id': '403417ba-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.658043798, 'message_signature': 'd11ef03ba1a5f89c3d828008fde85617da77a1cee539fb47f79ae9c6137dd51e'}]}, 'timestamp': '2025-10-02 12:35:16.988054', '_unique_id': '355de5aaf5854b92abbb927f2bf54815'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.988 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.989 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.989 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.990 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11f9dd74-5da5-4625-b43f-df756c19a66f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': 'instance-0000009c-1d931a6f-0703-4e1f-acfc-b8402834c14d-tap37bcb93a-86', 'timestamp': '2025-10-02T12:35:16.989773', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'tap37bcb93a-86', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:d9:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37bcb93a-86'}, 'message_id': '403467a6-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.655051295, 'message_signature': 'adedb26517f672b92023d0514a92e88a1e4ebca06e3993e79fc7c435e41b1336'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-0000009b-58a588a8-3fb2-484e-82f6-7c72285d22de-tap82858700-6e', 'timestamp': '2025-10-02T12:35:16.989773', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'tap82858700-6e', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82858700-6e'}, 'message_id': '40347336-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.658043798, 'message_signature': '85823e46cb5ff5d59982285f1c010a3fe163aa4d1139d1911bb8140652da5353'}]}, 'timestamp': '2025-10-02 12:35:16.990386', '_unique_id': '5aa204088429461facff10fb09c0e3eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.991 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.992 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.992 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestShelveInstance-server-1978368192>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508357503>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestShelveInstance-server-1978368192>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508357503>]
Oct  2 08:35:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:16.992 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.008 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/cpu volume: 11780000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.025 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/cpu volume: 11330000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4c5fbfb-7e48-4789-bf72-4eac77b74266', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11780000000, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'timestamp': '2025-10-02T12:35:16.992411', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '40374b4c-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.684924355, 'message_signature': '27ed9274f44225e386eac4bb6678fe6c9b31613148e763b983a6d669eb1bc1c9'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11330000000, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 
'58a588a8-3fb2-484e-82f6-7c72285d22de', 'timestamp': '2025-10-02T12:35:16.992411', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4039e4e2-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.701826974, 'message_signature': '3d6600e77f783c2a3b9a934aa3d3f57ae30257032eff87ec556cd88838c584b8'}]}, 'timestamp': '2025-10-02 12:35:17.026184', '_unique_id': 'f166d5f488d5434188c11e89543af875'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.027 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.028 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.029 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/network.incoming.bytes volume: 1874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.030 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/network.incoming.bytes volume: 4885 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fb3056b-8102-4ab9-9642-e4b9d8d16884', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1874, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': 'instance-0000009c-1d931a6f-0703-4e1f-acfc-b8402834c14d-tap37bcb93a-86', 'timestamp': '2025-10-02T12:35:17.029164', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'tap37bcb93a-86', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:d9:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37bcb93a-86'}, 'message_id': '403a76be-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.655051295, 'message_signature': 'cb8d9f7311190ab0a6bf2280d932bc72107fd3ed139963d7d614a6383fea25f0'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4885, 'user_id': 
'1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-0000009b-58a588a8-3fb2-484e-82f6-7c72285d22de-tap82858700-6e', 'timestamp': '2025-10-02T12:35:17.029164', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'tap82858700-6e', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82858700-6e'}, 'message_id': '403a9108-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.658043798, 'message_signature': 'd356473bf78169c577ed66b08c954b81a64c5517142b2fac1d22376d94a452f1'}]}, 'timestamp': '2025-10-02 12:35:17.030564', '_unique_id': '397a60744c244a19b1ee84fe8bc7f1ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.031 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.032 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.033 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.write.latency volume: 1616729084 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.033 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.034 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.write.latency volume: 1884792061 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.034 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31b8f5d3-23ce-4009-8141-3a346f11469e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1616729084, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-vda', 'timestamp': '2025-10-02T12:35:17.033029', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '403b0728-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.607604338, 'message_signature': 'e09ed152cc21066d6b52fc95d9a87b54cdc5202d7b4b1b229ec2e231fb069383'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 
'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-sda', 'timestamp': '2025-10-02T12:35:17.033029', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '403b1f88-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.607604338, 'message_signature': '842133348cf687352a6606f589a010a5ce6db9afa1fa4d4dadb09cbbc4898bb4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1884792061, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-vda', 'timestamp': '2025-10-02T12:35:17.033029', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '403b3536-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.630690388, 'message_signature': '82290055f53c82c93f87e100c00711c09bfb08f65add5e1c412fde6826352653'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-sda', 'timestamp': '2025-10-02T12:35:17.033029', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '403b4ce2-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.630690388, 'message_signature': '98646e1aebb89b135fa32138cbb4cb5e6f1d8b21629f22cdeb0fede28038aea4'}]}, 'timestamp': '2025-10-02 12:35:17.035358', '_unique_id': 'f5d94f0fa96c401284acf73107686a45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.036 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.037 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.038 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.read.requests volume: 1104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.038 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.039 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.read.requests volume: 1137 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.040 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8a3e211-eccb-487a-a95c-f9596408ce7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1104, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-vda', 'timestamp': '2025-10-02T12:35:17.037998', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '403bc99c-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.607604338, 'message_signature': '79d5155594099d2be51ac623151fdabc2a4e640542277d15fb14e92442585d43'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': 
None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-sda', 'timestamp': '2025-10-02T12:35:17.037998', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '403bf908-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.607604338, 'message_signature': '7af4fba4cfcf89f550276b8fae4853974099abb3422d741c189461b41401e739'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1137, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-vda', 'timestamp': '2025-10-02T12:35:17.037998', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': 
{'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '403c0fba-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.630690388, 'message_signature': 'dddc38baf5dafa09c5e6a55ef30e396c6de9849f6ab657aec34515fe0e9f81b9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-sda', 'timestamp': '2025-10-02T12:35:17.037998', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '403c2356-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.630690388, 'message_signature': '00f76b8ba372ee427ecd12a0125a65ed2c1c56f75e0112cc4674e1ecd8f50792'}]}, 'timestamp': '2025-10-02 12:35:17.040889', '_unique_id': '7f3cd1e5d1cc4122a13d65aaf035b632'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.042 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.043 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.056 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.057 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.067 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.068 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42eb7533-f51a-4e35-aa07-41564388938d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-vda', 'timestamp': '2025-10-02T12:35:17.043500', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '403ea860-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.719964007, 'message_signature': '1ab21e198efc2f4f8799ea5156926d9aa5c0958810e76f783169f188ccd55e61'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': 
'1d931a6f-0703-4e1f-acfc-b8402834c14d-sda', 'timestamp': '2025-10-02T12:35:17.043500', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '403eb58a-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.719964007, 'message_signature': '1024259710dcbff254f7b9cf266fb5966e5c6b05e941fd01e2ca1f7e28cf2033'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-vda', 'timestamp': '2025-10-02T12:35:17.043500', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '40406128-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.733964206, 'message_signature': '07d65cb50c14ad51cc17178ce05bf066f092e997b7eb34e3ce7d1e3882030e77'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-sda', 'timestamp': '2025-10-02T12:35:17.043500', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '404077b2-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.733964206, 'message_signature': '4a2cb11d4bf2a1e47e3064681a9bf079cf48418b16530b604207fb46c2366d68'}]}, 'timestamp': '2025-10-02 12:35:17.069279', '_unique_id': '69f4dc8cf5814261af56ed02231afb0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.070 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.072 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.072 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.write.requests volume: 302 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.072 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.073 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.write.requests volume: 313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.073 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '834caaa8-47b4-49d8-9a43-35a2b1f9ead4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 302, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-vda', 'timestamp': '2025-10-02T12:35:17.072385', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '40410704-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.607604338, 'message_signature': '3da247ef43ff9507ef1dae12b2fb943c3032d885239ac4cf27699810ba5d4554'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': 
None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-sda', 'timestamp': '2025-10-02T12:35:17.072385', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '40411e42-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.607604338, 'message_signature': '98501d1aa83b9a8d1ddefc8b63bf8ec4deeefcc7c0d8b804ac1336b2b44e6baa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 313, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-vda', 'timestamp': '2025-10-02T12:35:17.072385', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': 
{'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '404131c0-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.630690388, 'message_signature': '533c0178f84f9468b6b490a20c05c61ba2f84193a4608648b2ff1b5af25db9a0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-sda', 'timestamp': '2025-10-02T12:35:17.072385', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '40414250-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.630690388, 'message_signature': '46a767d91ab019e81ca26af45cbc1197363e2b5437aa534302b3d5efeffc19cd'}]}, 'timestamp': '2025-10-02 12:35:17.074421', '_unique_id': 'd2f2e2f7925f46f5872400f6847c2e91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.075 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.077 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.077 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.read.latency volume: 599290251 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.077 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.read.latency volume: 49796697 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.078 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.read.latency volume: 632339736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.078 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.read.latency volume: 39528685 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08e2dab0-9633-4ab5-80c7-0a028e0bd62c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 599290251, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-vda', 'timestamp': '2025-10-02T12:35:17.077288', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4041c6b2-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.607604338, 'message_signature': '734a099ab3e1d147ae4ba1d496e56267d488bedc2bb368723d29eb7b2d10176a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 49796697, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': 
None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-sda', 'timestamp': '2025-10-02T12:35:17.077288', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4041da26-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.607604338, 'message_signature': '36c5c0a7f91878c0324d755d7a25892d27acbc781aad35ff6640cffccafa11e8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 632339736, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-vda', 'timestamp': '2025-10-02T12:35:17.077288', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': 
{'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4041ea3e-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.630690388, 'message_signature': '5a8a373a0fed7e5a8eface8e24b7749e7f7bd2c0b970f11d5467a75add9315c3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39528685, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-sda', 'timestamp': '2025-10-02T12:35:17.077288', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '40420050-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.630690388, 'message_signature': 'cb3031cc8967a11c7ea24191e765a00b27eaae5f7c8ffe16668532bd369d85c8'}]}, 'timestamp': '2025-10-02 12:35:17.079251', '_unique_id': '43bc996b138d487684a486393cd83c64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.080 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.081 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.082 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.082 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestShelveInstance-server-1978368192>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508357503>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestShelveInstance-server-1978368192>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508357503>]
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.082 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.083 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ea3ea17-fb61-426a-b33a-4550f49fee98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': 'instance-0000009c-1d931a6f-0703-4e1f-acfc-b8402834c14d-tap37bcb93a-86', 'timestamp': '2025-10-02T12:35:17.082730', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'tap37bcb93a-86', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:d9:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37bcb93a-86'}, 'message_id': '40429d76-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.655051295, 'message_signature': 'c36e636c6fa1a5fa73413877affe66e105e35dfa9b15bdce7da8be20335d8f78'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-0000009b-58a588a8-3fb2-484e-82f6-7c72285d22de-tap82858700-6e', 'timestamp': '2025-10-02T12:35:17.082730', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'tap82858700-6e', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82858700-6e'}, 'message_id': '4042adc0-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.658043798, 'message_signature': 'fbfac627f60425179fb46ea03aa626776fae3742842b58a9098b87d2256fbbec'}]}, 'timestamp': '2025-10-02 12:35:17.083634', '_unique_id': 'd971e026f2f64257bb3cf31b286039c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.084 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.085 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.085 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/memory.usage volume: 40.41015625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.085 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/memory.usage volume: 40.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e103070-f959-45e8-9fd2-b50bcc0091d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.41015625, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'timestamp': '2025-10-02T12:35:17.085348', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4042fca8-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.684924355, 'message_signature': '8d9c5d7bc9ea298aa20a23ffebd77e2199e5f644058f54eba4b525f0b8dfd96d'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.38671875, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 
'58a588a8-3fb2-484e-82f6-7c72285d22de', 'timestamp': '2025-10-02T12:35:17.085348', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '404308e2-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.701826974, 'message_signature': '01835df73a4656a0929a916827843d0ab572fb4f1b2ee71eeb3a86ff2de56394'}]}, 'timestamp': '2025-10-02 12:35:17.085954', '_unique_id': '339f343ddf074d68967409f2d0612f56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.086 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.087 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.087 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.087 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f421c1c-409f-4bac-8e99-4161c9c26a07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': 'instance-0000009c-1d931a6f-0703-4e1f-acfc-b8402834c14d-tap37bcb93a-86', 'timestamp': '2025-10-02T12:35:17.087504', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'tap37bcb93a-86', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:d9:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37bcb93a-86'}, 'message_id': '40435572-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.655051295, 'message_signature': '353bd20816e4344b6b48ae8417c241034c5584faa52e56e00f482e2b561ba029'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-0000009b-58a588a8-3fb2-484e-82f6-7c72285d22de-tap82858700-6e', 'timestamp': '2025-10-02T12:35:17.087504', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'tap82858700-6e', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82858700-6e'}, 'message_id': '40436198-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.658043798, 'message_signature': '453df73532a8bfa3f16382596f607ddf09bfd620bda5d77574df120ceba66fe5'}]}, 'timestamp': '2025-10-02 12:35:17.088233', '_unique_id': '811363c048c9442ca20bd4654e763696'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.088 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.089 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.089 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.090 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.090 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.090 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47978305-59dc-4bd6-a099-f509aff1294b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-vda', 'timestamp': '2025-10-02T12:35:17.089866', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4043ad1a-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.719964007, 'message_signature': 'aa2bc40bc4825484fcb4d78e9421dcc98c04c64b538a55cb7e35265ef8a88788'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': 
'1d931a6f-0703-4e1f-acfc-b8402834c14d-sda', 'timestamp': '2025-10-02T12:35:17.089866', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4043b7e2-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.719964007, 'message_signature': '45300e43c959724e390e933fc6cf8b08a3f0416c181350e107f5df3ab204e125'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-vda', 'timestamp': '2025-10-02T12:35:17.089866', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4043c228-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.733964206, 'message_signature': 'bc618eb80dd729bb0c5c3b351a6db99bdf3556bd801d0aa8895592fd6c43258c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-sda', 'timestamp': '2025-10-02T12:35:17.089866', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4043ce30-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.733964206, 'message_signature': '2c5e9e62bd5bcbb12bfd51dd0821d8f3fb73131080eefc0da557495f706306d7'}]}, 'timestamp': '2025-10-02 12:35:17.091008', '_unique_id': 'a81f3351ccdb4f80b79f8f0184b2fc8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.091 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.092 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.write.bytes volume: 72769536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.092 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.093 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.write.bytes volume: 72921088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.093 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '938ced65-8d2d-482c-9e41-d517e5f15407', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72769536, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-vda', 'timestamp': '2025-10-02T12:35:17.092595', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '404418ae-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.607604338, 'message_signature': '8ed281943479ed1417f1696066af7c9dad3170ece4fea63e2928633b2367f7a0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 
'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-sda', 'timestamp': '2025-10-02T12:35:17.092595', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '404423a8-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.607604338, 'message_signature': '8d3ca46b9f8139932f27f34c31983f7ada8790153168ec1dfd616341c779f9bc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72921088, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-vda', 'timestamp': '2025-10-02T12:35:17.092595', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '40442dee-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.630690388, 'message_signature': '6763b1b3c2fa260c80726587c7eef91a328ab56855579d7982902cd9e1bb43d4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-sda', 'timestamp': '2025-10-02T12:35:17.092595', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '404437e4-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.630690388, 'message_signature': '72afd50526344a698961927d3a36f16304c6125650cd8f69936f18aceb590488'}]}, 'timestamp': '2025-10-02 12:35:17.093736', '_unique_id': '7c2c9c8550b74cab99ecee59dbe3f0c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.094 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.095 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.095 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71dcffdc-bf6c-4ef9-a939-562fd53a4cbe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': 'instance-0000009c-1d931a6f-0703-4e1f-acfc-b8402834c14d-tap37bcb93a-86', 'timestamp': '2025-10-02T12:35:17.095369', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'tap37bcb93a-86', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:d9:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37bcb93a-86'}, 'message_id': '40448456-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.655051295, 'message_signature': '8d6af6d59f2d711be430f9574a20670909fcb7751a0cf7e0550f44ce83a1f48e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-0000009b-58a588a8-3fb2-484e-82f6-7c72285d22de-tap82858700-6e', 'timestamp': '2025-10-02T12:35:17.095369', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'tap82858700-6e', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82858700-6e'}, 'message_id': '404490e0-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.658043798, 'message_signature': '386e27c1e3f0b10d75c01526489f1a9aea65e77d4cd9b69e6f93a0c3d9c04fc6'}]}, 'timestamp': '2025-10-02 12:35:17.095998', '_unique_id': '2ffb58d6ca2b4dc7829ecff42df546d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.096 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.097 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.098 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.098 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.098 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.098 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ee2fbe8-1e03-47df-88c6-046c28c2568b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d-vda', 'timestamp': '2025-10-02T12:35:17.097987', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4044eaf4-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.719964007, 'message_signature': 'b0171a43b41c2d253459db74957b5281de4a96a30e3ff9f5bffe075819e37a96'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': 
'1d931a6f-0703-4e1f-acfc-b8402834c14d-sda', 'timestamp': '2025-10-02T12:35:17.097987', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'instance-0000009c', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4044f684-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.719964007, 'message_signature': 'c9473ef827e10fbc4739b6c09988db8e9237f7d40c23c4d2d008f6017b250208'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-vda', 'timestamp': '2025-10-02T12:35:17.097987', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '404501f6-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.733964206, 'message_signature': '34abe32bce5b919220415b815e664f4a0e6e12b0b0ec1745de78ae3192f2b367'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': '58a588a8-3fb2-484e-82f6-7c72285d22de-sda', 'timestamp': '2025-10-02T12:35:17.097987', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'instance-0000009b', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '40450d7c-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.733964206, 'message_signature': '1f277fe36e3c0d6dce6ddf961106fca6cdfbb499e50f3ff65890554d3f214a6d'}]}, 'timestamp': '2025-10-02 12:35:17.099176', '_unique_id': '22282d53ca6a455fbce69e9e95cb9a13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.100 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/network.outgoing.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.101 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/network.outgoing.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3ec21a3-ce75-46b6-a209-122c51ebe5ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': 'instance-0000009c-1d931a6f-0703-4e1f-acfc-b8402834c14d-tap37bcb93a-86', 'timestamp': '2025-10-02T12:35:17.100921', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'tap37bcb93a-86', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:d9:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37bcb93a-86'}, 'message_id': '40455cfa-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.655051295, 'message_signature': '9324d9e3ee61046978c7a86b44b8099014442ba785186ace00bcc7869c061e80'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 
'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-0000009b-58a588a8-3fb2-484e-82f6-7c72285d22de-tap82858700-6e', 'timestamp': '2025-10-02T12:35:17.100921', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'tap82858700-6e', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82858700-6e'}, 'message_id': '40456844-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.658043798, 'message_signature': 'decd271aab6965631c1527bace2a603c2de29bb5e355e5f0dc9de0d5db79295a'}]}, 'timestamp': '2025-10-02 12:35:17.101509', '_unique_id': '69081d1192904e8c8d2fdbc9f3c59d1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.103 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.103 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestShelveInstance-server-1978368192>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508357503>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestShelveInstance-server-1978368192>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508357503>]
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.103 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.103 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd2d192e-916f-474a-8a5a-7911362e8226', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': 'instance-0000009c-1d931a6f-0703-4e1f-acfc-b8402834c14d-tap37bcb93a-86', 'timestamp': '2025-10-02T12:35:17.103452', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'tap37bcb93a-86', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:d9:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37bcb93a-86'}, 'message_id': '4045bfd8-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.655051295, 'message_signature': 'e61bf2e0487bdd619e5b54867232230b5292b777b4705d582f85be68cdcf351e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-0000009b-58a588a8-3fb2-484e-82f6-7c72285d22de-tap82858700-6e', 'timestamp': '2025-10-02T12:35:17.103452', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'tap82858700-6e', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82858700-6e'}, 'message_id': '4045cc62-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.658043798, 'message_signature': '620465ac7ba2365541eb2a8a5909dc4232d77dad9a72ee7511b5d21baf060ed5'}]}, 'timestamp': '2025-10-02 12:35:17.104075', '_unique_id': 'e238aef0c4a04ef1a0bf184154b707e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.104 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.105 12 DEBUG ceilometer.compute.pollsters [-] 1d931a6f-0703-4e1f-acfc-b8402834c14d/network.outgoing.bytes volume: 1194 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.105 12 DEBUG ceilometer.compute.pollsters [-] 58a588a8-3fb2-484e-82f6-7c72285d22de/network.outgoing.bytes volume: 3306 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '754c5aff-ae02-4b18-b681-69af9d750984', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1194, 'user_id': '81e456ca7bee486181b9c11ddb1f3ffd', 'user_name': None, 'project_id': '086ee425cb0949ab836e1b3ae489ced0', 'project_name': None, 'resource_id': 'instance-0000009c-1d931a6f-0703-4e1f-acfc-b8402834c14d-tap37bcb93a-86', 'timestamp': '2025-10-02T12:35:17.105574', 'resource_metadata': {'display_name': 'tempest-TestShelveInstance-server-1978368192', 'name': 'tap37bcb93a-86', 'instance_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'instance_type': 'm1.nano', 'host': 'a89bda1537ab35d7446cf6626873ed5449746bbdbed590eab474c2f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:d9:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap37bcb93a-86'}, 'message_id': '4046137a-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.655051295, 'message_signature': '9a5a1a7c95a9f8c0bf213d2d0197144e4a77ea75a68797f944dfcb82805a5640'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3306, 'user_id': '1faa7e121a0e43ad8cb4ae5b2cfcc6a2', 'user_name': None, 'project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'project_name': None, 'resource_id': 'instance-0000009b-58a588a8-3fb2-484e-82f6-7c72285d22de-tap82858700-6e', 'timestamp': '2025-10-02T12:35:17.105574', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508357503', 'name': 'tap82858700-6e', 'instance_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'instance_type': 'm1.nano', 'host': '5dd65cab4fa0463792d4aa4c310490c11e800dee5b5466c984a3ba12', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82858700-6e'}, 'message_id': '40461f00-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6510.658043798, 'message_signature': 'ecf42b9ef0027b6337fe70ed117a865458d36419acb8e54061da0926bf091642'}]}, 'timestamp': '2025-10-02 12:35:17.106188', '_unique_id': '6d86df111aa6409a9a81b7f722717faf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:35:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:35:17.106 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:35:17 np0005466012 kernel: tap82858700-6e (unregistering): left promiscuous mode
Oct  2 08:35:17 np0005466012 NetworkManager[51207]: <info>  [1759408517.4272] device (tap82858700-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:35:17 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:17Z|00629|binding|INFO|Releasing lport 82858700-6e07-4f6e-b7ae-45a35721505d from this chassis (sb_readonly=0)
Oct  2 08:35:17 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:17Z|00630|binding|INFO|Setting lport 82858700-6e07-4f6e-b7ae-45a35721505d down in Southbound
Oct  2 08:35:17 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:17Z|00631|binding|INFO|Removing iface tap82858700-6e ovn-installed in OVS
Oct  2 08:35:17 np0005466012 nova_compute[192063]: 2025-10-02 12:35:17.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005466012 nova_compute[192063]: 2025-10-02 12:35:17.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.442 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:ce:fa 10.100.0.9'], port_security=['fa:16:3e:77:ce:fa 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cb51de21-4073-4a91-9994-dd0124090f6b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.197'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=485509cf-159a-4a14-9aa8-dc0cdb38eb16, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=82858700-6e07-4f6e-b7ae-45a35721505d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.443 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 82858700-6e07-4f6e-b7ae-45a35721505d in datapath d6186c2e-33cd-4f99-8140-bddbba9d07d0 unbound from our chassis#033[00m
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.445 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6186c2e-33cd-4f99-8140-bddbba9d07d0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.447 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fb74a63e-69f8-41ef-a8ce-aa50a16c525f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.447 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0 namespace which is not needed anymore#033[00m
Oct  2 08:35:17 np0005466012 nova_compute[192063]: 2025-10-02 12:35:17.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005466012 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Oct  2 08:35:17 np0005466012 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d0000009b.scope: Consumed 13.040s CPU time.
Oct  2 08:35:17 np0005466012 systemd-machined[152114]: Machine qemu-71-instance-0000009b terminated.
Oct  2 08:35:17 np0005466012 podman[246182]: 2025-10-02 12:35:17.53903342 +0000 UTC m=+0.087836300 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, 
config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:35:17 np0005466012 podman[246183]: 2025-10-02 12:35:17.547232867 +0000 UTC m=+0.090302087 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:35:17 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[245903]: [NOTICE]   (245909) : haproxy version is 2.8.14-c23fe91
Oct  2 08:35:17 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[245903]: [NOTICE]   (245909) : path to executable is /usr/sbin/haproxy
Oct  2 08:35:17 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[245903]: [WARNING]  (245909) : Exiting Master process...
Oct  2 08:35:17 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[245903]: [ALERT]    (245909) : Current worker (245912) exited with code 143 (Terminated)
Oct  2 08:35:17 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[245903]: [WARNING]  (245909) : All workers exited. Exiting... (0)
Oct  2 08:35:17 np0005466012 systemd[1]: libpod-a19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27.scope: Deactivated successfully.
Oct  2 08:35:17 np0005466012 podman[246239]: 2025-10-02 12:35:17.591974549 +0000 UTC m=+0.051151200 container died a19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:35:17 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27-userdata-shm.mount: Deactivated successfully.
Oct  2 08:35:17 np0005466012 systemd[1]: var-lib-containers-storage-overlay-8429fd8cf6ceb90c34bd7d799adc9aa95b6067bc0253c0a5db529b102455c50c-merged.mount: Deactivated successfully.
Oct  2 08:35:17 np0005466012 podman[246239]: 2025-10-02 12:35:17.632505795 +0000 UTC m=+0.091682386 container cleanup a19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:35:17 np0005466012 systemd[1]: libpod-conmon-a19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27.scope: Deactivated successfully.
Oct  2 08:35:17 np0005466012 nova_compute[192063]: 2025-10-02 12:35:17.654 2 DEBUG nova.compute.manager [None req-7351cf35-59b9-4b50-8396-f10b7aa937eb 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:17 np0005466012 podman[246284]: 2025-10-02 12:35:17.693132487 +0000 UTC m=+0.038512050 container remove a19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.698 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e6112906-1be6-4d2d-a707-b97bc22aa5aa]: (4, ('Thu Oct  2 12:35:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0 (a19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27)\na19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27\nThu Oct  2 12:35:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0 (a19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27)\na19eef97dc862cae90fb2bcae9b425402a957a4f91f31d3825fb79ec1bf4dd27\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.699 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[308c9837-22ce-41ce-ac83-9a57a84cc2b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.700 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6186c2e-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:17 np0005466012 nova_compute[192063]: 2025-10-02 12:35:17.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005466012 kernel: tapd6186c2e-30: left promiscuous mode
Oct  2 08:35:17 np0005466012 nova_compute[192063]: 2025-10-02 12:35:17.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.719 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4406c6c2-9cea-43e4-bb89-a2cf36e7b797]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.747 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[46d91e53-efbc-4491-a922-422e8101f141]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.748 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a00d93-2924-4ad7-b40d-215ab7dde621]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.760 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[be0fe7c8-ad85-46dc-b2ea-3a26e4ec91fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 649130, 'reachable_time': 17291, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246304, 'error': None, 'target': 'ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.762 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:35:17 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:17.762 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[a6dd74fe-b054-427e-9697-c12faea84ba8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:17 np0005466012 systemd[1]: run-netns-ovnmeta\x2dd6186c2e\x2d33cd\x2d4f99\x2d8140\x2dbddbba9d07d0.mount: Deactivated successfully.
Oct  2 08:35:17 np0005466012 nova_compute[192063]: 2025-10-02 12:35:17.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:17 np0005466012 nova_compute[192063]: 2025-10-02 12:35:17.987 2 DEBUG nova.compute.manager [req-8b8cf587-2e18-43ca-966e-5070103e8e7c req-cb390e34-f3cb-4376-8b97-d0a21175f066 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received event network-vif-unplugged-82858700-6e07-4f6e-b7ae-45a35721505d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:17 np0005466012 nova_compute[192063]: 2025-10-02 12:35:17.988 2 DEBUG oslo_concurrency.lockutils [req-8b8cf587-2e18-43ca-966e-5070103e8e7c req-cb390e34-f3cb-4376-8b97-d0a21175f066 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:17 np0005466012 nova_compute[192063]: 2025-10-02 12:35:17.988 2 DEBUG oslo_concurrency.lockutils [req-8b8cf587-2e18-43ca-966e-5070103e8e7c req-cb390e34-f3cb-4376-8b97-d0a21175f066 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:17 np0005466012 nova_compute[192063]: 2025-10-02 12:35:17.988 2 DEBUG oslo_concurrency.lockutils [req-8b8cf587-2e18-43ca-966e-5070103e8e7c req-cb390e34-f3cb-4376-8b97-d0a21175f066 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:17 np0005466012 nova_compute[192063]: 2025-10-02 12:35:17.988 2 DEBUG nova.compute.manager [req-8b8cf587-2e18-43ca-966e-5070103e8e7c req-cb390e34-f3cb-4376-8b97-d0a21175f066 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] No waiting events found dispatching network-vif-unplugged-82858700-6e07-4f6e-b7ae-45a35721505d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:17 np0005466012 nova_compute[192063]: 2025-10-02 12:35:17.989 2 WARNING nova.compute.manager [req-8b8cf587-2e18-43ca-966e-5070103e8e7c req-cb390e34-f3cb-4376-8b97-d0a21175f066 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received unexpected event network-vif-unplugged-82858700-6e07-4f6e-b7ae-45a35721505d for instance with vm_state suspended and task_state None.#033[00m
Oct  2 08:35:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:19.289 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:19 np0005466012 nova_compute[192063]: 2025-10-02 12:35:19.969 2 INFO nova.compute.manager [None req-072a8d90-575b-440b-810c-9b5001e3fe09 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Get console output#033[00m
Oct  2 08:35:20 np0005466012 nova_compute[192063]: 2025-10-02 12:35:20.086 2 DEBUG nova.compute.manager [req-fc70553b-6020-4cce-840e-96f643263a0f req-16e50095-96d2-4e80-8c99-ce0a097cc452 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received event network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:20 np0005466012 nova_compute[192063]: 2025-10-02 12:35:20.087 2 DEBUG oslo_concurrency.lockutils [req-fc70553b-6020-4cce-840e-96f643263a0f req-16e50095-96d2-4e80-8c99-ce0a097cc452 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:20 np0005466012 nova_compute[192063]: 2025-10-02 12:35:20.087 2 DEBUG oslo_concurrency.lockutils [req-fc70553b-6020-4cce-840e-96f643263a0f req-16e50095-96d2-4e80-8c99-ce0a097cc452 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:20 np0005466012 nova_compute[192063]: 2025-10-02 12:35:20.087 2 DEBUG oslo_concurrency.lockutils [req-fc70553b-6020-4cce-840e-96f643263a0f req-16e50095-96d2-4e80-8c99-ce0a097cc452 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:20 np0005466012 nova_compute[192063]: 2025-10-02 12:35:20.087 2 DEBUG nova.compute.manager [req-fc70553b-6020-4cce-840e-96f643263a0f req-16e50095-96d2-4e80-8c99-ce0a097cc452 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] No waiting events found dispatching network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:20 np0005466012 nova_compute[192063]: 2025-10-02 12:35:20.087 2 WARNING nova.compute.manager [req-fc70553b-6020-4cce-840e-96f643263a0f req-16e50095-96d2-4e80-8c99-ce0a097cc452 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received unexpected event network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d for instance with vm_state suspended and task_state None.#033[00m
Oct  2 08:35:20 np0005466012 nova_compute[192063]: 2025-10-02 12:35:20.311 2 INFO nova.compute.manager [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Resuming#033[00m
Oct  2 08:35:20 np0005466012 nova_compute[192063]: 2025-10-02 12:35:20.311 2 DEBUG nova.objects.instance [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'flavor' on Instance uuid 58a588a8-3fb2-484e-82f6-7c72285d22de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:20 np0005466012 nova_compute[192063]: 2025-10-02 12:35:20.371 2 DEBUG oslo_concurrency.lockutils [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:20 np0005466012 nova_compute[192063]: 2025-10-02 12:35:20.372 2 DEBUG oslo_concurrency.lockutils [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquired lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:20 np0005466012 nova_compute[192063]: 2025-10-02 12:35:20.372 2 DEBUG nova.network.neutron [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.518 2 DEBUG nova.network.neutron [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Updating instance_info_cache with network_info: [{"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.558 2 DEBUG oslo_concurrency.lockutils [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Releasing lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.562 2 DEBUG nova.virt.libvirt.vif [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:34:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1508357503',display_name='tempest-TestNetworkAdvancedServerOps-server-1508357503',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1508357503',id=155,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMh5YVc11xSdLclfb/KTn15iGTsatAtqMZ7/UObWZZ5Nty3g3yyO/+DJDP7MGesQP/RWjM47g+iXThApVJzS5WKw8zhlW4lZ6XTzYMT3T509KEtu4Fz7GVvmbxDEgScjfw==',key_name='tempest-TestNetworkAdvancedServerOps-1292664354',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:34:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-yyzzomqd',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:35:17Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=58a588a8-3fb2-484e-82f6-7c72285d22de,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.563 2 DEBUG nova.network.os_vif_util [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.563 2 DEBUG nova.network.os_vif_util [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=82858700-6e07-4f6e-b7ae-45a35721505d,network=Network(d6186c2e-33cd-4f99-8140-bddbba9d07d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82858700-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.564 2 DEBUG os_vif [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=82858700-6e07-4f6e-b7ae-45a35721505d,network=Network(d6186c2e-33cd-4f99-8140-bddbba9d07d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82858700-6e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.564 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.565 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.567 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82858700-6e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.568 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82858700-6e, col_values=(('external_ids', {'iface-id': '82858700-6e07-4f6e-b7ae-45a35721505d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:ce:fa', 'vm-uuid': '58a588a8-3fb2-484e-82f6-7c72285d22de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.568 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.568 2 INFO os_vif [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=82858700-6e07-4f6e-b7ae-45a35721505d,network=Network(d6186c2e-33cd-4f99-8140-bddbba9d07d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82858700-6e')#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.586 2 DEBUG nova.objects.instance [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'numa_topology' on Instance uuid 58a588a8-3fb2-484e-82f6-7c72285d22de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:21 np0005466012 kernel: tap82858700-6e: entered promiscuous mode
Oct  2 08:35:21 np0005466012 NetworkManager[51207]: <info>  [1759408521.6491] manager: (tap82858700-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/296)
Oct  2 08:35:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:21Z|00632|binding|INFO|Claiming lport 82858700-6e07-4f6e-b7ae-45a35721505d for this chassis.
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:21Z|00633|binding|INFO|82858700-6e07-4f6e-b7ae-45a35721505d: Claiming fa:16:3e:77:ce:fa 10.100.0.9
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.660 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:ce:fa 10.100.0.9'], port_security=['fa:16:3e:77:ce:fa 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'cb51de21-4073-4a91-9994-dd0124090f6b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.197'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=485509cf-159a-4a14-9aa8-dc0cdb38eb16, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=82858700-6e07-4f6e-b7ae-45a35721505d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.662 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 82858700-6e07-4f6e-b7ae-45a35721505d in datapath d6186c2e-33cd-4f99-8140-bddbba9d07d0 bound to our chassis#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.663 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6186c2e-33cd-4f99-8140-bddbba9d07d0#033[00m
Oct  2 08:35:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:21Z|00634|binding|INFO|Setting lport 82858700-6e07-4f6e-b7ae-45a35721505d ovn-installed in OVS
Oct  2 08:35:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:21Z|00635|binding|INFO|Setting lport 82858700-6e07-4f6e-b7ae-45a35721505d up in Southbound
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.675 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4f6b5abd-e079-4e48-9733-152504515e21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.676 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6186c2e-31 in ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.677 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6186c2e-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.678 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[04398a55-c2a4-4bfb-903c-0130e63c2eee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 systemd-udevd[246320]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.679 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cfd4c423-afaf-4063-987f-b4472f45bac8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 NetworkManager[51207]: <info>  [1759408521.6900] device (tap82858700-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:35:21 np0005466012 NetworkManager[51207]: <info>  [1759408521.6910] device (tap82858700-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.690 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[39098368-3c0b-42fe-9ef5-10e7904cc0b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 systemd-machined[152114]: New machine qemu-73-instance-0000009b.
Oct  2 08:35:21 np0005466012 systemd[1]: Started Virtual Machine qemu-73-instance-0000009b.
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.714 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[21d5cb24-f14f-405a-b2a8-4ae11e89cc55]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.741 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[7cdb6165-16bb-43e1-bff2-70cf3a5c4bc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 NetworkManager[51207]: <info>  [1759408521.7472] manager: (tapd6186c2e-30): new Veth device (/org/freedesktop/NetworkManager/Devices/297)
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.745 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f9ea1cf0-fc33-4b28-a827-fbd5072a40d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 systemd-udevd[246325]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.774 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[95c7c313-6d4c-4418-896b-437d08e42bed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.776 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[bf839ef3-0641-4e80-831a-9eede3af5373]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 NetworkManager[51207]: <info>  [1759408521.7969] device (tapd6186c2e-30): carrier: link connected
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.801 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[ce37359a-3c9b-4025-991b-369024976ec6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.818 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6a17b2f7-1a23-479f-a644-4b658edd6d10]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6186c2e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:84:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651540, 'reachable_time': 27952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246354, 'error': None, 'target': 'ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.835 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7a892f-5fcb-4eb8-a688-d8ca0bf9a7f9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:841b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 651540, 'tstamp': 651540}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246355, 'error': None, 'target': 'ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.855 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c8a2f61b-d48a-40b3-a34e-1500e7a32ca0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6186c2e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:84:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651540, 'reachable_time': 27952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246358, 'error': None, 'target': 'ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.885 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[1b280fb1-3803-46e6-b2cf-a4b44ef1b68c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.958 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7a9605c7-c091-45c9-a720-03bd66521b9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.959 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6186c2e-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.959 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.960 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6186c2e-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:21 np0005466012 NetworkManager[51207]: <info>  [1759408521.9626] manager: (tapd6186c2e-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:21 np0005466012 kernel: tapd6186c2e-30: entered promiscuous mode
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.964 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6186c2e-30, col_values=(('external_ids', {'iface-id': '191808e4-38c4-4099-82ce-c40eaf416444'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:21Z|00636|binding|INFO|Releasing lport 191808e4-38c4-4099-82ce-c40eaf416444 from this chassis (sb_readonly=0)
Oct  2 08:35:21 np0005466012 nova_compute[192063]: 2025-10-02 12:35:21.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.984 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6186c2e-33cd-4f99-8140-bddbba9d07d0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6186c2e-33cd-4f99-8140-bddbba9d07d0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.985 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bf12ce90-9eb4-465b-8e56-a90b96b8cc5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.986 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-d6186c2e-33cd-4f99-8140-bddbba9d07d0
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/d6186c2e-33cd-4f99-8140-bddbba9d07d0.pid.haproxy
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID d6186c2e-33cd-4f99-8140-bddbba9d07d0
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:35:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:21.986 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'env', 'PROCESS_TAG=haproxy-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6186c2e-33cd-4f99-8140-bddbba9d07d0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.229 2 DEBUG nova.compute.manager [req-d5826fb1-902c-4c8b-bf5f-9fa74f6b042f req-ca1b8d05-1adb-42b0-aff9-45ee34bd250c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received event network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.229 2 DEBUG oslo_concurrency.lockutils [req-d5826fb1-902c-4c8b-bf5f-9fa74f6b042f req-ca1b8d05-1adb-42b0-aff9-45ee34bd250c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.229 2 DEBUG oslo_concurrency.lockutils [req-d5826fb1-902c-4c8b-bf5f-9fa74f6b042f req-ca1b8d05-1adb-42b0-aff9-45ee34bd250c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.229 2 DEBUG oslo_concurrency.lockutils [req-d5826fb1-902c-4c8b-bf5f-9fa74f6b042f req-ca1b8d05-1adb-42b0-aff9-45ee34bd250c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.230 2 DEBUG nova.compute.manager [req-d5826fb1-902c-4c8b-bf5f-9fa74f6b042f req-ca1b8d05-1adb-42b0-aff9-45ee34bd250c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] No waiting events found dispatching network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.230 2 WARNING nova.compute.manager [req-d5826fb1-902c-4c8b-bf5f-9fa74f6b042f req-ca1b8d05-1adb-42b0-aff9-45ee34bd250c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received unexpected event network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.230 2 DEBUG nova.compute.manager [req-d5826fb1-902c-4c8b-bf5f-9fa74f6b042f req-ca1b8d05-1adb-42b0-aff9-45ee34bd250c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received event network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.230 2 DEBUG oslo_concurrency.lockutils [req-d5826fb1-902c-4c8b-bf5f-9fa74f6b042f req-ca1b8d05-1adb-42b0-aff9-45ee34bd250c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.230 2 DEBUG oslo_concurrency.lockutils [req-d5826fb1-902c-4c8b-bf5f-9fa74f6b042f req-ca1b8d05-1adb-42b0-aff9-45ee34bd250c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.230 2 DEBUG oslo_concurrency.lockutils [req-d5826fb1-902c-4c8b-bf5f-9fa74f6b042f req-ca1b8d05-1adb-42b0-aff9-45ee34bd250c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.231 2 DEBUG nova.compute.manager [req-d5826fb1-902c-4c8b-bf5f-9fa74f6b042f req-ca1b8d05-1adb-42b0-aff9-45ee34bd250c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] No waiting events found dispatching network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.231 2 WARNING nova.compute.manager [req-d5826fb1-902c-4c8b-bf5f-9fa74f6b042f req-ca1b8d05-1adb-42b0-aff9-45ee34bd250c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received unexpected event network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d for instance with vm_state suspended and task_state resuming.#033[00m
Oct  2 08:35:22 np0005466012 podman[246395]: 2025-10-02 12:35:22.347138015 +0000 UTC m=+0.046334970 container create 9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:35:22 np0005466012 systemd[1]: Started libpod-conmon-9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e.scope.
Oct  2 08:35:22 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:35:22 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ab34cf7ff7945c1300804190e8020cff35970dcc6c374d4fa4e186e90fe5fb2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:35:22 np0005466012 podman[246395]: 2025-10-02 12:35:22.323954569 +0000 UTC m=+0.023151554 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:35:22 np0005466012 podman[246395]: 2025-10-02 12:35:22.429434214 +0000 UTC m=+0.128631179 container init 9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:35:22 np0005466012 podman[246395]: 2025-10-02 12:35:22.434634109 +0000 UTC m=+0.133831064 container start 9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:35:22 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[246410]: [NOTICE]   (246414) : New worker (246416) forked
Oct  2 08:35:22 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[246410]: [NOTICE]   (246414) : Loading success.
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.462 2 DEBUG nova.virt.libvirt.host [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Removed pending event for 58a588a8-3fb2-484e-82f6-7c72285d22de due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.463 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408522.462317, 58a588a8-3fb2-484e-82f6-7c72285d22de => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.463 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] VM Started (Lifecycle Event)#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.491 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.493 2 DEBUG nova.compute.manager [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.493 2 DEBUG nova.objects.instance [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 58a588a8-3fb2-484e-82f6-7c72285d22de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.497 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.515 2 INFO nova.virt.libvirt.driver [-] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Instance running successfully.#033[00m
Oct  2 08:35:22 np0005466012 virtqemud[191783]: argument unsupported: QEMU guest agent is not configured
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.517 2 DEBUG nova.virt.libvirt.guest [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.517 2 DEBUG nova.compute.manager [None req-b2fc22b7-bfe5-4f24-9dae-23a6f118f9f6 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.519 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.519 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408522.4715517, 58a588a8-3fb2-484e-82f6-7c72285d22de => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.519 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.543 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.545 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.580 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Oct  2 08:35:22 np0005466012 nova_compute[192063]: 2025-10-02 12:35:22.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:24 np0005466012 nova_compute[192063]: 2025-10-02 12:35:24.339 2 INFO nova.compute.manager [None req-43c38031-6d51-43e4-9a5f-d2421e66a6ba 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Get console output#033[00m
Oct  2 08:35:24 np0005466012 nova_compute[192063]: 2025-10-02 12:35:24.347 56 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  2 08:35:24 np0005466012 nova_compute[192063]: 2025-10-02 12:35:24.984 2 DEBUG nova.compute.manager [req-cf5b8024-4d99-46ac-a885-84405eba75bb req-7be77851-5f61-45b5-b49b-6e6a97b204c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received event network-changed-82858700-6e07-4f6e-b7ae-45a35721505d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:24 np0005466012 nova_compute[192063]: 2025-10-02 12:35:24.984 2 DEBUG nova.compute.manager [req-cf5b8024-4d99-46ac-a885-84405eba75bb req-7be77851-5f61-45b5-b49b-6e6a97b204c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Refreshing instance network info cache due to event network-changed-82858700-6e07-4f6e-b7ae-45a35721505d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:35:24 np0005466012 nova_compute[192063]: 2025-10-02 12:35:24.985 2 DEBUG oslo_concurrency.lockutils [req-cf5b8024-4d99-46ac-a885-84405eba75bb req-7be77851-5f61-45b5-b49b-6e6a97b204c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:24 np0005466012 nova_compute[192063]: 2025-10-02 12:35:24.985 2 DEBUG oslo_concurrency.lockutils [req-cf5b8024-4d99-46ac-a885-84405eba75bb req-7be77851-5f61-45b5-b49b-6e6a97b204c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:24 np0005466012 nova_compute[192063]: 2025-10-02 12:35:24.985 2 DEBUG nova.network.neutron [req-cf5b8024-4d99-46ac-a885-84405eba75bb req-7be77851-5f61-45b5-b49b-6e6a97b204c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Refreshing network info cache for port 82858700-6e07-4f6e-b7ae-45a35721505d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.061 2 DEBUG oslo_concurrency.lockutils [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "58a588a8-3fb2-484e-82f6-7c72285d22de" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.062 2 DEBUG oslo_concurrency.lockutils [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.062 2 DEBUG oslo_concurrency.lockutils [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.062 2 DEBUG oslo_concurrency.lockutils [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.062 2 DEBUG oslo_concurrency.lockutils [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.073 2 INFO nova.compute.manager [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Terminating instance#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.090 2 DEBUG nova.compute.manager [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:35:25 np0005466012 kernel: tap82858700-6e (unregistering): left promiscuous mode
Oct  2 08:35:25 np0005466012 NetworkManager[51207]: <info>  [1759408525.1109] device (tap82858700-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:25Z|00637|binding|INFO|Releasing lport 82858700-6e07-4f6e-b7ae-45a35721505d from this chassis (sb_readonly=0)
Oct  2 08:35:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:25Z|00638|binding|INFO|Setting lport 82858700-6e07-4f6e-b7ae-45a35721505d down in Southbound
Oct  2 08:35:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:25Z|00639|binding|INFO|Removing iface tap82858700-6e ovn-installed in OVS
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.149 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:ce:fa 10.100.0.9'], port_security=['fa:16:3e:77:ce:fa 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '58a588a8-3fb2-484e-82f6-7c72285d22de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c7dd40d83e4e3ca71abbebf57921b6', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'cb51de21-4073-4a91-9994-dd0124090f6b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=485509cf-159a-4a14-9aa8-dc0cdb38eb16, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=82858700-6e07-4f6e-b7ae-45a35721505d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.150 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 82858700-6e07-4f6e-b7ae-45a35721505d in datapath d6186c2e-33cd-4f99-8140-bddbba9d07d0 unbound from our chassis#033[00m
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.151 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6186c2e-33cd-4f99-8140-bddbba9d07d0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.152 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c2ff8eb7-5f66-4ae7-910f-83398a21457e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.152 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0 namespace which is not needed anymore#033[00m
Oct  2 08:35:25 np0005466012 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Oct  2 08:35:25 np0005466012 systemd-machined[152114]: Machine qemu-73-instance-0000009b terminated.
Oct  2 08:35:25 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[246410]: [NOTICE]   (246414) : haproxy version is 2.8.14-c23fe91
Oct  2 08:35:25 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[246410]: [NOTICE]   (246414) : path to executable is /usr/sbin/haproxy
Oct  2 08:35:25 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[246410]: [WARNING]  (246414) : Exiting Master process...
Oct  2 08:35:25 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[246410]: [ALERT]    (246414) : Current worker (246416) exited with code 143 (Terminated)
Oct  2 08:35:25 np0005466012 neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0[246410]: [WARNING]  (246414) : All workers exited. Exiting... (0)
Oct  2 08:35:25 np0005466012 systemd[1]: libpod-9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e.scope: Deactivated successfully.
Oct  2 08:35:25 np0005466012 podman[246448]: 2025-10-02 12:35:25.319771898 +0000 UTC m=+0.056627237 container died 9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:35:25 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e-userdata-shm.mount: Deactivated successfully.
Oct  2 08:35:25 np0005466012 systemd[1]: var-lib-containers-storage-overlay-6ab34cf7ff7945c1300804190e8020cff35970dcc6c374d4fa4e186e90fe5fb2-merged.mount: Deactivated successfully.
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.364 2 INFO nova.virt.libvirt.driver [-] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Instance destroyed successfully.#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.365 2 DEBUG nova.objects.instance [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lazy-loading 'resources' on Instance uuid 58a588a8-3fb2-484e-82f6-7c72285d22de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:25 np0005466012 podman[246448]: 2025-10-02 12:35:25.367432884 +0000 UTC m=+0.104288213 container cleanup 9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:35:25 np0005466012 systemd[1]: libpod-conmon-9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e.scope: Deactivated successfully.
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.379 2 DEBUG nova.virt.libvirt.vif [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:34:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1508357503',display_name='tempest-TestNetworkAdvancedServerOps-server-1508357503',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1508357503',id=155,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMh5YVc11xSdLclfb/KTn15iGTsatAtqMZ7/UObWZZ5Nty3g3yyO/+DJDP7MGesQP/RWjM47g+iXThApVJzS5WKw8zhlW4lZ6XTzYMT3T509KEtu4Fz7GVvmbxDEgScjfw==',key_name='tempest-TestNetworkAdvancedServerOps-1292664354',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:34:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c7dd40d83e4e3ca71abbebf57921b6',ramdisk_id='',reservation_id='r-yyzzomqd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-597114071',owner_user_name='tempest-TestNetworkAdvancedServerOps-597114071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:35:22Z,user_data=None,user_id='1faa7e121a0e43ad8cb4ae5b2cfcc6a2',uuid=58a588a8-3fb2-484e-82f6-7c72285d22de,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.380 2 DEBUG nova.network.os_vif_util [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converting VIF {"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.380 2 DEBUG nova.network.os_vif_util [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=82858700-6e07-4f6e-b7ae-45a35721505d,network=Network(d6186c2e-33cd-4f99-8140-bddbba9d07d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82858700-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.381 2 DEBUG os_vif [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=82858700-6e07-4f6e-b7ae-45a35721505d,network=Network(d6186c2e-33cd-4f99-8140-bddbba9d07d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82858700-6e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.383 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82858700-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.388 2 INFO os_vif [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=82858700-6e07-4f6e-b7ae-45a35721505d,network=Network(d6186c2e-33cd-4f99-8140-bddbba9d07d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82858700-6e')#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.388 2 INFO nova.virt.libvirt.driver [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Deleting instance files /var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de_del#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.389 2 INFO nova.virt.libvirt.driver [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Deletion of /var/lib/nova/instances/58a588a8-3fb2-484e-82f6-7c72285d22de_del complete#033[00m
Oct  2 08:35:25 np0005466012 podman[246495]: 2025-10-02 12:35:25.435788995 +0000 UTC m=+0.043922763 container remove 9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.443 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b326bd-a7dc-4b53-9942-1a987a686434]: (4, ('Thu Oct  2 12:35:25 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0 (9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e)\n9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e\nThu Oct  2 12:35:25 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0 (9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e)\n9b5e04fc0c8ddb0d89e90d4c15817cc963e6cce0b38495ff40e8d0366903508e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.445 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[35186e13-3254-4a5a-97cb-46ab7c5aa38e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.446 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6186c2e-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:25 np0005466012 kernel: tapd6186c2e-30: left promiscuous mode
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.453 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a464197b-ad0a-45ff-8eac-a8c16f905f55]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.481 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8cad52d6-e9d8-4046-993e-0226598e00b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.482 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b4ec36bd-85f6-40b6-be94-342e095bc5d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.494 2 INFO nova.compute.manager [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.495 2 DEBUG oslo.service.loopingcall [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.496 2 DEBUG nova.compute.manager [-] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:35:25 np0005466012 nova_compute[192063]: 2025-10-02 12:35:25.496 2 DEBUG nova.network.neutron [-] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.508 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7963fe0e-7a0a-40bc-bec1-62fc5fd66291]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651534, 'reachable_time': 29747, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246510, 'error': None, 'target': 'ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.510 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6186c2e-33cd-4f99-8140-bddbba9d07d0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:35:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:25.510 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[f37a512e-233d-49a9-b015-dbabd387c694]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:25 np0005466012 systemd[1]: run-netns-ovnmeta\x2dd6186c2e\x2d33cd\x2d4f99\x2d8140\x2dbddbba9d07d0.mount: Deactivated successfully.
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.181 2 DEBUG nova.network.neutron [-] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.209 2 INFO nova.compute.manager [-] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Took 0.71 seconds to deallocate network for instance.#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.309 2 DEBUG nova.compute.manager [req-fc0a832e-756d-4aea-b94f-51bc531a75fd req-27fbf746-f2fe-4a6f-88e9-78e4bbdaa793 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received event network-vif-deleted-82858700-6e07-4f6e-b7ae-45a35721505d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.320 2 DEBUG oslo_concurrency.lockutils [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.321 2 DEBUG oslo_concurrency.lockutils [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.410 2 DEBUG nova.compute.provider_tree [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.415 2 DEBUG nova.network.neutron [req-cf5b8024-4d99-46ac-a885-84405eba75bb req-7be77851-5f61-45b5-b49b-6e6a97b204c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Updated VIF entry in instance network info cache for port 82858700-6e07-4f6e-b7ae-45a35721505d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.416 2 DEBUG nova.network.neutron [req-cf5b8024-4d99-46ac-a885-84405eba75bb req-7be77851-5f61-45b5-b49b-6e6a97b204c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Updating instance_info_cache with network_info: [{"id": "82858700-6e07-4f6e-b7ae-45a35721505d", "address": "fa:16:3e:77:ce:fa", "network": {"id": "d6186c2e-33cd-4f99-8140-bddbba9d07d0", "bridge": "br-int", "label": "tempest-network-smoke--1706170151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c7dd40d83e4e3ca71abbebf57921b6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82858700-6e", "ovs_interfaceid": "82858700-6e07-4f6e-b7ae-45a35721505d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.456 2 DEBUG nova.scheduler.client.report [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.464 2 DEBUG oslo_concurrency.lockutils [req-cf5b8024-4d99-46ac-a885-84405eba75bb req-7be77851-5f61-45b5-b49b-6e6a97b204c0 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-58a588a8-3fb2-484e-82f6-7c72285d22de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.493 2 DEBUG oslo_concurrency.lockutils [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.498 2 DEBUG nova.compute.manager [req-8e3e4a8b-182a-4eae-8282-22786296bae1 req-91cf6e10-14fc-4ce1-9598-247a527c7b50 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received event network-vif-unplugged-82858700-6e07-4f6e-b7ae-45a35721505d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.498 2 DEBUG oslo_concurrency.lockutils [req-8e3e4a8b-182a-4eae-8282-22786296bae1 req-91cf6e10-14fc-4ce1-9598-247a527c7b50 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.498 2 DEBUG oslo_concurrency.lockutils [req-8e3e4a8b-182a-4eae-8282-22786296bae1 req-91cf6e10-14fc-4ce1-9598-247a527c7b50 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.498 2 DEBUG oslo_concurrency.lockutils [req-8e3e4a8b-182a-4eae-8282-22786296bae1 req-91cf6e10-14fc-4ce1-9598-247a527c7b50 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.499 2 DEBUG nova.compute.manager [req-8e3e4a8b-182a-4eae-8282-22786296bae1 req-91cf6e10-14fc-4ce1-9598-247a527c7b50 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] No waiting events found dispatching network-vif-unplugged-82858700-6e07-4f6e-b7ae-45a35721505d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.499 2 WARNING nova.compute.manager [req-8e3e4a8b-182a-4eae-8282-22786296bae1 req-91cf6e10-14fc-4ce1-9598-247a527c7b50 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received unexpected event network-vif-unplugged-82858700-6e07-4f6e-b7ae-45a35721505d for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.499 2 DEBUG nova.compute.manager [req-8e3e4a8b-182a-4eae-8282-22786296bae1 req-91cf6e10-14fc-4ce1-9598-247a527c7b50 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received event network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.499 2 DEBUG oslo_concurrency.lockutils [req-8e3e4a8b-182a-4eae-8282-22786296bae1 req-91cf6e10-14fc-4ce1-9598-247a527c7b50 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.499 2 DEBUG oslo_concurrency.lockutils [req-8e3e4a8b-182a-4eae-8282-22786296bae1 req-91cf6e10-14fc-4ce1-9598-247a527c7b50 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.500 2 DEBUG oslo_concurrency.lockutils [req-8e3e4a8b-182a-4eae-8282-22786296bae1 req-91cf6e10-14fc-4ce1-9598-247a527c7b50 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.500 2 DEBUG nova.compute.manager [req-8e3e4a8b-182a-4eae-8282-22786296bae1 req-91cf6e10-14fc-4ce1-9598-247a527c7b50 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] No waiting events found dispatching network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.500 2 WARNING nova.compute.manager [req-8e3e4a8b-182a-4eae-8282-22786296bae1 req-91cf6e10-14fc-4ce1-9598-247a527c7b50 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Received unexpected event network-vif-plugged-82858700-6e07-4f6e-b7ae-45a35721505d for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.518 2 INFO nova.scheduler.client.report [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Deleted allocations for instance 58a588a8-3fb2-484e-82f6-7c72285d22de#033[00m
Oct  2 08:35:26 np0005466012 nova_compute[192063]: 2025-10-02 12:35:26.613 2 DEBUG oslo_concurrency.lockutils [None req-fbbb01ab-29dd-470c-b3f6-e07a83b83b9e 1faa7e121a0e43ad8cb4ae5b2cfcc6a2 76c7dd40d83e4e3ca71abbebf57921b6 - - default default] Lock "58a588a8-3fb2-484e-82f6-7c72285d22de" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:27 np0005466012 nova_compute[192063]: 2025-10-02 12:35:27.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:28 np0005466012 podman[246512]: 2025-10-02 12:35:28.154572415 +0000 UTC m=+0.057971694 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Oct  2 08:35:28 np0005466012 podman[246511]: 2025-10-02 12:35:28.155485371 +0000 UTC m=+0.061408720 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:35:28 np0005466012 nova_compute[192063]: 2025-10-02 12:35:28.264 2 DEBUG oslo_concurrency.lockutils [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:28 np0005466012 nova_compute[192063]: 2025-10-02 12:35:28.264 2 DEBUG oslo_concurrency.lockutils [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:28 np0005466012 nova_compute[192063]: 2025-10-02 12:35:28.264 2 INFO nova.compute.manager [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Shelving#033[00m
Oct  2 08:35:28 np0005466012 nova_compute[192063]: 2025-10-02 12:35:28.312 2 DEBUG nova.virt.libvirt.driver [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  2 08:35:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:29Z|00640|binding|INFO|Releasing lport 765813dd-4eb1-46b7-adc3-4b198fc4dbfb from this chassis (sb_readonly=0)
Oct  2 08:35:29 np0005466012 nova_compute[192063]: 2025-10-02 12:35:29.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:30 np0005466012 nova_compute[192063]: 2025-10-02 12:35:30.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:30 np0005466012 kernel: tap37bcb93a-86 (unregistering): left promiscuous mode
Oct  2 08:35:30 np0005466012 NetworkManager[51207]: <info>  [1759408530.4605] device (tap37bcb93a-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:35:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:30Z|00641|binding|INFO|Releasing lport 37bcb93a-8639-42b7-aafd-21f019307d66 from this chassis (sb_readonly=0)
Oct  2 08:35:30 np0005466012 nova_compute[192063]: 2025-10-02 12:35:30.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:30Z|00642|binding|INFO|Setting lport 37bcb93a-8639-42b7-aafd-21f019307d66 down in Southbound
Oct  2 08:35:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:35:30Z|00643|binding|INFO|Removing iface tap37bcb93a-86 ovn-installed in OVS
Oct  2 08:35:30 np0005466012 nova_compute[192063]: 2025-10-02 12:35:30.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.476 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:d9:00 10.100.0.9'], port_security=['fa:16:3e:3b:d9:00 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1d931a6f-0703-4e1f-acfc-b8402834c14d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1f1562c-389b-4488-b13e-0f3594ca916b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '086ee425cb0949ab836e1b3ae489ced0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e9c6c044-9bae-451d-9ac4-f29a1af96360', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23273790-9180-40d0-a3ca-fdfdfd7f3c59, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=37bcb93a-8639-42b7-aafd-21f019307d66) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.477 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 37bcb93a-8639-42b7-aafd-21f019307d66 in datapath a1f1562c-389b-4488-b13e-0f3594ca916b unbound from our chassis#033[00m
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.478 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1f1562c-389b-4488-b13e-0f3594ca916b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.479 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b3fc1f-b4d9-4fe6-bcf2-ef68a11d5249]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.480 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b namespace which is not needed anymore#033[00m
Oct  2 08:35:30 np0005466012 nova_compute[192063]: 2025-10-02 12:35:30.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:30 np0005466012 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000009c.scope: Deactivated successfully.
Oct  2 08:35:30 np0005466012 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000009c.scope: Consumed 14.074s CPU time.
Oct  2 08:35:30 np0005466012 systemd-machined[152114]: Machine qemu-72-instance-0000009c terminated.
Oct  2 08:35:30 np0005466012 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246084]: [NOTICE]   (246088) : haproxy version is 2.8.14-c23fe91
Oct  2 08:35:30 np0005466012 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246084]: [NOTICE]   (246088) : path to executable is /usr/sbin/haproxy
Oct  2 08:35:30 np0005466012 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246084]: [WARNING]  (246088) : Exiting Master process...
Oct  2 08:35:30 np0005466012 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246084]: [ALERT]    (246088) : Current worker (246090) exited with code 143 (Terminated)
Oct  2 08:35:30 np0005466012 neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b[246084]: [WARNING]  (246088) : All workers exited. Exiting... (0)
Oct  2 08:35:30 np0005466012 systemd[1]: libpod-37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42.scope: Deactivated successfully.
Oct  2 08:35:30 np0005466012 podman[246575]: 2025-10-02 12:35:30.606879041 +0000 UTC m=+0.041846265 container died 37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  2 08:35:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42-userdata-shm.mount: Deactivated successfully.
Oct  2 08:35:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay-3276e65bb91dcc9d2d4cddb8128fe787645d491b4eec5777b1ef9e89f989e456-merged.mount: Deactivated successfully.
Oct  2 08:35:30 np0005466012 podman[246575]: 2025-10-02 12:35:30.650880706 +0000 UTC m=+0.085847930 container cleanup 37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:35:30 np0005466012 systemd[1]: libpod-conmon-37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42.scope: Deactivated successfully.
Oct  2 08:35:30 np0005466012 podman[246604]: 2025-10-02 12:35:30.706624517 +0000 UTC m=+0.036489196 container remove 37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.712 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9805490e-55a1-4ff5-baaf-5661104df96c]: (4, ('Thu Oct  2 12:35:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b (37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42)\n37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42\nThu Oct  2 12:35:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b (37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42)\n37ef4b0ac20e010b131189900c372ff9563ecc9d42c78e242e54b5105a108c42\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.713 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[35d01c48-ed9f-4fb3-8841-cd3d36327639]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.714 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1f1562c-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:30 np0005466012 nova_compute[192063]: 2025-10-02 12:35:30.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:30 np0005466012 kernel: tapa1f1562c-30: left promiscuous mode
Oct  2 08:35:30 np0005466012 nova_compute[192063]: 2025-10-02 12:35:30.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.736 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f4ea26f9-cfc5-47d1-8e6c-a3094b23c61b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.758 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d27033c6-ae63-4fdb-926c-634fc9d92b7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.759 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ca690840-37f0-4fb2-b905-3a7e5a6e2c82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.775 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[064408ab-1738-4b3a-9ebb-ea8437878e62]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 649641, 'reachable_time': 37081, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246639, 'error': None, 'target': 'ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.777 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a1f1562c-389b-4488-b13e-0f3594ca916b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:35:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:35:30.777 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[bebde35d-3cdb-4a13-b22f-e5185f95db13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:35:30 np0005466012 systemd[1]: run-netns-ovnmeta\x2da1f1562c\x2d389b\x2d4488\x2db13e\x2d0f3594ca916b.mount: Deactivated successfully.
Oct  2 08:35:31 np0005466012 nova_compute[192063]: 2025-10-02 12:35:31.211 2 DEBUG nova.compute.manager [req-edfa986c-4236-4f43-a83e-03ea1a8ee7e4 req-4ef14eab-28c1-4706-aceb-9a871a58201c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-vif-unplugged-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:31 np0005466012 nova_compute[192063]: 2025-10-02 12:35:31.211 2 DEBUG oslo_concurrency.lockutils [req-edfa986c-4236-4f43-a83e-03ea1a8ee7e4 req-4ef14eab-28c1-4706-aceb-9a871a58201c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:31 np0005466012 nova_compute[192063]: 2025-10-02 12:35:31.211 2 DEBUG oslo_concurrency.lockutils [req-edfa986c-4236-4f43-a83e-03ea1a8ee7e4 req-4ef14eab-28c1-4706-aceb-9a871a58201c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:31 np0005466012 nova_compute[192063]: 2025-10-02 12:35:31.211 2 DEBUG oslo_concurrency.lockutils [req-edfa986c-4236-4f43-a83e-03ea1a8ee7e4 req-4ef14eab-28c1-4706-aceb-9a871a58201c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:31 np0005466012 nova_compute[192063]: 2025-10-02 12:35:31.211 2 DEBUG nova.compute.manager [req-edfa986c-4236-4f43-a83e-03ea1a8ee7e4 req-4ef14eab-28c1-4706-aceb-9a871a58201c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] No waiting events found dispatching network-vif-unplugged-37bcb93a-8639-42b7-aafd-21f019307d66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:31 np0005466012 nova_compute[192063]: 2025-10-02 12:35:31.212 2 WARNING nova.compute.manager [req-edfa986c-4236-4f43-a83e-03ea1a8ee7e4 req-4ef14eab-28c1-4706-aceb-9a871a58201c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received unexpected event network-vif-unplugged-37bcb93a-8639-42b7-aafd-21f019307d66 for instance with vm_state active and task_state shelving.#033[00m
Oct  2 08:35:31 np0005466012 nova_compute[192063]: 2025-10-02 12:35:31.328 2 INFO nova.virt.libvirt.driver [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Instance shutdown successfully after 3 seconds.#033[00m
Oct  2 08:35:31 np0005466012 nova_compute[192063]: 2025-10-02 12:35:31.333 2 INFO nova.virt.libvirt.driver [-] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Instance destroyed successfully.#033[00m
Oct  2 08:35:31 np0005466012 nova_compute[192063]: 2025-10-02 12:35:31.334 2 DEBUG nova.objects.instance [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:31 np0005466012 nova_compute[192063]: 2025-10-02 12:35:31.660 2 INFO nova.virt.libvirt.driver [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Beginning cold snapshot process#033[00m
Oct  2 08:35:31 np0005466012 nova_compute[192063]: 2025-10-02 12:35:31.858 2 DEBUG nova.privsep.utils [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:35:31 np0005466012 nova_compute[192063]: 2025-10-02 12:35:31.859 2 DEBUG oslo_concurrency.processutils [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk /var/lib/nova/instances/snapshots/tmpo274v9q1/adbb3643d86c41c68418c419d3e7da37 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:32 np0005466012 podman[246650]: 2025-10-02 12:35:32.167107449 +0000 UTC m=+0.071764318 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  2 08:35:32 np0005466012 podman[246651]: 2025-10-02 12:35:32.186978282 +0000 UTC m=+0.091762044 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:35:32 np0005466012 nova_compute[192063]: 2025-10-02 12:35:32.240 2 DEBUG oslo_concurrency.processutils [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d/disk /var/lib/nova/instances/snapshots/tmpo274v9q1/adbb3643d86c41c68418c419d3e7da37" returned: 0 in 0.382s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:32 np0005466012 nova_compute[192063]: 2025-10-02 12:35:32.242 2 INFO nova.virt.libvirt.driver [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:35:32 np0005466012 nova_compute[192063]: 2025-10-02 12:35:32.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:33 np0005466012 nova_compute[192063]: 2025-10-02 12:35:33.348 2 DEBUG nova.compute.manager [req-074448c7-35d2-4f9a-a58e-02506f44f779 req-0805c932-3dd7-499b-bf85-aaa1df962880 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:33 np0005466012 nova_compute[192063]: 2025-10-02 12:35:33.349 2 DEBUG oslo_concurrency.lockutils [req-074448c7-35d2-4f9a-a58e-02506f44f779 req-0805c932-3dd7-499b-bf85-aaa1df962880 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:33 np0005466012 nova_compute[192063]: 2025-10-02 12:35:33.350 2 DEBUG oslo_concurrency.lockutils [req-074448c7-35d2-4f9a-a58e-02506f44f779 req-0805c932-3dd7-499b-bf85-aaa1df962880 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:33 np0005466012 nova_compute[192063]: 2025-10-02 12:35:33.350 2 DEBUG oslo_concurrency.lockutils [req-074448c7-35d2-4f9a-a58e-02506f44f779 req-0805c932-3dd7-499b-bf85-aaa1df962880 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:33 np0005466012 nova_compute[192063]: 2025-10-02 12:35:33.350 2 DEBUG nova.compute.manager [req-074448c7-35d2-4f9a-a58e-02506f44f779 req-0805c932-3dd7-499b-bf85-aaa1df962880 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] No waiting events found dispatching network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:35:33 np0005466012 nova_compute[192063]: 2025-10-02 12:35:33.351 2 WARNING nova.compute.manager [req-074448c7-35d2-4f9a-a58e-02506f44f779 req-0805c932-3dd7-499b-bf85-aaa1df962880 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received unexpected event network-vif-plugged-37bcb93a-8639-42b7-aafd-21f019307d66 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Oct  2 08:35:33 np0005466012 nova_compute[192063]: 2025-10-02 12:35:33.832 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:34 np0005466012 nova_compute[192063]: 2025-10-02 12:35:34.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:35 np0005466012 nova_compute[192063]: 2025-10-02 12:35:35.030 2 INFO nova.virt.libvirt.driver [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Snapshot image upload complete#033[00m
Oct  2 08:35:35 np0005466012 nova_compute[192063]: 2025-10-02 12:35:35.031 2 DEBUG nova.compute.manager [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:35 np0005466012 nova_compute[192063]: 2025-10-02 12:35:35.127 2 INFO nova.compute.manager [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Shelve offloading#033[00m
Oct  2 08:35:35 np0005466012 nova_compute[192063]: 2025-10-02 12:35:35.147 2 INFO nova.virt.libvirt.driver [-] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Instance destroyed successfully.#033[00m
Oct  2 08:35:35 np0005466012 nova_compute[192063]: 2025-10-02 12:35:35.148 2 DEBUG nova.compute.manager [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:35 np0005466012 nova_compute[192063]: 2025-10-02 12:35:35.150 2 DEBUG oslo_concurrency.lockutils [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:35 np0005466012 nova_compute[192063]: 2025-10-02 12:35:35.150 2 DEBUG oslo_concurrency.lockutils [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquired lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:35 np0005466012 nova_compute[192063]: 2025-10-02 12:35:35.150 2 DEBUG nova.network.neutron [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:35:35 np0005466012 nova_compute[192063]: 2025-10-02 12:35:35.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:37 np0005466012 nova_compute[192063]: 2025-10-02 12:35:37.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:37 np0005466012 nova_compute[192063]: 2025-10-02 12:35:37.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:38 np0005466012 nova_compute[192063]: 2025-10-02 12:35:38.616 2 DEBUG nova.network.neutron [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updating instance_info_cache with network_info: [{"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:38 np0005466012 nova_compute[192063]: 2025-10-02 12:35:38.670 2 DEBUG oslo_concurrency.lockutils [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Releasing lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:40 np0005466012 nova_compute[192063]: 2025-10-02 12:35:40.362 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408525.360792, 58a588a8-3fb2-484e-82f6-7c72285d22de => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:40 np0005466012 nova_compute[192063]: 2025-10-02 12:35:40.363 2 INFO nova.compute.manager [-] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:35:40 np0005466012 nova_compute[192063]: 2025-10-02 12:35:40.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:40 np0005466012 nova_compute[192063]: 2025-10-02 12:35:40.405 2 DEBUG nova.compute.manager [None req-4abe5ba1-11ba-47ce-b6f8-ae98a0fc824a - - - - - -] [instance: 58a588a8-3fb2-484e-82f6-7c72285d22de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.046 2 INFO nova.virt.libvirt.driver [-] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Instance destroyed successfully.#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.047 2 DEBUG nova.objects.instance [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lazy-loading 'resources' on Instance uuid 1d931a6f-0703-4e1f-acfc-b8402834c14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.064 2 DEBUG nova.virt.libvirt.vif [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:34:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1978368192',display_name='tempest-TestShelveInstance-server-1978368192',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1978368192',id=156,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE/XDGeJT9WLDU0HFLzjGkDPYXYzNnVTkBgasq3A2D12H9O5maW7G09qXMMNOwpxQcY9ezmdK5YuMVeh5Lmhul8cAhXsU4OmdH86TOpc/q67Xul+dL/ucyqS3TKQHf5rEA==',key_name='tempest-TestShelveInstance-537086882',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:35:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='086ee425cb0949ab836e1b3ae489ced0',ramdisk_id='',reservation_id='r-a0s1hayn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1329865483',owner_user_name='tempest-TestShelveInstance-1329865483-project-member',shelved_at='2025-10-02T12:35:35.031233',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='5afe2b05-02a1-45ef-8376-c5d63a6eac1b'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:35:32Z,user_data=None,user_id='81e456ca7bee486181b9c11ddb1f3ffd',uuid=1d931a6f-0703-4e1f-acfc-b8402834c14d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.065 2 DEBUG nova.network.os_vif_util [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Converting VIF {"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": "br-int", "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcb93a-86", "ovs_interfaceid": "37bcb93a-8639-42b7-aafd-21f019307d66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.065 2 DEBUG nova.network.os_vif_util [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.066 2 DEBUG os_vif [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.067 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37bcb93a-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.134 2 INFO os_vif [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:d9:00,bridge_name='br-int',has_traffic_filtering=True,id=37bcb93a-8639-42b7-aafd-21f019307d66,network=Network(a1f1562c-389b-4488-b13e-0f3594ca916b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcb93a-86')#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.135 2 INFO nova.virt.libvirt.driver [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Deleting instance files /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d_del#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.140 2 INFO nova.virt.libvirt.driver [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Deletion of /var/lib/nova/instances/1d931a6f-0703-4e1f-acfc-b8402834c14d_del complete#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.300 2 INFO nova.scheduler.client.report [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Deleted allocations for instance 1d931a6f-0703-4e1f-acfc-b8402834c14d#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.357 2 DEBUG nova.compute.manager [req-d65e9129-4683-445d-b63b-19769638a234 req-89e4239a-0be1-4c55-86bf-71d1e93509af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Received event network-changed-37bcb93a-8639-42b7-aafd-21f019307d66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.358 2 DEBUG nova.compute.manager [req-d65e9129-4683-445d-b63b-19769638a234 req-89e4239a-0be1-4c55-86bf-71d1e93509af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Refreshing instance network info cache due to event network-changed-37bcb93a-8639-42b7-aafd-21f019307d66. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.358 2 DEBUG oslo_concurrency.lockutils [req-d65e9129-4683-445d-b63b-19769638a234 req-89e4239a-0be1-4c55-86bf-71d1e93509af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.358 2 DEBUG oslo_concurrency.lockutils [req-d65e9129-4683-445d-b63b-19769638a234 req-89e4239a-0be1-4c55-86bf-71d1e93509af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.358 2 DEBUG nova.network.neutron [req-d65e9129-4683-445d-b63b-19769638a234 req-89e4239a-0be1-4c55-86bf-71d1e93509af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Refreshing network info cache for port 37bcb93a-8639-42b7-aafd-21f019307d66 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.405 2 DEBUG oslo_concurrency.lockutils [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.405 2 DEBUG oslo_concurrency.lockutils [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.448 2 DEBUG nova.compute.provider_tree [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.466 2 DEBUG nova.scheduler.client.report [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.511 2 DEBUG oslo_concurrency.lockutils [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.603 2 DEBUG oslo_concurrency.lockutils [None req-7f81f071-7317-4e24-aeeb-14f4116a09fb 81e456ca7bee486181b9c11ddb1f3ffd 086ee425cb0949ab836e1b3ae489ced0 - - default default] Lock "1d931a6f-0703-4e1f-acfc-b8402834c14d" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 13.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:41 np0005466012 nova_compute[192063]: 2025-10-02 12:35:41.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:42 np0005466012 nova_compute[192063]: 2025-10-02 12:35:42.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:44 np0005466012 nova_compute[192063]: 2025-10-02 12:35:44.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:44 np0005466012 nova_compute[192063]: 2025-10-02 12:35:44.863 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:44 np0005466012 nova_compute[192063]: 2025-10-02 12:35:44.864 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:44 np0005466012 nova_compute[192063]: 2025-10-02 12:35:44.864 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:44 np0005466012 nova_compute[192063]: 2025-10-02 12:35:44.864 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:35:44 np0005466012 nova_compute[192063]: 2025-10-02 12:35:44.963 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "eca0cfe5-4225-401d-b0b5-04244de9913a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:44 np0005466012 nova_compute[192063]: 2025-10-02 12:35:44.965 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:44 np0005466012 nova_compute[192063]: 2025-10-02 12:35:44.980 2 DEBUG nova.compute.manager [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.038 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.039 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5676MB free_disk=73.24301528930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.040 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.040 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.266 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance eca0cfe5-4225-401d-b0b5-04244de9913a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.266 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.266 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.280 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.362 2 DEBUG nova.network.neutron [req-d65e9129-4683-445d-b63b-19769638a234 req-89e4239a-0be1-4c55-86bf-71d1e93509af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updated VIF entry in instance network info cache for port 37bcb93a-8639-42b7-aafd-21f019307d66. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.364 2 DEBUG nova.network.neutron [req-d65e9129-4683-445d-b63b-19769638a234 req-89e4239a-0be1-4c55-86bf-71d1e93509af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Updating instance_info_cache with network_info: [{"id": "37bcb93a-8639-42b7-aafd-21f019307d66", "address": "fa:16:3e:3b:d9:00", "network": {"id": "a1f1562c-389b-4488-b13e-0f3594ca916b", "bridge": null, "label": "tempest-TestShelveInstance-85908978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "086ee425cb0949ab836e1b3ae489ced0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap37bcb93a-86", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.388 2 DEBUG oslo_concurrency.lockutils [req-d65e9129-4683-445d-b63b-19769638a234 req-89e4239a-0be1-4c55-86bf-71d1e93509af 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-1d931a6f-0703-4e1f-acfc-b8402834c14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.430 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.446 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.476 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.476 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.477 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.492 2 DEBUG nova.virt.hardware [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.492 2 INFO nova.compute.claims [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.668 2 DEBUG nova.compute.provider_tree [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.704 2 DEBUG nova.scheduler.client.report [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.737 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.738 2 DEBUG nova.compute.manager [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.744 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408530.742961, 1d931a6f-0703-4e1f-acfc-b8402834c14d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.744 2 INFO nova.compute.manager [-] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.793 2 DEBUG nova.compute.manager [None req-f95dedee-e223-4a33-baf4-8c55ea067530 - - - - - -] [instance: 1d931a6f-0703-4e1f-acfc-b8402834c14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.827 2 DEBUG nova.compute.manager [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.828 2 DEBUG nova.network.neutron [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.866 2 INFO nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:35:45 np0005466012 nova_compute[192063]: 2025-10-02 12:35:45.890 2 DEBUG nova.compute.manager [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.049 2 DEBUG nova.compute.manager [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.050 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.051 2 INFO nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Creating image(s)#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.051 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "/var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.052 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.052 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.064 2 DEBUG oslo_concurrency.processutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.125 2 DEBUG oslo_concurrency.processutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.125 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.126 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.137 2 DEBUG oslo_concurrency.processutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:46 np0005466012 podman[246695]: 2025-10-02 12:35:46.173787795 +0000 UTC m=+0.080749968 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:35:46 np0005466012 podman[246697]: 2025-10-02 12:35:46.193999297 +0000 UTC m=+0.100141527 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.195 2 DEBUG oslo_concurrency.processutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.196 2 DEBUG oslo_concurrency.processutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.227 2 DEBUG oslo_concurrency.processutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.228 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.228 2 DEBUG oslo_concurrency.processutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.282 2 DEBUG oslo_concurrency.processutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.284 2 DEBUG nova.virt.disk.api [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.284 2 DEBUG oslo_concurrency.processutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.339 2 DEBUG oslo_concurrency.processutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.341 2 DEBUG nova.virt.disk.api [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Cannot resize image /var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.341 2 DEBUG nova.objects.instance [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid eca0cfe5-4225-401d-b0b5-04244de9913a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.363 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.364 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Ensure instance console log exists: /var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.365 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.365 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.366 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.477 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.478 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.837 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.838 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:35:46 np0005466012 nova_compute[192063]: 2025-10-02 12:35:46.838 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:35:47 np0005466012 nova_compute[192063]: 2025-10-02 12:35:47.055 2 DEBUG nova.policy [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:35:47 np0005466012 nova_compute[192063]: 2025-10-02 12:35:47.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:48 np0005466012 podman[246763]: 2025-10-02 12:35:48.132616182 +0000 UTC m=+0.045309501 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:35:48 np0005466012 podman[246762]: 2025-10-02 12:35:48.140668266 +0000 UTC m=+0.057284105 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:35:48 np0005466012 nova_compute[192063]: 2025-10-02 12:35:48.924 2 DEBUG nova.network.neutron [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Successfully created port: 10442e43-8bbf-4027-a5d1-25e81de68240 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:35:49 np0005466012 nova_compute[192063]: 2025-10-02 12:35:49.997 2 DEBUG nova.network.neutron [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Successfully created port: 5ccc67ed-7e13-40f0-81ad-953d3b80cb0a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:35:51 np0005466012 nova_compute[192063]: 2025-10-02 12:35:51.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:52 np0005466012 nova_compute[192063]: 2025-10-02 12:35:52.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:52 np0005466012 nova_compute[192063]: 2025-10-02 12:35:52.986 2 DEBUG nova.network.neutron [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Successfully updated port: 10442e43-8bbf-4027-a5d1-25e81de68240 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:35:54 np0005466012 nova_compute[192063]: 2025-10-02 12:35:54.529 2 DEBUG nova.compute.manager [req-13704566-ee92-4a03-b0a6-2f5ffae77a08 req-58fa9426-46d5-4dc2-92f3-c5f25260a1f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-changed-10442e43-8bbf-4027-a5d1-25e81de68240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:54 np0005466012 nova_compute[192063]: 2025-10-02 12:35:54.530 2 DEBUG nova.compute.manager [req-13704566-ee92-4a03-b0a6-2f5ffae77a08 req-58fa9426-46d5-4dc2-92f3-c5f25260a1f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Refreshing instance network info cache due to event network-changed-10442e43-8bbf-4027-a5d1-25e81de68240. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:35:54 np0005466012 nova_compute[192063]: 2025-10-02 12:35:54.530 2 DEBUG oslo_concurrency.lockutils [req-13704566-ee92-4a03-b0a6-2f5ffae77a08 req-58fa9426-46d5-4dc2-92f3-c5f25260a1f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:54 np0005466012 nova_compute[192063]: 2025-10-02 12:35:54.530 2 DEBUG oslo_concurrency.lockutils [req-13704566-ee92-4a03-b0a6-2f5ffae77a08 req-58fa9426-46d5-4dc2-92f3-c5f25260a1f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:54 np0005466012 nova_compute[192063]: 2025-10-02 12:35:54.530 2 DEBUG nova.network.neutron [req-13704566-ee92-4a03-b0a6-2f5ffae77a08 req-58fa9426-46d5-4dc2-92f3-c5f25260a1f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Refreshing network info cache for port 10442e43-8bbf-4027-a5d1-25e81de68240 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:35:54 np0005466012 nova_compute[192063]: 2025-10-02 12:35:54.818 2 DEBUG nova.network.neutron [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Successfully updated port: 5ccc67ed-7e13-40f0-81ad-953d3b80cb0a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:35:54 np0005466012 nova_compute[192063]: 2025-10-02 12:35:54.821 2 DEBUG nova.network.neutron [req-13704566-ee92-4a03-b0a6-2f5ffae77a08 req-58fa9426-46d5-4dc2-92f3-c5f25260a1f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:35:54 np0005466012 nova_compute[192063]: 2025-10-02 12:35:54.830 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:55 np0005466012 nova_compute[192063]: 2025-10-02 12:35:55.973 2 DEBUG nova.network.neutron [req-13704566-ee92-4a03-b0a6-2f5ffae77a08 req-58fa9426-46d5-4dc2-92f3-c5f25260a1f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:35:56 np0005466012 nova_compute[192063]: 2025-10-02 12:35:56.006 2 DEBUG oslo_concurrency.lockutils [req-13704566-ee92-4a03-b0a6-2f5ffae77a08 req-58fa9426-46d5-4dc2-92f3-c5f25260a1f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:35:56 np0005466012 nova_compute[192063]: 2025-10-02 12:35:56.007 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquired lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:35:56 np0005466012 nova_compute[192063]: 2025-10-02 12:35:56.007 2 DEBUG nova.network.neutron [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:35:56 np0005466012 nova_compute[192063]: 2025-10-02 12:35:56.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:56 np0005466012 nova_compute[192063]: 2025-10-02 12:35:56.673 2 DEBUG nova.compute.manager [req-e456c085-13d6-44a0-b1d9-6b1259bb94bb req-199987d0-48f2-411b-a181-572e0bfd868c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-changed-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:35:56 np0005466012 nova_compute[192063]: 2025-10-02 12:35:56.674 2 DEBUG nova.compute.manager [req-e456c085-13d6-44a0-b1d9-6b1259bb94bb req-199987d0-48f2-411b-a181-572e0bfd868c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Refreshing instance network info cache due to event network-changed-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:35:56 np0005466012 nova_compute[192063]: 2025-10-02 12:35:56.675 2 DEBUG oslo_concurrency.lockutils [req-e456c085-13d6-44a0-b1d9-6b1259bb94bb req-199987d0-48f2-411b-a181-572e0bfd868c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:35:57 np0005466012 nova_compute[192063]: 2025-10-02 12:35:57.016 2 DEBUG nova.network.neutron [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:35:57 np0005466012 nova_compute[192063]: 2025-10-02 12:35:57.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:35:59 np0005466012 podman[246799]: 2025-10-02 12:35:59.141922126 +0000 UTC m=+0.052413860 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=edpm, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:35:59 np0005466012 podman[246798]: 2025-10-02 12:35:59.142260665 +0000 UTC m=+0.054990131 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:36:01 np0005466012 nova_compute[192063]: 2025-10-02 12:36:01.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:02.152 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:02.153 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:02.153 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.494 2 DEBUG nova.network.neutron [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Updating instance_info_cache with network_info: [{"id": "10442e43-8bbf-4027-a5d1-25e81de68240", "address": "fa:16:3e:6d:1c:88", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10442e43-8b", "ovs_interfaceid": "10442e43-8bbf-4027-a5d1-25e81de68240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "address": "fa:16:3e:48:2d:ee", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe48:2dee", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ccc67ed-7e", "ovs_interfaceid": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.513 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Releasing lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.514 2 DEBUG nova.compute.manager [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Instance network_info: |[{"id": "10442e43-8bbf-4027-a5d1-25e81de68240", "address": "fa:16:3e:6d:1c:88", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10442e43-8b", "ovs_interfaceid": "10442e43-8bbf-4027-a5d1-25e81de68240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "address": "fa:16:3e:48:2d:ee", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe48:2dee", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ccc67ed-7e", "ovs_interfaceid": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.515 2 DEBUG oslo_concurrency.lockutils [req-e456c085-13d6-44a0-b1d9-6b1259bb94bb req-199987d0-48f2-411b-a181-572e0bfd868c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.515 2 DEBUG nova.network.neutron [req-e456c085-13d6-44a0-b1d9-6b1259bb94bb req-199987d0-48f2-411b-a181-572e0bfd868c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Refreshing network info cache for port 5ccc67ed-7e13-40f0-81ad-953d3b80cb0a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.520 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Start _get_guest_xml network_info=[{"id": "10442e43-8bbf-4027-a5d1-25e81de68240", "address": "fa:16:3e:6d:1c:88", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10442e43-8b", "ovs_interfaceid": "10442e43-8bbf-4027-a5d1-25e81de68240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "address": "fa:16:3e:48:2d:ee", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe48:2dee", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ccc67ed-7e", "ovs_interfaceid": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.525 2 WARNING nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.534 2 DEBUG nova.virt.libvirt.host [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.535 2 DEBUG nova.virt.libvirt.host [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.544 2 DEBUG nova.virt.libvirt.host [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.545 2 DEBUG nova.virt.libvirt.host [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.545 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.546 2 DEBUG nova.virt.hardware [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.546 2 DEBUG nova.virt.hardware [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.547 2 DEBUG nova.virt.hardware [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.547 2 DEBUG nova.virt.hardware [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.547 2 DEBUG nova.virt.hardware [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.547 2 DEBUG nova.virt.hardware [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.547 2 DEBUG nova.virt.hardware [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.548 2 DEBUG nova.virt.hardware [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.548 2 DEBUG nova.virt.hardware [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.548 2 DEBUG nova.virt.hardware [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.548 2 DEBUG nova.virt.hardware [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.551 2 DEBUG nova.virt.libvirt.vif [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:35:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-452628804',display_name='tempest-TestGettingAddress-server-452628804',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-452628804',id=158,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDH5dyE2Fb62QTiQ2NOxenp6JYghL1oq9DicssFh4gYhJeK4nW86De0EFSTiwMTso1Ry0yTYcOXXGoL5CP8YNZFtCow/YELoJPmiA0NwtFRUtcQ5PfX4mPVbmaNTOUOY/A==',key_name='tempest-TestGettingAddress-170538368',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-sq6akkeh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:35:45Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=eca0cfe5-4225-401d-b0b5-04244de9913a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10442e43-8bbf-4027-a5d1-25e81de68240", "address": "fa:16:3e:6d:1c:88", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10442e43-8b", "ovs_interfaceid": "10442e43-8bbf-4027-a5d1-25e81de68240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.552 2 DEBUG nova.network.os_vif_util [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "10442e43-8bbf-4027-a5d1-25e81de68240", "address": "fa:16:3e:6d:1c:88", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10442e43-8b", "ovs_interfaceid": "10442e43-8bbf-4027-a5d1-25e81de68240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.552 2 DEBUG nova.network.os_vif_util [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:1c:88,bridge_name='br-int',has_traffic_filtering=True,id=10442e43-8bbf-4027-a5d1-25e81de68240,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10442e43-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.553 2 DEBUG nova.virt.libvirt.vif [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:35:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-452628804',display_name='tempest-TestGettingAddress-server-452628804',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-452628804',id=158,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDH5dyE2Fb62QTiQ2NOxenp6JYghL1oq9DicssFh4gYhJeK4nW86De0EFSTiwMTso1Ry0yTYcOXXGoL5CP8YNZFtCow/YELoJPmiA0NwtFRUtcQ5PfX4mPVbmaNTOUOY/A==',key_name='tempest-TestGettingAddress-170538368',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-sq6akkeh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:35:45Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=eca0cfe5-4225-401d-b0b5-04244de9913a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "address": "fa:16:3e:48:2d:ee", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe48:2dee", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ccc67ed-7e", "ovs_interfaceid": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.553 2 DEBUG nova.network.os_vif_util [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "address": "fa:16:3e:48:2d:ee", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe48:2dee", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ccc67ed-7e", "ovs_interfaceid": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.554 2 DEBUG nova.network.os_vif_util [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:2d:ee,bridge_name='br-int',has_traffic_filtering=True,id=5ccc67ed-7e13-40f0-81ad-953d3b80cb0a,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ccc67ed-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.554 2 DEBUG nova.objects.instance [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid eca0cfe5-4225-401d-b0b5-04244de9913a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.583 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  <uuid>eca0cfe5-4225-401d-b0b5-04244de9913a</uuid>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  <name>instance-0000009e</name>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestGettingAddress-server-452628804</nova:name>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:36:02</nova:creationTime>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:        <nova:user uuid="97ce9f1898484e0e9a1f7c84a9f0dfe3">tempest-TestGettingAddress-1355720650-project-member</nova:user>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:        <nova:project uuid="fd801958556f4c8aab047ecdef6b5ee8">tempest-TestGettingAddress-1355720650</nova:project>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:        <nova:port uuid="10442e43-8bbf-4027-a5d1-25e81de68240">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:        <nova:port uuid="5ccc67ed-7e13-40f0-81ad-953d3b80cb0a">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe48:2dee" ipVersion="6"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <entry name="serial">eca0cfe5-4225-401d-b0b5-04244de9913a</entry>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <entry name="uuid">eca0cfe5-4225-401d-b0b5-04244de9913a</entry>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk.config"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:6d:1c:88"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <target dev="tap10442e43-8b"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:48:2d:ee"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <target dev="tap5ccc67ed-7e"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/console.log" append="off"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:36:02 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:36:02 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:36:02 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:36:02 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.584 2 DEBUG nova.compute.manager [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Preparing to wait for external event network-vif-plugged-10442e43-8bbf-4027-a5d1-25e81de68240 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.585 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.585 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.586 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.586 2 DEBUG nova.compute.manager [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Preparing to wait for external event network-vif-plugged-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.586 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.586 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.587 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.589 2 DEBUG nova.virt.libvirt.vif [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:35:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-452628804',display_name='tempest-TestGettingAddress-server-452628804',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-452628804',id=158,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDH5dyE2Fb62QTiQ2NOxenp6JYghL1oq9DicssFh4gYhJeK4nW86De0EFSTiwMTso1Ry0yTYcOXXGoL5CP8YNZFtCow/YELoJPmiA0NwtFRUtcQ5PfX4mPVbmaNTOUOY/A==',key_name='tempest-TestGettingAddress-170538368',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-sq6akkeh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:35:45Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=eca0cfe5-4225-401d-b0b5-04244de9913a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10442e43-8bbf-4027-a5d1-25e81de68240", "address": "fa:16:3e:6d:1c:88", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10442e43-8b", "ovs_interfaceid": "10442e43-8bbf-4027-a5d1-25e81de68240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.590 2 DEBUG nova.network.os_vif_util [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "10442e43-8bbf-4027-a5d1-25e81de68240", "address": "fa:16:3e:6d:1c:88", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10442e43-8b", "ovs_interfaceid": "10442e43-8bbf-4027-a5d1-25e81de68240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.590 2 DEBUG nova.network.os_vif_util [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:1c:88,bridge_name='br-int',has_traffic_filtering=True,id=10442e43-8bbf-4027-a5d1-25e81de68240,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10442e43-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.591 2 DEBUG os_vif [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:1c:88,bridge_name='br-int',has_traffic_filtering=True,id=10442e43-8bbf-4027-a5d1-25e81de68240,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10442e43-8b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.592 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.592 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.595 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap10442e43-8b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.596 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap10442e43-8b, col_values=(('external_ids', {'iface-id': '10442e43-8bbf-4027-a5d1-25e81de68240', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:1c:88', 'vm-uuid': 'eca0cfe5-4225-401d-b0b5-04244de9913a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:02 np0005466012 NetworkManager[51207]: <info>  [1759408562.5984] manager: (tap10442e43-8b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/299)
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.603 2 INFO os_vif [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:1c:88,bridge_name='br-int',has_traffic_filtering=True,id=10442e43-8bbf-4027-a5d1-25e81de68240,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10442e43-8b')#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.604 2 DEBUG nova.virt.libvirt.vif [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:35:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-452628804',display_name='tempest-TestGettingAddress-server-452628804',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-452628804',id=158,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDH5dyE2Fb62QTiQ2NOxenp6JYghL1oq9DicssFh4gYhJeK4nW86De0EFSTiwMTso1Ry0yTYcOXXGoL5CP8YNZFtCow/YELoJPmiA0NwtFRUtcQ5PfX4mPVbmaNTOUOY/A==',key_name='tempest-TestGettingAddress-170538368',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-sq6akkeh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:35:45Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=eca0cfe5-4225-401d-b0b5-04244de9913a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "address": "fa:16:3e:48:2d:ee", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe48:2dee", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ccc67ed-7e", "ovs_interfaceid": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.605 2 DEBUG nova.network.os_vif_util [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "address": "fa:16:3e:48:2d:ee", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe48:2dee", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ccc67ed-7e", "ovs_interfaceid": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.606 2 DEBUG nova.network.os_vif_util [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:2d:ee,bridge_name='br-int',has_traffic_filtering=True,id=5ccc67ed-7e13-40f0-81ad-953d3b80cb0a,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ccc67ed-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.606 2 DEBUG os_vif [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:2d:ee,bridge_name='br-int',has_traffic_filtering=True,id=5ccc67ed-7e13-40f0-81ad-953d3b80cb0a,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ccc67ed-7e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.607 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.608 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.610 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5ccc67ed-7e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.610 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5ccc67ed-7e, col_values=(('external_ids', {'iface-id': '5ccc67ed-7e13-40f0-81ad-953d3b80cb0a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:48:2d:ee', 'vm-uuid': 'eca0cfe5-4225-401d-b0b5-04244de9913a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:02 np0005466012 NetworkManager[51207]: <info>  [1759408562.6123] manager: (tap5ccc67ed-7e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.618 2 INFO os_vif [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:2d:ee,bridge_name='br-int',has_traffic_filtering=True,id=5ccc67ed-7e13-40f0-81ad-953d3b80cb0a,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ccc67ed-7e')#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.704 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.704 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.704 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:6d:1c:88, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.704 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:48:2d:ee, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.705 2 INFO nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Using config drive#033[00m
Oct  2 08:36:02 np0005466012 nova_compute[192063]: 2025-10-02 12:36:02.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:03 np0005466012 podman[246844]: 2025-10-02 12:36:03.143371501 +0000 UTC m=+0.060515864 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:36:03 np0005466012 podman[246843]: 2025-10-02 12:36:03.181646506 +0000 UTC m=+0.099592472 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:36:03 np0005466012 nova_compute[192063]: 2025-10-02 12:36:03.255 2 INFO nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Creating config drive at /var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk.config#033[00m
Oct  2 08:36:03 np0005466012 nova_compute[192063]: 2025-10-02 12:36:03.262 2 DEBUG oslo_concurrency.processutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmtjmoevu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:36:03 np0005466012 nova_compute[192063]: 2025-10-02 12:36:03.403 2 DEBUG oslo_concurrency.processutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmtjmoevu" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:36:03 np0005466012 kernel: tap10442e43-8b: entered promiscuous mode
Oct  2 08:36:03 np0005466012 NetworkManager[51207]: <info>  [1759408563.4617] manager: (tap10442e43-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/301)
Oct  2 08:36:03 np0005466012 nova_compute[192063]: 2025-10-02 12:36:03.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:03Z|00644|binding|INFO|Claiming lport 10442e43-8bbf-4027-a5d1-25e81de68240 for this chassis.
Oct  2 08:36:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:03Z|00645|binding|INFO|10442e43-8bbf-4027-a5d1-25e81de68240: Claiming fa:16:3e:6d:1c:88 10.100.0.13
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.477 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:1c:88 10.100.0.13'], port_security=['fa:16:3e:6d:1c:88 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'eca0cfe5-4225-401d-b0b5-04244de9913a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '31537a48-e4ed-4e85-9383-0c91e41b0f96', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4660930-bac7-4d92-b95e-2296da9c1763, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=10442e43-8bbf-4027-a5d1-25e81de68240) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:03 np0005466012 NetworkManager[51207]: <info>  [1759408563.4790] manager: (tap5ccc67ed-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/302)
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.479 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 10442e43-8bbf-4027-a5d1-25e81de68240 in datapath 385e0a9e-c250-418d-8cab-e7e3ae4506c1 bound to our chassis#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.480 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 385e0a9e-c250-418d-8cab-e7e3ae4506c1#033[00m
Oct  2 08:36:03 np0005466012 kernel: tap5ccc67ed-7e: entered promiscuous mode
Oct  2 08:36:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:03Z|00646|binding|INFO|Setting lport 10442e43-8bbf-4027-a5d1-25e81de68240 ovn-installed in OVS
Oct  2 08:36:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:03Z|00647|binding|INFO|Setting lport 10442e43-8bbf-4027-a5d1-25e81de68240 up in Southbound
Oct  2 08:36:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:03Z|00648|if_status|INFO|Not updating pb chassis for 5ccc67ed-7e13-40f0-81ad-953d3b80cb0a now as sb is readonly
Oct  2 08:36:03 np0005466012 nova_compute[192063]: 2025-10-02 12:36:03.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:03 np0005466012 systemd-udevd[246904]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:36:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:03Z|00649|binding|INFO|Claiming lport 5ccc67ed-7e13-40f0-81ad-953d3b80cb0a for this chassis.
Oct  2 08:36:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:03Z|00650|binding|INFO|5ccc67ed-7e13-40f0-81ad-953d3b80cb0a: Claiming fa:16:3e:48:2d:ee 2001:db8::f816:3eff:fe48:2dee
Oct  2 08:36:03 np0005466012 systemd-udevd[246906]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.493 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aeaa7ed4-ed86-4886-862f-2c28c436575a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.494 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap385e0a9e-c1 in ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:36:03 np0005466012 nova_compute[192063]: 2025-10-02 12:36:03.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:03Z|00651|binding|INFO|Setting lport 5ccc67ed-7e13-40f0-81ad-953d3b80cb0a ovn-installed in OVS
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.498 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap385e0a9e-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.498 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[817821e3-bfb9-4a83-bf8c-5f3b5ef44898]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.500 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9c3f3ce4-c485-4cc3-a259-05e1e80552e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:03Z|00652|binding|INFO|Setting lport 5ccc67ed-7e13-40f0-81ad-953d3b80cb0a up in Southbound
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.503 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:2d:ee 2001:db8::f816:3eff:fe48:2dee'], port_security=['fa:16:3e:48:2d:ee 2001:db8::f816:3eff:fe48:2dee'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe48:2dee/64', 'neutron:device_id': 'eca0cfe5-4225-401d-b0b5-04244de9913a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4522b631-3a21-451f-8605-7c2b34273ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '31537a48-e4ed-4e85-9383-0c91e41b0f96', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09de0cb2-1ed0-42b1-8efb-533f84345cc8, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=5ccc67ed-7e13-40f0-81ad-953d3b80cb0a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:03 np0005466012 NetworkManager[51207]: <info>  [1759408563.5107] device (tap5ccc67ed-7e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:36:03 np0005466012 NetworkManager[51207]: <info>  [1759408563.5116] device (tap5ccc67ed-7e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:36:03 np0005466012 NetworkManager[51207]: <info>  [1759408563.5121] device (tap10442e43-8b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:36:03 np0005466012 NetworkManager[51207]: <info>  [1759408563.5129] device (tap10442e43-8b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.512 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[d86f23ff-2326-41b5-80b4-4b2e378bbbd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 systemd-machined[152114]: New machine qemu-74-instance-0000009e.
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.527 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b61a6846-eb4f-4957-a9c3-e8e7aa9c4500]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 systemd[1]: Started Virtual Machine qemu-74-instance-0000009e.
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.557 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[3e650924-ef16-4558-86c6-7ac6e4e5c3e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.562 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[678112dd-a233-45af-ace1-ede73096185b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 NetworkManager[51207]: <info>  [1759408563.5643] manager: (tap385e0a9e-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/303)
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.598 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[c05cb8df-e87a-4700-b2e1-bdf94b664294]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.602 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[67ce8942-ea56-4b4c-bc22-47db7e68f865]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 NetworkManager[51207]: <info>  [1759408563.6286] device (tap385e0a9e-c0): carrier: link connected
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.637 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[fcfb742e-68e1-4190-a21a-ef7c1fb3ad6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.653 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2c2aa883-2675-4e74-b536-e532f833399a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap385e0a9e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f8:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 201], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655724, 'reachable_time': 34139, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246940, 'error': None, 'target': 'ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.671 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[89947fa1-968d-43bb-980a-44c8aba92a08]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb8:f8af'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655724, 'tstamp': 655724}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246941, 'error': None, 'target': 'ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.692 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f28483-6954-47ef-8718-1ed656437857]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap385e0a9e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f8:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 201], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655724, 'reachable_time': 34139, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246942, 'error': None, 'target': 'ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.720 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8ebde8ec-b5e4-462e-a2b3-f7aa61c4acd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.769 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0ddd1947-e91d-4c0e-b432-99ddfd8996de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.770 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap385e0a9e-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.771 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.772 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap385e0a9e-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:03 np0005466012 nova_compute[192063]: 2025-10-02 12:36:03.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:03 np0005466012 NetworkManager[51207]: <info>  [1759408563.7759] manager: (tap385e0a9e-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Oct  2 08:36:03 np0005466012 kernel: tap385e0a9e-c0: entered promiscuous mode
Oct  2 08:36:03 np0005466012 nova_compute[192063]: 2025-10-02 12:36:03.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.778 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap385e0a9e-c0, col_values=(('external_ids', {'iface-id': '9e6245a0-6013-48e5-9e96-a79fafe59b6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:03 np0005466012 nova_compute[192063]: 2025-10-02 12:36:03.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:03 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:03Z|00653|binding|INFO|Releasing lport 9e6245a0-6013-48e5-9e96-a79fafe59b6a from this chassis (sb_readonly=0)
Oct  2 08:36:03 np0005466012 nova_compute[192063]: 2025-10-02 12:36:03.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.800 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/385e0a9e-c250-418d-8cab-e7e3ae4506c1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/385e0a9e-c250-418d-8cab-e7e3ae4506c1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.801 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f6d8be9f-7975-4b30-9fef-9e0544fe6c9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.802 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-385e0a9e-c250-418d-8cab-e7e3ae4506c1
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/385e0a9e-c250-418d-8cab-e7e3ae4506c1.pid.haproxy
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 385e0a9e-c250-418d-8cab-e7e3ae4506c1
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:36:03 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:03.803 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'env', 'PROCESS_TAG=haproxy-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/385e0a9e-c250-418d-8cab-e7e3ae4506c1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:36:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:04Z|00654|binding|INFO|Releasing lport 9e6245a0-6013-48e5-9e96-a79fafe59b6a from this chassis (sb_readonly=0)
Oct  2 08:36:04 np0005466012 podman[246980]: 2025-10-02 12:36:04.181245537 +0000 UTC m=+0.051766052 container create 2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:04 np0005466012 systemd[1]: Started libpod-conmon-2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e.scope.
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.229 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408564.2285416, eca0cfe5-4225-401d-b0b5-04244de9913a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.230 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] VM Started (Lifecycle Event)#033[00m
Oct  2 08:36:04 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:36:04 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/732155d96bf2085c1dfe00cfb69bf5833f4c8d86fa1877fdaf6afe62decdcdde/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:36:04 np0005466012 podman[246980]: 2025-10-02 12:36:04.154236995 +0000 UTC m=+0.024757530 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:36:04 np0005466012 podman[246980]: 2025-10-02 12:36:04.25253343 +0000 UTC m=+0.123053965 container init 2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:36:04 np0005466012 podman[246980]: 2025-10-02 12:36:04.257257122 +0000 UTC m=+0.127777637 container start 2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.261 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.271 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408564.229781, eca0cfe5-4225-401d-b0b5-04244de9913a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.272 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:36:04 np0005466012 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[246995]: [NOTICE]   (246999) : New worker (247001) forked
Oct  2 08:36:04 np0005466012 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[246995]: [NOTICE]   (246999) : Loading success.
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.307 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.310 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.320 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 5ccc67ed-7e13-40f0-81ad-953d3b80cb0a in datapath 4522b631-3a21-451f-8605-7c2b34273ecd unbound from our chassis#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.322 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4522b631-3a21-451f-8605-7c2b34273ecd#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.332 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5e5745d2-1d5b-47a4-9ef1-9bed13d24309]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.333 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4522b631-31 in ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.334 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4522b631-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.334 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[66a5449f-e848-4d7e-8e3b-faf77972ddac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.335 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.335 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[703595a9-806d-4eb4-9caf-b18c02f6952f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.345 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[aeafdf51-8cf1-41f1-8c74-72ad2f004efe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.368 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f1d5ea17-1012-4f89-ac52-0e8d347b5182]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.396 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[07fe606e-660a-4d53-b371-2372dcb3435b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.402 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ac1a12e4-d196-4be9-bab4-96c3df47af15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 NetworkManager[51207]: <info>  [1759408564.4035] manager: (tap4522b631-30): new Veth device (/org/freedesktop/NetworkManager/Devices/305)
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.419 2 DEBUG nova.compute.manager [req-bf4c610a-1bb7-489c-8f5a-fb29adb82c6f req-5540c712-df4e-4c0b-8b1a-e57474b52d78 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-vif-plugged-10442e43-8bbf-4027-a5d1-25e81de68240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.420 2 DEBUG oslo_concurrency.lockutils [req-bf4c610a-1bb7-489c-8f5a-fb29adb82c6f req-5540c712-df4e-4c0b-8b1a-e57474b52d78 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.421 2 DEBUG oslo_concurrency.lockutils [req-bf4c610a-1bb7-489c-8f5a-fb29adb82c6f req-5540c712-df4e-4c0b-8b1a-e57474b52d78 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.421 2 DEBUG oslo_concurrency.lockutils [req-bf4c610a-1bb7-489c-8f5a-fb29adb82c6f req-5540c712-df4e-4c0b-8b1a-e57474b52d78 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.422 2 DEBUG nova.compute.manager [req-bf4c610a-1bb7-489c-8f5a-fb29adb82c6f req-5540c712-df4e-4c0b-8b1a-e57474b52d78 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Processing event network-vif-plugged-10442e43-8bbf-4027-a5d1-25e81de68240 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.431 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[2be8ff8e-ddd7-4ac8-bd94-1794a3c69184]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.434 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[8e14cc59-1c96-42f7-9935-25614c4958b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 NetworkManager[51207]: <info>  [1759408564.4547] device (tap4522b631-30): carrier: link connected
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.460 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[afeb0c2d-41dd-4db6-9ee4-e493e382d490]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.476 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6dd89871-a15c-4b21-9ca4-6ab262776e98]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4522b631-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:e8:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655806, 'reachable_time': 31629, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247020, 'error': None, 'target': 'ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.491 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[605a741f-52e4-4acd-ba85-691da9905ec6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea2:e8b5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655806, 'tstamp': 655806}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247021, 'error': None, 'target': 'ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.509 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d354cc15-326d-481e-bb60-6084849ac53f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4522b631-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:e8:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655806, 'reachable_time': 31629, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247022, 'error': None, 'target': 'ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.539 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[baacaa00-056a-4b5f-9b1b-3160ab67069a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.569 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c1340cb9-da89-42c7-b6af-637e21eea9bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.570 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4522b631-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.571 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.571 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4522b631-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:04 np0005466012 NetworkManager[51207]: <info>  [1759408564.5733] manager: (tap4522b631-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/306)
Oct  2 08:36:04 np0005466012 kernel: tap4522b631-30: entered promiscuous mode
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.577 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4522b631-30, col_values=(('external_ids', {'iface-id': '0965d19d-88e4-4971-9eb7-5bedfed08cdc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:04Z|00655|binding|INFO|Releasing lport 0965d19d-88e4-4971-9eb7-5bedfed08cdc from this chassis (sb_readonly=0)
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.581 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4522b631-3a21-451f-8605-7c2b34273ecd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4522b631-3a21-451f-8605-7c2b34273ecd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.582 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8f1fdb68-dab8-4c5f-bd7a-53ca2bd0a9e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.582 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-4522b631-3a21-451f-8605-7c2b34273ecd
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/4522b631-3a21-451f-8605-7c2b34273ecd.pid.haproxy
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 4522b631-3a21-451f-8605-7c2b34273ecd
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:36:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:04.583 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd', 'env', 'PROCESS_TAG=haproxy-4522b631-3a21-451f-8605-7c2b34273ecd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4522b631-3a21-451f-8605-7c2b34273ecd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:36:04 np0005466012 nova_compute[192063]: 2025-10-02 12:36:04.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:04 np0005466012 podman[247052]: 2025-10-02 12:36:04.921462951 +0000 UTC m=+0.042835603 container create f13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:36:04 np0005466012 systemd[1]: Started libpod-conmon-f13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c.scope.
Oct  2 08:36:04 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:36:04 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e7573689bf8de3b6c10ff7f5f2c01a584f03e9d98d9febba79868321d9a9f3d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:36:04 np0005466012 podman[247052]: 2025-10-02 12:36:04.899036997 +0000 UTC m=+0.020409639 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:36:05 np0005466012 podman[247052]: 2025-10-02 12:36:05.003660007 +0000 UTC m=+0.125032669 container init f13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 08:36:05 np0005466012 podman[247052]: 2025-10-02 12:36:05.008889493 +0000 UTC m=+0.130262125 container start f13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:36:05 np0005466012 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[247067]: [NOTICE]   (247071) : New worker (247073) forked
Oct  2 08:36:05 np0005466012 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[247067]: [NOTICE]   (247071) : Loading success.
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.282 2 DEBUG nova.network.neutron [req-e456c085-13d6-44a0-b1d9-6b1259bb94bb req-199987d0-48f2-411b-a181-572e0bfd868c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Updated VIF entry in instance network info cache for port 5ccc67ed-7e13-40f0-81ad-953d3b80cb0a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.283 2 DEBUG nova.network.neutron [req-e456c085-13d6-44a0-b1d9-6b1259bb94bb req-199987d0-48f2-411b-a181-572e0bfd868c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Updating instance_info_cache with network_info: [{"id": "10442e43-8bbf-4027-a5d1-25e81de68240", "address": "fa:16:3e:6d:1c:88", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10442e43-8b", "ovs_interfaceid": "10442e43-8bbf-4027-a5d1-25e81de68240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "address": "fa:16:3e:48:2d:ee", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe48:2dee", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ccc67ed-7e", "ovs_interfaceid": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.298 2 DEBUG oslo_concurrency.lockutils [req-e456c085-13d6-44a0-b1d9-6b1259bb94bb req-199987d0-48f2-411b-a181-572e0bfd868c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.855 2 DEBUG nova.compute.manager [req-3fd0b61c-e3a0-4720-8404-2f2024a1fd20 req-a70e00d2-78a7-4e75-a4de-c7c8f2e71625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-vif-plugged-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.855 2 DEBUG oslo_concurrency.lockutils [req-3fd0b61c-e3a0-4720-8404-2f2024a1fd20 req-a70e00d2-78a7-4e75-a4de-c7c8f2e71625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.856 2 DEBUG oslo_concurrency.lockutils [req-3fd0b61c-e3a0-4720-8404-2f2024a1fd20 req-a70e00d2-78a7-4e75-a4de-c7c8f2e71625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.856 2 DEBUG oslo_concurrency.lockutils [req-3fd0b61c-e3a0-4720-8404-2f2024a1fd20 req-a70e00d2-78a7-4e75-a4de-c7c8f2e71625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.856 2 DEBUG nova.compute.manager [req-3fd0b61c-e3a0-4720-8404-2f2024a1fd20 req-a70e00d2-78a7-4e75-a4de-c7c8f2e71625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Processing event network-vif-plugged-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.856 2 DEBUG nova.compute.manager [req-3fd0b61c-e3a0-4720-8404-2f2024a1fd20 req-a70e00d2-78a7-4e75-a4de-c7c8f2e71625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-vif-plugged-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.856 2 DEBUG oslo_concurrency.lockutils [req-3fd0b61c-e3a0-4720-8404-2f2024a1fd20 req-a70e00d2-78a7-4e75-a4de-c7c8f2e71625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.857 2 DEBUG oslo_concurrency.lockutils [req-3fd0b61c-e3a0-4720-8404-2f2024a1fd20 req-a70e00d2-78a7-4e75-a4de-c7c8f2e71625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.857 2 DEBUG oslo_concurrency.lockutils [req-3fd0b61c-e3a0-4720-8404-2f2024a1fd20 req-a70e00d2-78a7-4e75-a4de-c7c8f2e71625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.857 2 DEBUG nova.compute.manager [req-3fd0b61c-e3a0-4720-8404-2f2024a1fd20 req-a70e00d2-78a7-4e75-a4de-c7c8f2e71625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] No waiting events found dispatching network-vif-plugged-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.857 2 WARNING nova.compute.manager [req-3fd0b61c-e3a0-4720-8404-2f2024a1fd20 req-a70e00d2-78a7-4e75-a4de-c7c8f2e71625 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received unexpected event network-vif-plugged-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a for instance with vm_state building and task_state spawning.#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.858 2 DEBUG nova.compute.manager [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.862 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.863 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408565.862056, eca0cfe5-4225-401d-b0b5-04244de9913a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.863 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.867 2 INFO nova.virt.libvirt.driver [-] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Instance spawned successfully.#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.867 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.890 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.896 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.899 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.899 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.900 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.900 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.900 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.901 2 DEBUG nova.virt.libvirt.driver [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.930 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.995 2 INFO nova.compute.manager [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Took 19.95 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:36:05 np0005466012 nova_compute[192063]: 2025-10-02 12:36:05.996 2 DEBUG nova.compute.manager [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:06 np0005466012 nova_compute[192063]: 2025-10-02 12:36:06.084 2 INFO nova.compute.manager [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Took 20.98 seconds to build instance.#033[00m
Oct  2 08:36:06 np0005466012 nova_compute[192063]: 2025-10-02 12:36:06.112 2 DEBUG oslo_concurrency.lockutils [None req-48bf901a-9b16-4d7b-b7d2-44281f6f36fc 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:06 np0005466012 nova_compute[192063]: 2025-10-02 12:36:06.559 2 DEBUG nova.compute.manager [req-6951e708-309d-4173-bbf7-3f9b7e65473b req-2495d8cf-f54b-4703-949e-e48f00ae4627 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-vif-plugged-10442e43-8bbf-4027-a5d1-25e81de68240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:06 np0005466012 nova_compute[192063]: 2025-10-02 12:36:06.560 2 DEBUG oslo_concurrency.lockutils [req-6951e708-309d-4173-bbf7-3f9b7e65473b req-2495d8cf-f54b-4703-949e-e48f00ae4627 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:06 np0005466012 nova_compute[192063]: 2025-10-02 12:36:06.560 2 DEBUG oslo_concurrency.lockutils [req-6951e708-309d-4173-bbf7-3f9b7e65473b req-2495d8cf-f54b-4703-949e-e48f00ae4627 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:06 np0005466012 nova_compute[192063]: 2025-10-02 12:36:06.560 2 DEBUG oslo_concurrency.lockutils [req-6951e708-309d-4173-bbf7-3f9b7e65473b req-2495d8cf-f54b-4703-949e-e48f00ae4627 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:06 np0005466012 nova_compute[192063]: 2025-10-02 12:36:06.561 2 DEBUG nova.compute.manager [req-6951e708-309d-4173-bbf7-3f9b7e65473b req-2495d8cf-f54b-4703-949e-e48f00ae4627 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] No waiting events found dispatching network-vif-plugged-10442e43-8bbf-4027-a5d1-25e81de68240 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:06 np0005466012 nova_compute[192063]: 2025-10-02 12:36:06.561 2 WARNING nova.compute.manager [req-6951e708-309d-4173-bbf7-3f9b7e65473b req-2495d8cf-f54b-4703-949e-e48f00ae4627 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received unexpected event network-vif-plugged-10442e43-8bbf-4027-a5d1-25e81de68240 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:36:07 np0005466012 nova_compute[192063]: 2025-10-02 12:36:07.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:07 np0005466012 nova_compute[192063]: 2025-10-02 12:36:07.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:08 np0005466012 nova_compute[192063]: 2025-10-02 12:36:08.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:11 np0005466012 nova_compute[192063]: 2025-10-02 12:36:11.465 2 DEBUG nova.compute.manager [req-4ca2848f-b535-4ea0-80ae-be0304b15cad req-f7c57d61-e07c-4c6d-925c-0f6b800c556d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-changed-10442e43-8bbf-4027-a5d1-25e81de68240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:11 np0005466012 nova_compute[192063]: 2025-10-02 12:36:11.466 2 DEBUG nova.compute.manager [req-4ca2848f-b535-4ea0-80ae-be0304b15cad req-f7c57d61-e07c-4c6d-925c-0f6b800c556d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Refreshing instance network info cache due to event network-changed-10442e43-8bbf-4027-a5d1-25e81de68240. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:36:11 np0005466012 nova_compute[192063]: 2025-10-02 12:36:11.467 2 DEBUG oslo_concurrency.lockutils [req-4ca2848f-b535-4ea0-80ae-be0304b15cad req-f7c57d61-e07c-4c6d-925c-0f6b800c556d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:36:11 np0005466012 nova_compute[192063]: 2025-10-02 12:36:11.467 2 DEBUG oslo_concurrency.lockutils [req-4ca2848f-b535-4ea0-80ae-be0304b15cad req-f7c57d61-e07c-4c6d-925c-0f6b800c556d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:36:11 np0005466012 nova_compute[192063]: 2025-10-02 12:36:11.467 2 DEBUG nova.network.neutron [req-4ca2848f-b535-4ea0-80ae-be0304b15cad req-f7c57d61-e07c-4c6d-925c-0f6b800c556d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Refreshing network info cache for port 10442e43-8bbf-4027-a5d1-25e81de68240 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:36:12 np0005466012 nova_compute[192063]: 2025-10-02 12:36:12.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:12 np0005466012 nova_compute[192063]: 2025-10-02 12:36:12.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:14 np0005466012 nova_compute[192063]: 2025-10-02 12:36:14.178 2 DEBUG nova.network.neutron [req-4ca2848f-b535-4ea0-80ae-be0304b15cad req-f7c57d61-e07c-4c6d-925c-0f6b800c556d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Updated VIF entry in instance network info cache for port 10442e43-8bbf-4027-a5d1-25e81de68240. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:36:14 np0005466012 nova_compute[192063]: 2025-10-02 12:36:14.180 2 DEBUG nova.network.neutron [req-4ca2848f-b535-4ea0-80ae-be0304b15cad req-f7c57d61-e07c-4c6d-925c-0f6b800c556d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Updating instance_info_cache with network_info: [{"id": "10442e43-8bbf-4027-a5d1-25e81de68240", "address": "fa:16:3e:6d:1c:88", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10442e43-8b", "ovs_interfaceid": "10442e43-8bbf-4027-a5d1-25e81de68240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "address": "fa:16:3e:48:2d:ee", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe48:2dee", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ccc67ed-7e", "ovs_interfaceid": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:14 np0005466012 nova_compute[192063]: 2025-10-02 12:36:14.213 2 DEBUG oslo_concurrency.lockutils [req-4ca2848f-b535-4ea0-80ae-be0304b15cad req-f7c57d61-e07c-4c6d-925c-0f6b800c556d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:14 np0005466012 nova_compute[192063]: 2025-10-02 12:36:14.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:16 np0005466012 nova_compute[192063]: 2025-10-02 12:36:16.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:16.329 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:16.330 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:36:17 np0005466012 podman[247082]: 2025-10-02 12:36:17.141020226 +0000 UTC m=+0.047782521 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:36:17 np0005466012 podman[247083]: 2025-10-02 12:36:17.200828499 +0000 UTC m=+0.103584683 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:36:17 np0005466012 nova_compute[192063]: 2025-10-02 12:36:17.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:17 np0005466012 nova_compute[192063]: 2025-10-02 12:36:17.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:19 np0005466012 podman[247143]: 2025-10-02 12:36:19.151256933 +0000 UTC m=+0.058173840 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:36:19 np0005466012 podman[247144]: 2025-10-02 12:36:19.166724183 +0000 UTC m=+0.062784447 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:36:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:19Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6d:1c:88 10.100.0.13
Oct  2 08:36:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:19Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6d:1c:88 10.100.0.13
Oct  2 08:36:22 np0005466012 nova_compute[192063]: 2025-10-02 12:36:22.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:22 np0005466012 nova_compute[192063]: 2025-10-02 12:36:22.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:25.333 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:27 np0005466012 nova_compute[192063]: 2025-10-02 12:36:27.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:27 np0005466012 nova_compute[192063]: 2025-10-02 12:36:27.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:29Z|00656|binding|INFO|Releasing lport 9e6245a0-6013-48e5-9e96-a79fafe59b6a from this chassis (sb_readonly=0)
Oct  2 08:36:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:29Z|00657|binding|INFO|Releasing lport 0965d19d-88e4-4971-9eb7-5bedfed08cdc from this chassis (sb_readonly=0)
Oct  2 08:36:29 np0005466012 nova_compute[192063]: 2025-10-02 12:36:29.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:29 np0005466012 nova_compute[192063]: 2025-10-02 12:36:29.920 2 DEBUG oslo_concurrency.lockutils [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "eca0cfe5-4225-401d-b0b5-04244de9913a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:29 np0005466012 nova_compute[192063]: 2025-10-02 12:36:29.921 2 DEBUG oslo_concurrency.lockutils [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:29 np0005466012 nova_compute[192063]: 2025-10-02 12:36:29.922 2 DEBUG oslo_concurrency.lockutils [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:29 np0005466012 nova_compute[192063]: 2025-10-02 12:36:29.923 2 DEBUG oslo_concurrency.lockutils [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:29 np0005466012 nova_compute[192063]: 2025-10-02 12:36:29.923 2 DEBUG oslo_concurrency.lockutils [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:29 np0005466012 nova_compute[192063]: 2025-10-02 12:36:29.942 2 INFO nova.compute.manager [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Terminating instance#033[00m
Oct  2 08:36:29 np0005466012 nova_compute[192063]: 2025-10-02 12:36:29.959 2 DEBUG nova.compute.manager [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:36:29 np0005466012 kernel: tap10442e43-8b (unregistering): left promiscuous mode
Oct  2 08:36:29 np0005466012 NetworkManager[51207]: <info>  [1759408589.9885] device (tap10442e43-8b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:36:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:29Z|00658|binding|INFO|Releasing lport 10442e43-8bbf-4027-a5d1-25e81de68240 from this chassis (sb_readonly=0)
Oct  2 08:36:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:30Z|00659|binding|INFO|Setting lport 10442e43-8bbf-4027-a5d1-25e81de68240 down in Southbound
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:29.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:30Z|00660|binding|INFO|Removing iface tap10442e43-8b ovn-installed in OVS
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.010 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:1c:88 10.100.0.13'], port_security=['fa:16:3e:6d:1c:88 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'eca0cfe5-4225-401d-b0b5-04244de9913a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '31537a48-e4ed-4e85-9383-0c91e41b0f96', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4660930-bac7-4d92-b95e-2296da9c1763, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=10442e43-8bbf-4027-a5d1-25e81de68240) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.011 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 10442e43-8bbf-4027-a5d1-25e81de68240 in datapath 385e0a9e-c250-418d-8cab-e7e3ae4506c1 unbound from our chassis#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.013 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 385e0a9e-c250-418d-8cab-e7e3ae4506c1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.014 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[67b973c1-8d3f-4866-84dd-815764a4fd7b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.015 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1 namespace which is not needed anymore#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 kernel: tap5ccc67ed-7e (unregistering): left promiscuous mode
Oct  2 08:36:30 np0005466012 NetworkManager[51207]: <info>  [1759408590.0279] device (tap5ccc67ed-7e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:36:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:30Z|00661|binding|INFO|Releasing lport 5ccc67ed-7e13-40f0-81ad-953d3b80cb0a from this chassis (sb_readonly=0)
Oct  2 08:36:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:30Z|00662|binding|INFO|Setting lport 5ccc67ed-7e13-40f0-81ad-953d3b80cb0a down in Southbound
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:36:30Z|00663|binding|INFO|Removing iface tap5ccc67ed-7e ovn-installed in OVS
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.043 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:2d:ee 2001:db8::f816:3eff:fe48:2dee'], port_security=['fa:16:3e:48:2d:ee 2001:db8::f816:3eff:fe48:2dee'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe48:2dee/64', 'neutron:device_id': 'eca0cfe5-4225-401d-b0b5-04244de9913a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4522b631-3a21-451f-8605-7c2b34273ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '31537a48-e4ed-4e85-9383-0c91e41b0f96', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09de0cb2-1ed0-42b1-8efb-533f84345cc8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=5ccc67ed-7e13-40f0-81ad-953d3b80cb0a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000009e.scope: Deactivated successfully.
Oct  2 08:36:30 np0005466012 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000009e.scope: Consumed 13.790s CPU time.
Oct  2 08:36:30 np0005466012 systemd-machined[152114]: Machine qemu-74-instance-0000009e terminated.
Oct  2 08:36:30 np0005466012 podman[247180]: 2025-10-02 12:36:30.099513701 +0000 UTC m=+0.079753590 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:36:30 np0005466012 podman[247183]: 2025-10-02 12:36:30.13111027 +0000 UTC m=+0.102039810 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 08:36:30 np0005466012 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[246995]: [NOTICE]   (246999) : haproxy version is 2.8.14-c23fe91
Oct  2 08:36:30 np0005466012 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[246995]: [NOTICE]   (246999) : path to executable is /usr/sbin/haproxy
Oct  2 08:36:30 np0005466012 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[246995]: [WARNING]  (246999) : Exiting Master process...
Oct  2 08:36:30 np0005466012 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[246995]: [ALERT]    (246999) : Current worker (247001) exited with code 143 (Terminated)
Oct  2 08:36:30 np0005466012 neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1[246995]: [WARNING]  (246999) : All workers exited. Exiting... (0)
Oct  2 08:36:30 np0005466012 systemd[1]: libpod-2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e.scope: Deactivated successfully.
Oct  2 08:36:30 np0005466012 podman[247247]: 2025-10-02 12:36:30.173986042 +0000 UTC m=+0.049148067 container died 2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:36:30 np0005466012 kernel: tap10442e43-8b: entered promiscuous mode
Oct  2 08:36:30 np0005466012 kernel: tap10442e43-8b (unregistering): left promiscuous mode
Oct  2 08:36:30 np0005466012 NetworkManager[51207]: <info>  [1759408590.1813] manager: (tap10442e43-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/307)
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 NetworkManager[51207]: <info>  [1759408590.1913] manager: (tap5ccc67ed-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/308)
Oct  2 08:36:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e-userdata-shm.mount: Deactivated successfully.
Oct  2 08:36:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay-732155d96bf2085c1dfe00cfb69bf5833f4c8d86fa1877fdaf6afe62decdcdde-merged.mount: Deactivated successfully.
Oct  2 08:36:30 np0005466012 podman[247247]: 2025-10-02 12:36:30.219276473 +0000 UTC m=+0.094438488 container cleanup 2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:36:30 np0005466012 systemd[1]: libpod-conmon-2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e.scope: Deactivated successfully.
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.236 2 INFO nova.virt.libvirt.driver [-] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Instance destroyed successfully.#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.236 2 DEBUG nova.objects.instance [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'resources' on Instance uuid eca0cfe5-4225-401d-b0b5-04244de9913a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.250 2 DEBUG nova.virt.libvirt.vif [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:35:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-452628804',display_name='tempest-TestGettingAddress-server-452628804',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-452628804',id=158,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDH5dyE2Fb62QTiQ2NOxenp6JYghL1oq9DicssFh4gYhJeK4nW86De0EFSTiwMTso1Ry0yTYcOXXGoL5CP8YNZFtCow/YELoJPmiA0NwtFRUtcQ5PfX4mPVbmaNTOUOY/A==',key_name='tempest-TestGettingAddress-170538368',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-sq6akkeh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:36:06Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=eca0cfe5-4225-401d-b0b5-04244de9913a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "10442e43-8bbf-4027-a5d1-25e81de68240", "address": "fa:16:3e:6d:1c:88", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10442e43-8b", "ovs_interfaceid": "10442e43-8bbf-4027-a5d1-25e81de68240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.251 2 DEBUG nova.network.os_vif_util [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "10442e43-8bbf-4027-a5d1-25e81de68240", "address": "fa:16:3e:6d:1c:88", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10442e43-8b", "ovs_interfaceid": "10442e43-8bbf-4027-a5d1-25e81de68240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.251 2 DEBUG nova.network.os_vif_util [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6d:1c:88,bridge_name='br-int',has_traffic_filtering=True,id=10442e43-8bbf-4027-a5d1-25e81de68240,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10442e43-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.251 2 DEBUG os_vif [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:1c:88,bridge_name='br-int',has_traffic_filtering=True,id=10442e43-8bbf-4027-a5d1-25e81de68240,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10442e43-8b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.254 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap10442e43-8b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.266 2 INFO os_vif [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:1c:88,bridge_name='br-int',has_traffic_filtering=True,id=10442e43-8bbf-4027-a5d1-25e81de68240,network=Network(385e0a9e-c250-418d-8cab-e7e3ae4506c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10442e43-8b')#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.267 2 DEBUG nova.virt.libvirt.vif [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:35:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-452628804',display_name='tempest-TestGettingAddress-server-452628804',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-452628804',id=158,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDH5dyE2Fb62QTiQ2NOxenp6JYghL1oq9DicssFh4gYhJeK4nW86De0EFSTiwMTso1Ry0yTYcOXXGoL5CP8YNZFtCow/YELoJPmiA0NwtFRUtcQ5PfX4mPVbmaNTOUOY/A==',key_name='tempest-TestGettingAddress-170538368',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:36:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-sq6akkeh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:36:06Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=eca0cfe5-4225-401d-b0b5-04244de9913a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "address": "fa:16:3e:48:2d:ee", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe48:2dee", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ccc67ed-7e", "ovs_interfaceid": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.267 2 DEBUG nova.network.os_vif_util [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "address": "fa:16:3e:48:2d:ee", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe48:2dee", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ccc67ed-7e", "ovs_interfaceid": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.268 2 DEBUG nova.network.os_vif_util [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:2d:ee,bridge_name='br-int',has_traffic_filtering=True,id=5ccc67ed-7e13-40f0-81ad-953d3b80cb0a,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ccc67ed-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.268 2 DEBUG os_vif [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:2d:ee,bridge_name='br-int',has_traffic_filtering=True,id=5ccc67ed-7e13-40f0-81ad-953d3b80cb0a,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ccc67ed-7e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.269 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5ccc67ed-7e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.274 2 INFO os_vif [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:2d:ee,bridge_name='br-int',has_traffic_filtering=True,id=5ccc67ed-7e13-40f0-81ad-953d3b80cb0a,network=Network(4522b631-3a21-451f-8605-7c2b34273ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ccc67ed-7e')#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.274 2 INFO nova.virt.libvirt.driver [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Deleting instance files /var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a_del#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.275 2 INFO nova.virt.libvirt.driver [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Deletion of /var/lib/nova/instances/eca0cfe5-4225-401d-b0b5-04244de9913a_del complete#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.283 2 DEBUG nova.compute.manager [req-87ede992-a026-41f3-b23e-83909fdb4c92 req-d28a836a-344f-44f2-a8a3-e177ba39915f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-vif-unplugged-10442e43-8bbf-4027-a5d1-25e81de68240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.284 2 DEBUG oslo_concurrency.lockutils [req-87ede992-a026-41f3-b23e-83909fdb4c92 req-d28a836a-344f-44f2-a8a3-e177ba39915f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.284 2 DEBUG oslo_concurrency.lockutils [req-87ede992-a026-41f3-b23e-83909fdb4c92 req-d28a836a-344f-44f2-a8a3-e177ba39915f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.284 2 DEBUG oslo_concurrency.lockutils [req-87ede992-a026-41f3-b23e-83909fdb4c92 req-d28a836a-344f-44f2-a8a3-e177ba39915f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.284 2 DEBUG nova.compute.manager [req-87ede992-a026-41f3-b23e-83909fdb4c92 req-d28a836a-344f-44f2-a8a3-e177ba39915f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] No waiting events found dispatching network-vif-unplugged-10442e43-8bbf-4027-a5d1-25e81de68240 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.284 2 DEBUG nova.compute.manager [req-87ede992-a026-41f3-b23e-83909fdb4c92 req-d28a836a-344f-44f2-a8a3-e177ba39915f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-vif-unplugged-10442e43-8bbf-4027-a5d1-25e81de68240 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:36:30 np0005466012 podman[247294]: 2025-10-02 12:36:30.293077256 +0000 UTC m=+0.050141266 container remove 2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.298 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b488b8a8-2085-45f7-9256-dda2592dfc67]: (4, ('Thu Oct  2 12:36:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1 (2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e)\n2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e\nThu Oct  2 12:36:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1 (2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e)\n2db4d236956e629ce9c2c3fcc5d24f0325fe1541d8387ab2f6f5584c321e067e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.300 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[56816817-78c8-4c92-907e-a151ea1202db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.301 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap385e0a9e-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 kernel: tap385e0a9e-c0: left promiscuous mode
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.330 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[70f9d777-ebb0-4e24-bb3c-6cce7e8dd7bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.369 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bde84bae-fdf2-445e-aba8-6aabc585991b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.370 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[db7a95d0-723b-441e-b1e7-1c2b330e80b3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.377 2 INFO nova.compute.manager [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.378 2 DEBUG oslo.service.loopingcall [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.378 2 DEBUG nova.compute.manager [-] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.378 2 DEBUG nova.network.neutron [-] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.395 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c952b66f-799d-4490-95b8-6062a0e78c99]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655716, 'reachable_time': 37458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247311, 'error': None, 'target': 'ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 systemd[1]: run-netns-ovnmeta\x2d385e0a9e\x2dc250\x2d418d\x2d8cab\x2de7e3ae4506c1.mount: Deactivated successfully.
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.399 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-385e0a9e-c250-418d-8cab-e7e3ae4506c1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.399 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[02fc53cf-b833-43fe-8612-e6ac113be48d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.400 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 5ccc67ed-7e13-40f0-81ad-953d3b80cb0a in datapath 4522b631-3a21-451f-8605-7c2b34273ecd unbound from our chassis#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.402 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4522b631-3a21-451f-8605-7c2b34273ecd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.403 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[877c5d00-5c98-4e1d-b517-7d45ea31c315]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.403 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd namespace which is not needed anymore#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.463 2 DEBUG nova.compute.manager [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-changed-10442e43-8bbf-4027-a5d1-25e81de68240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.464 2 DEBUG nova.compute.manager [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Refreshing instance network info cache due to event network-changed-10442e43-8bbf-4027-a5d1-25e81de68240. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.464 2 DEBUG oslo_concurrency.lockutils [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.464 2 DEBUG oslo_concurrency.lockutils [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.464 2 DEBUG nova.network.neutron [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Refreshing network info cache for port 10442e43-8bbf-4027-a5d1-25e81de68240 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:36:30 np0005466012 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[247067]: [NOTICE]   (247071) : haproxy version is 2.8.14-c23fe91
Oct  2 08:36:30 np0005466012 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[247067]: [NOTICE]   (247071) : path to executable is /usr/sbin/haproxy
Oct  2 08:36:30 np0005466012 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[247067]: [WARNING]  (247071) : Exiting Master process...
Oct  2 08:36:30 np0005466012 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[247067]: [ALERT]    (247071) : Current worker (247073) exited with code 143 (Terminated)
Oct  2 08:36:30 np0005466012 neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd[247067]: [WARNING]  (247071) : All workers exited. Exiting... (0)
Oct  2 08:36:30 np0005466012 systemd[1]: libpod-f13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c.scope: Deactivated successfully.
Oct  2 08:36:30 np0005466012 podman[247329]: 2025-10-02 12:36:30.589124833 +0000 UTC m=+0.065606737 container died f13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:36:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c-userdata-shm.mount: Deactivated successfully.
Oct  2 08:36:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay-5e7573689bf8de3b6c10ff7f5f2c01a584f03e9d98d9febba79868321d9a9f3d-merged.mount: Deactivated successfully.
Oct  2 08:36:30 np0005466012 podman[247329]: 2025-10-02 12:36:30.621888764 +0000 UTC m=+0.098370658 container cleanup f13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:36:30 np0005466012 systemd[1]: libpod-conmon-f13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c.scope: Deactivated successfully.
Oct  2 08:36:30 np0005466012 podman[247357]: 2025-10-02 12:36:30.710994733 +0000 UTC m=+0.057908902 container remove f13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.716 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ccde9c56-e2db-4c67-a80e-679552694ee2]: (4, ('Thu Oct  2 12:36:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd (f13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c)\nf13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c\nThu Oct  2 12:36:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd (f13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c)\nf13c04017814bc59710152f6298b230ccdaf56574b67567de39dea4a4c599b2c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.718 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2901e3ae-36f1-4343-8ef3-9e6e317eeb73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.719 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4522b631-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 kernel: tap4522b631-30: left promiscuous mode
Oct  2 08:36:30 np0005466012 nova_compute[192063]: 2025-10-02 12:36:30.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.749 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[15d2190a-00eb-4d40-ac62-6b7f2193bf6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.777 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[32a0cc47-fa31-4838-9c81-bb0058932733]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.778 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d222488d-b2da-4d93-be20-33d2b3a00ea1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.797 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[88dde488-90f4-4b2a-a91b-b2700c315893]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655800, 'reachable_time': 39949, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247374, 'error': None, 'target': 'ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.799 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4522b631-3a21-451f-8605-7c2b34273ecd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:36:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:36:30.799 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[3bdab210-a4e7-45eb-a78d-7948b51bd293]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:36:31 np0005466012 systemd[1]: run-netns-ovnmeta\x2d4522b631\x2d3a21\x2d451f\x2d8605\x2d7c2b34273ecd.mount: Deactivated successfully.
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.080 2 DEBUG nova.network.neutron [-] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.106 2 INFO nova.compute.manager [-] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Took 1.73 seconds to deallocate network for instance.#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.209 2 DEBUG oslo_concurrency.lockutils [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.209 2 DEBUG oslo_concurrency.lockutils [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.242 2 DEBUG nova.network.neutron [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Updated VIF entry in instance network info cache for port 10442e43-8bbf-4027-a5d1-25e81de68240. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.243 2 DEBUG nova.network.neutron [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Updating instance_info_cache with network_info: [{"id": "10442e43-8bbf-4027-a5d1-25e81de68240", "address": "fa:16:3e:6d:1c:88", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10442e43-8b", "ovs_interfaceid": "10442e43-8bbf-4027-a5d1-25e81de68240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "address": "fa:16:3e:48:2d:ee", "network": {"id": "4522b631-3a21-451f-8605-7c2b34273ecd", "bridge": "br-int", "label": "tempest-network-smoke--123996893", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe48:2dee", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ccc67ed-7e", "ovs_interfaceid": "5ccc67ed-7e13-40f0-81ad-953d3b80cb0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.287 2 DEBUG nova.compute.provider_tree [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.344 2 DEBUG oslo_concurrency.lockutils [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-eca0cfe5-4225-401d-b0b5-04244de9913a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.345 2 DEBUG nova.compute.manager [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-vif-unplugged-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.346 2 DEBUG oslo_concurrency.lockutils [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.347 2 DEBUG oslo_concurrency.lockutils [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.347 2 DEBUG oslo_concurrency.lockutils [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.348 2 DEBUG nova.compute.manager [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] No waiting events found dispatching network-vif-unplugged-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.349 2 DEBUG nova.compute.manager [req-7c6e357a-a2ce-4097-a49e-f7af83626ea6 req-461745fb-f56f-42a5-99cf-9f90c01c5e83 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-vif-unplugged-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.351 2 DEBUG nova.scheduler.client.report [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.399 2 DEBUG oslo_concurrency.lockutils [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.454 2 INFO nova.scheduler.client.report [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Deleted allocations for instance eca0cfe5-4225-401d-b0b5-04244de9913a#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.506 2 DEBUG nova.compute.manager [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-vif-plugged-10442e43-8bbf-4027-a5d1-25e81de68240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.506 2 DEBUG oslo_concurrency.lockutils [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.507 2 DEBUG oslo_concurrency.lockutils [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.508 2 DEBUG oslo_concurrency.lockutils [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.508 2 DEBUG nova.compute.manager [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] No waiting events found dispatching network-vif-plugged-10442e43-8bbf-4027-a5d1-25e81de68240 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.509 2 WARNING nova.compute.manager [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received unexpected event network-vif-plugged-10442e43-8bbf-4027-a5d1-25e81de68240 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.509 2 DEBUG nova.compute.manager [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-vif-deleted-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.510 2 INFO nova.compute.manager [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Neutron deleted interface 5ccc67ed-7e13-40f0-81ad-953d3b80cb0a; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.510 2 DEBUG nova.network.neutron [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Updating instance_info_cache with network_info: [{"id": "10442e43-8bbf-4027-a5d1-25e81de68240", "address": "fa:16:3e:6d:1c:88", "network": {"id": "385e0a9e-c250-418d-8cab-e7e3ae4506c1", "bridge": "br-int", "label": "tempest-network-smoke--772702849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10442e43-8b", "ovs_interfaceid": "10442e43-8bbf-4027-a5d1-25e81de68240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.551 2 DEBUG nova.compute.manager [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Detach interface failed, port_id=5ccc67ed-7e13-40f0-81ad-953d3b80cb0a, reason: Instance eca0cfe5-4225-401d-b0b5-04244de9913a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.552 2 DEBUG nova.compute.manager [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-vif-deleted-10442e43-8bbf-4027-a5d1-25e81de68240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.552 2 INFO nova.compute.manager [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Neutron deleted interface 10442e43-8bbf-4027-a5d1-25e81de68240; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.552 2 DEBUG nova.network.neutron [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.559 2 DEBUG oslo_concurrency.lockutils [None req-89617662-33ed-4151-a58a-4ac90f4a9ba5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.583 2 DEBUG nova.compute.manager [req-62770c6e-6c73-484e-89fa-96c20d9782dd req-127a128d-0afc-4834-b899-45c9d19a8dd4 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Detach interface failed, port_id=10442e43-8bbf-4027-a5d1-25e81de68240, reason: Instance eca0cfe5-4225-401d-b0b5-04244de9913a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.627 2 DEBUG nova.compute.manager [req-be1ab6fd-1075-4470-afc5-14e98c744e46 req-9c75f24b-f1af-445e-879b-ff2fe4260973 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received event network-vif-plugged-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.628 2 DEBUG oslo_concurrency.lockutils [req-be1ab6fd-1075-4470-afc5-14e98c744e46 req-9c75f24b-f1af-445e-879b-ff2fe4260973 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.628 2 DEBUG oslo_concurrency.lockutils [req-be1ab6fd-1075-4470-afc5-14e98c744e46 req-9c75f24b-f1af-445e-879b-ff2fe4260973 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.629 2 DEBUG oslo_concurrency.lockutils [req-be1ab6fd-1075-4470-afc5-14e98c744e46 req-9c75f24b-f1af-445e-879b-ff2fe4260973 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "eca0cfe5-4225-401d-b0b5-04244de9913a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.629 2 DEBUG nova.compute.manager [req-be1ab6fd-1075-4470-afc5-14e98c744e46 req-9c75f24b-f1af-445e-879b-ff2fe4260973 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] No waiting events found dispatching network-vif-plugged-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.629 2 WARNING nova.compute.manager [req-be1ab6fd-1075-4470-afc5-14e98c744e46 req-9c75f24b-f1af-445e-879b-ff2fe4260973 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Received unexpected event network-vif-plugged-5ccc67ed-7e13-40f0-81ad-953d3b80cb0a for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:36:32 np0005466012 nova_compute[192063]: 2025-10-02 12:36:32.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:34 np0005466012 podman[247375]: 2025-10-02 12:36:34.164299078 +0000 UTC m=+0.073277560 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct  2 08:36:34 np0005466012 podman[247376]: 2025-10-02 12:36:34.173518134 +0000 UTC m=+0.075873492 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:36:34 np0005466012 nova_compute[192063]: 2025-10-02 12:36:34.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:34 np0005466012 nova_compute[192063]: 2025-10-02 12:36:34.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:35 np0005466012 nova_compute[192063]: 2025-10-02 12:36:35.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:35 np0005466012 nova_compute[192063]: 2025-10-02 12:36:35.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:35 np0005466012 nova_compute[192063]: 2025-10-02 12:36:35.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:37 np0005466012 nova_compute[192063]: 2025-10-02 12:36:37.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:37 np0005466012 nova_compute[192063]: 2025-10-02 12:36:37.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:40 np0005466012 nova_compute[192063]: 2025-10-02 12:36:40.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:41 np0005466012 nova_compute[192063]: 2025-10-02 12:36:41.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:42 np0005466012 nova_compute[192063]: 2025-10-02 12:36:42.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:42 np0005466012 nova_compute[192063]: 2025-10-02 12:36:42.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:45 np0005466012 nova_compute[192063]: 2025-10-02 12:36:45.234 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408590.2338476, eca0cfe5-4225-401d-b0b5-04244de9913a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:36:45 np0005466012 nova_compute[192063]: 2025-10-02 12:36:45.235 2 INFO nova.compute.manager [-] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:36:45 np0005466012 nova_compute[192063]: 2025-10-02 12:36:45.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:36:45 np0005466012 nova_compute[192063]: 2025-10-02 12:36:45.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:36:45 np0005466012 nova_compute[192063]: 2025-10-02 12:36:45.973 2 DEBUG nova.compute.manager [None req-707ef2f9-48b1-43e7-ae92-a3d0e81eccf7 - - - - - -] [instance: eca0cfe5-4225-401d-b0b5-04244de9913a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:36:45 np0005466012 nova_compute[192063]: 2025-10-02 12:36:45.979 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:36:45 np0005466012 nova_compute[192063]: 2025-10-02 12:36:45.980 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:36:45 np0005466012 nova_compute[192063]: 2025-10-02 12:36:45.980 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:36:45 np0005466012 nova_compute[192063]: 2025-10-02 12:36:45.980 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.137 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.138 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5717MB free_disk=73.24298858642578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.138 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.138 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.211 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.211 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.226 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.251 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.252 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.278 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.325 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.351 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.404 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.490 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:36:46 np0005466012 nova_compute[192063]: 2025-10-02 12:36:46.490 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:36:47 np0005466012 nova_compute[192063]: 2025-10-02 12:36:47.486 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:36:47 np0005466012 nova_compute[192063]: 2025-10-02 12:36:47.532 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:36:47 np0005466012 nova_compute[192063]: 2025-10-02 12:36:47.533 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:36:47 np0005466012 nova_compute[192063]: 2025-10-02 12:36:47.533 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 08:36:47 np0005466012 nova_compute[192063]: 2025-10-02 12:36:47.556 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  2 08:36:47 np0005466012 nova_compute[192063]: 2025-10-02 12:36:47.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:36:47 np0005466012 nova_compute[192063]: 2025-10-02 12:36:47.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:36:47 np0005466012 nova_compute[192063]: 2025-10-02 12:36:47.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 08:36:47 np0005466012 nova_compute[192063]: 2025-10-02 12:36:47.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:36:48 np0005466012 podman[247423]: 2025-10-02 12:36:48.164830342 +0000 UTC m=+0.085709936 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:36:48 np0005466012 podman[247424]: 2025-10-02 12:36:48.197609293 +0000 UTC m=+0.111373260 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:36:50 np0005466012 podman[247473]: 2025-10-02 12:36:50.132819503 +0000 UTC m=+0.051769100 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  2 08:36:50 np0005466012 podman[247474]: 2025-10-02 12:36:50.145899597 +0000 UTC m=+0.058996551 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:36:50 np0005466012 nova_compute[192063]: 2025-10-02 12:36:50.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:36:52 np0005466012 nova_compute[192063]: 2025-10-02 12:36:52.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:36:55 np0005466012 nova_compute[192063]: 2025-10-02 12:36:55.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:36:57 np0005466012 nova_compute[192063]: 2025-10-02 12:36:57.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:00 np0005466012 nova_compute[192063]: 2025-10-02 12:37:00.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:01 np0005466012 podman[247510]: 2025-10-02 12:37:01.145710257 +0000 UTC m=+0.061651566 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:37:01 np0005466012 podman[247511]: 2025-10-02 12:37:01.147355663 +0000 UTC m=+0.057903082 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, name=ubi9-minimal)
Oct  2 08:37:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:02.152 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:37:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:02.153 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:37:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:02.153 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:37:02 np0005466012 nova_compute[192063]: 2025-10-02 12:37:02.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:05 np0005466012 podman[247551]: 2025-10-02 12:37:05.149650523 +0000 UTC m=+0.061853023 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:37:05 np0005466012 podman[247552]: 2025-10-02 12:37:05.200472516 +0000 UTC m=+0.096455214 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:37:05 np0005466012 nova_compute[192063]: 2025-10-02 12:37:05.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:07 np0005466012 nova_compute[192063]: 2025-10-02 12:37:07.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:10 np0005466012 nova_compute[192063]: 2025-10-02 12:37:10.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:12 np0005466012 nova_compute[192063]: 2025-10-02 12:37:12.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.570 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Acquiring lock "a489cbb2-1400-41b4-9345-18186b74b900" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.570 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.606 2 DEBUG nova.compute.manager [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.723 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.724 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.731 2 DEBUG nova.virt.hardware [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  2 08:37:13 np0005466012 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.732 2 INFO nova.compute.claims [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Claim successful on node compute-1.ctlplane.example.com
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.878 2 DEBUG nova.compute.provider_tree [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.893 2 DEBUG nova.scheduler.client.report [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.915 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.916 2 DEBUG nova.compute.manager [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.970 2 DEBUG nova.compute.manager [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.971 2 DEBUG nova.network.neutron [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:37:13 np0005466012 nova_compute[192063]: 2025-10-02 12:37:13.992 2 INFO nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.010 2 DEBUG nova.compute.manager [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.137 2 DEBUG nova.compute.manager [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.138 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.139 2 INFO nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Creating image(s)#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.139 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Acquiring lock "/var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.140 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "/var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.140 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "/var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.153 2 DEBUG oslo_concurrency.processutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.208 2 DEBUG nova.policy [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6d07868c23de4edc9018d8964b43d954', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8f7d693b90ba447196796435b74590f6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.212 2 DEBUG oslo_concurrency.processutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.213 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.214 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.224 2 DEBUG oslo_concurrency.processutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.274 2 DEBUG oslo_concurrency.processutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.275 2 DEBUG oslo_concurrency.processutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.304 2 DEBUG oslo_concurrency.processutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.305 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.306 2 DEBUG oslo_concurrency.processutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.379 2 DEBUG oslo_concurrency.processutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.380 2 DEBUG nova.virt.disk.api [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Checking if we can resize image /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.381 2 DEBUG oslo_concurrency.processutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.450 2 DEBUG oslo_concurrency.processutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.451 2 DEBUG nova.virt.disk.api [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Cannot resize image /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.452 2 DEBUG nova.objects.instance [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lazy-loading 'migration_context' on Instance uuid a489cbb2-1400-41b4-9345-18186b74b900 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.467 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.467 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Ensure instance console log exists: /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.468 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.468 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:14 np0005466012 nova_compute[192063]: 2025-10-02 12:37:14.468 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:14.520 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:62:4a 10.100.0.2 2001:db8::f816:3eff:fefb:624a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fefb:624a/64', 'neutron:device_id': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=546080ca-391c-439c-be48-88bb942119c9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e78bd1c4-7546-4ebe-a71b-a49e8c78f36c) old=Port_Binding(mac=['fa:16:3e:fb:62:4a 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:37:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:14.522 103246 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e78bd1c4-7546-4ebe-a71b-a49e8c78f36c in datapath 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 updated#033[00m
Oct  2 08:37:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:14.523 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:37:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:14.524 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4aacf950-e2a8-412e-8328-3649a408873a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:15 np0005466012 nova_compute[192063]: 2025-10-02 12:37:15.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:15 np0005466012 nova_compute[192063]: 2025-10-02 12:37:15.917 2 DEBUG nova.network.neutron [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Successfully created port: d184278c-aebf-44ae-b916-815cb5979416 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:37:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:16.560 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:37:16 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:16.561 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:37:16 np0005466012 nova_compute[192063]: 2025-10-02 12:37:16.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:16 np0005466012 nova_compute[192063]: 2025-10-02 12:37:16.894 2 DEBUG nova.network.neutron [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Successfully updated port: d184278c-aebf-44ae-b916-815cb5979416 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:37:16 np0005466012 nova_compute[192063]: 2025-10-02 12:37:16.913 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Acquiring lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:16 np0005466012 nova_compute[192063]: 2025-10-02 12:37:16.914 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Acquired lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:16 np0005466012 nova_compute[192063]: 2025-10-02 12:37:16.914 2 DEBUG nova.network.neutron [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.926 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.926 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.926 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:37:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:37:17 np0005466012 nova_compute[192063]: 2025-10-02 12:37:17.025 2 DEBUG nova.compute.manager [req-056df2bf-2d6b-41dd-8137-5d696b47cfc0 req-0c3e8c40-9193-4303-a972-da5bc7fdf5ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Received event network-changed-d184278c-aebf-44ae-b916-815cb5979416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:37:17 np0005466012 nova_compute[192063]: 2025-10-02 12:37:17.026 2 DEBUG nova.compute.manager [req-056df2bf-2d6b-41dd-8137-5d696b47cfc0 req-0c3e8c40-9193-4303-a972-da5bc7fdf5ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Refreshing instance network info cache due to event network-changed-d184278c-aebf-44ae-b916-815cb5979416. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:37:17 np0005466012 nova_compute[192063]: 2025-10-02 12:37:17.027 2 DEBUG oslo_concurrency.lockutils [req-056df2bf-2d6b-41dd-8137-5d696b47cfc0 req-0c3e8c40-9193-4303-a972-da5bc7fdf5ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:17 np0005466012 nova_compute[192063]: 2025-10-02 12:37:17.137 2 DEBUG nova.network.neutron [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:37:17 np0005466012 nova_compute[192063]: 2025-10-02 12:37:17.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.457 2 DEBUG nova.network.neutron [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Updating instance_info_cache with network_info: [{"id": "d184278c-aebf-44ae-b916-815cb5979416", "address": "fa:16:3e:15:94:15", "network": {"id": "95da58c1-265e-4dd9-ba00-692853005e46", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-603762842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7d693b90ba447196796435b74590f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd184278c-ae", "ovs_interfaceid": "d184278c-aebf-44ae-b916-815cb5979416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.477 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Releasing lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.477 2 DEBUG nova.compute.manager [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Instance network_info: |[{"id": "d184278c-aebf-44ae-b916-815cb5979416", "address": "fa:16:3e:15:94:15", "network": {"id": "95da58c1-265e-4dd9-ba00-692853005e46", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-603762842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7d693b90ba447196796435b74590f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd184278c-ae", "ovs_interfaceid": "d184278c-aebf-44ae-b916-815cb5979416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.478 2 DEBUG oslo_concurrency.lockutils [req-056df2bf-2d6b-41dd-8137-5d696b47cfc0 req-0c3e8c40-9193-4303-a972-da5bc7fdf5ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.478 2 DEBUG nova.network.neutron [req-056df2bf-2d6b-41dd-8137-5d696b47cfc0 req-0c3e8c40-9193-4303-a972-da5bc7fdf5ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Refreshing network info cache for port d184278c-aebf-44ae-b916-815cb5979416 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.481 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Start _get_guest_xml network_info=[{"id": "d184278c-aebf-44ae-b916-815cb5979416", "address": "fa:16:3e:15:94:15", "network": {"id": "95da58c1-265e-4dd9-ba00-692853005e46", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-603762842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7d693b90ba447196796435b74590f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd184278c-ae", "ovs_interfaceid": "d184278c-aebf-44ae-b916-815cb5979416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.487 2 WARNING nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.491 2 DEBUG nova.virt.libvirt.host [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.492 2 DEBUG nova.virt.libvirt.host [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.495 2 DEBUG nova.virt.libvirt.host [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.496 2 DEBUG nova.virt.libvirt.host [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.497 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.498 2 DEBUG nova.virt.hardware [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.498 2 DEBUG nova.virt.hardware [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.499 2 DEBUG nova.virt.hardware [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.499 2 DEBUG nova.virt.hardware [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.500 2 DEBUG nova.virt.hardware [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.500 2 DEBUG nova.virt.hardware [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.500 2 DEBUG nova.virt.hardware [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.501 2 DEBUG nova.virt.hardware [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.501 2 DEBUG nova.virt.hardware [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.501 2 DEBUG nova.virt.hardware [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.502 2 DEBUG nova.virt.hardware [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.506 2 DEBUG nova.virt.libvirt.vif [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1471171623',display_name='tempest-TestSnapshotPattern-server-1471171623',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1471171623',id=161,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzJGGGUE+Xks9+aY5SjFk2n2DGAnXfOBhkbeNeuAVWQ/dQZsUYNFa4aU04DL6V5Ahv7YBoVwhzJt5xloq0NtgboR41kXTeWdHADR0n2ucoHL3yxU4d4gs2dS5flZPM85w==',key_name='tempest-TestSnapshotPattern-331136498',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f7d693b90ba447196796435b74590f6',ramdisk_id='',reservation_id='r-9t63tjfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1950942920',owner_user_name='tempest-TestSnapshotPattern-1950942920-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:37:14Z,user_data=None,user_id='6d07868c23de4edc9018d8964b43d954',uuid=a489cbb2-1400-41b4-9345-18186b74b900,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d184278c-aebf-44ae-b916-815cb5979416", "address": "fa:16:3e:15:94:15", "network": {"id": "95da58c1-265e-4dd9-ba00-692853005e46", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-603762842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "8f7d693b90ba447196796435b74590f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd184278c-ae", "ovs_interfaceid": "d184278c-aebf-44ae-b916-815cb5979416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.507 2 DEBUG nova.network.os_vif_util [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Converting VIF {"id": "d184278c-aebf-44ae-b916-815cb5979416", "address": "fa:16:3e:15:94:15", "network": {"id": "95da58c1-265e-4dd9-ba00-692853005e46", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-603762842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7d693b90ba447196796435b74590f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd184278c-ae", "ovs_interfaceid": "d184278c-aebf-44ae-b916-815cb5979416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.508 2 DEBUG nova.network.os_vif_util [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:94:15,bridge_name='br-int',has_traffic_filtering=True,id=d184278c-aebf-44ae-b916-815cb5979416,network=Network(95da58c1-265e-4dd9-ba00-692853005e46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd184278c-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.509 2 DEBUG nova.objects.instance [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lazy-loading 'pci_devices' on Instance uuid a489cbb2-1400-41b4-9345-18186b74b900 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.525 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  <uuid>a489cbb2-1400-41b4-9345-18186b74b900</uuid>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  <name>instance-000000a1</name>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestSnapshotPattern-server-1471171623</nova:name>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:37:18</nova:creationTime>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:37:18 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:        <nova:user uuid="6d07868c23de4edc9018d8964b43d954">tempest-TestSnapshotPattern-1950942920-project-member</nova:user>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:        <nova:project uuid="8f7d693b90ba447196796435b74590f6">tempest-TestSnapshotPattern-1950942920</nova:project>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:        <nova:port uuid="d184278c-aebf-44ae-b916-815cb5979416">
Oct  2 08:37:18 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <entry name="serial">a489cbb2-1400-41b4-9345-18186b74b900</entry>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <entry name="uuid">a489cbb2-1400-41b4-9345-18186b74b900</entry>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk.config"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:15:94:15"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <target dev="tapd184278c-ae"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/console.log" append="off"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:37:18 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:37:18 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:37:18 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:37:18 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.526 2 DEBUG nova.compute.manager [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Preparing to wait for external event network-vif-plugged-d184278c-aebf-44ae-b916-815cb5979416 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.527 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Acquiring lock "a489cbb2-1400-41b4-9345-18186b74b900-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.527 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.527 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.528 2 DEBUG nova.virt.libvirt.vif [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1471171623',display_name='tempest-TestSnapshotPattern-server-1471171623',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1471171623',id=161,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzJGGGUE+Xks9+aY5SjFk2n2DGAnXfOBhkbeNeuAVWQ/dQZsUYNFa4aU04DL6V5Ahv7YBoVwhzJt5xloq0NtgboR41kXTeWdHADR0n2ucoHL3yxU4d4gs2dS5flZPM85w==',key_name='tempest-TestSnapshotPattern-331136498',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f7d693b90ba447196796435b74590f6',ramdisk_id='',reservation_id='r-9t63tjfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1950942920',owner_user_name='tempest-TestSnapshotPattern-1950942920-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:37:14Z,user_data=None,user_id='6d07868c23de4edc9018d8964b43d954',uuid=a489cbb2-1400-41b4-9345-18186b74b900,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d184278c-aebf-44ae-b916-815cb5979416", "address": "fa:16:3e:15:94:15", "network": {"id": "95da58c1-265e-4dd9-ba00-692853005e46", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-603762842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "8f7d693b90ba447196796435b74590f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd184278c-ae", "ovs_interfaceid": "d184278c-aebf-44ae-b916-815cb5979416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.528 2 DEBUG nova.network.os_vif_util [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Converting VIF {"id": "d184278c-aebf-44ae-b916-815cb5979416", "address": "fa:16:3e:15:94:15", "network": {"id": "95da58c1-265e-4dd9-ba00-692853005e46", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-603762842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7d693b90ba447196796435b74590f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd184278c-ae", "ovs_interfaceid": "d184278c-aebf-44ae-b916-815cb5979416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.529 2 DEBUG nova.network.os_vif_util [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:94:15,bridge_name='br-int',has_traffic_filtering=True,id=d184278c-aebf-44ae-b916-815cb5979416,network=Network(95da58c1-265e-4dd9-ba00-692853005e46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd184278c-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.529 2 DEBUG os_vif [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:94:15,bridge_name='br-int',has_traffic_filtering=True,id=d184278c-aebf-44ae-b916-815cb5979416,network=Network(95da58c1-265e-4dd9-ba00-692853005e46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd184278c-ae') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.530 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.531 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.534 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd184278c-ae, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.535 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd184278c-ae, col_values=(('external_ids', {'iface-id': 'd184278c-aebf-44ae-b916-815cb5979416', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:94:15', 'vm-uuid': 'a489cbb2-1400-41b4-9345-18186b74b900'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:18 np0005466012 NetworkManager[51207]: <info>  [1759408638.5396] manager: (tapd184278c-ae): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.550 2 INFO os_vif [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:94:15,bridge_name='br-int',has_traffic_filtering=True,id=d184278c-aebf-44ae-b916-815cb5979416,network=Network(95da58c1-265e-4dd9-ba00-692853005e46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd184278c-ae')#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.602 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.603 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.603 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] No VIF found with MAC fa:16:3e:15:94:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:37:18 np0005466012 nova_compute[192063]: 2025-10-02 12:37:18.604 2 INFO nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Using config drive#033[00m
Oct  2 08:37:19 np0005466012 podman[247615]: 2025-10-02 12:37:19.186664001 +0000 UTC m=+0.086766015 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:37:19 np0005466012 podman[247616]: 2025-10-02 12:37:19.188377249 +0000 UTC m=+0.091903748 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:37:19 np0005466012 nova_compute[192063]: 2025-10-02 12:37:19.615 2 INFO nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Creating config drive at /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk.config#033[00m
Oct  2 08:37:19 np0005466012 nova_compute[192063]: 2025-10-02 12:37:19.624 2 DEBUG oslo_concurrency.processutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzaiph2nx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:19 np0005466012 nova_compute[192063]: 2025-10-02 12:37:19.771 2 DEBUG oslo_concurrency.processutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzaiph2nx" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:19 np0005466012 kernel: tapd184278c-ae: entered promiscuous mode
Oct  2 08:37:19 np0005466012 NetworkManager[51207]: <info>  [1759408639.8640] manager: (tapd184278c-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/310)
Oct  2 08:37:19 np0005466012 nova_compute[192063]: 2025-10-02 12:37:19.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:37:19Z|00664|binding|INFO|Claiming lport d184278c-aebf-44ae-b916-815cb5979416 for this chassis.
Oct  2 08:37:19 np0005466012 ovn_controller[94284]: 2025-10-02T12:37:19Z|00665|binding|INFO|d184278c-aebf-44ae-b916-815cb5979416: Claiming fa:16:3e:15:94:15 10.100.0.8
Oct  2 08:37:19 np0005466012 nova_compute[192063]: 2025-10-02 12:37:19.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:19.926 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:94:15 10.100.0.8'], port_security=['fa:16:3e:15:94:15 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a489cbb2-1400-41b4-9345-18186b74b900', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-95da58c1-265e-4dd9-ba00-692853005e46', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7d693b90ba447196796435b74590f6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '35c2ff63-16f8-4b9e-8320-2301129fdf30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2037b9ce-d2e9-4c7b-b130-56e2abc95360, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d184278c-aebf-44ae-b916-815cb5979416) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:37:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:19.928 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d184278c-aebf-44ae-b916-815cb5979416 in datapath 95da58c1-265e-4dd9-ba00-692853005e46 bound to our chassis#033[00m
Oct  2 08:37:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:19.930 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 95da58c1-265e-4dd9-ba00-692853005e46#033[00m
Oct  2 08:37:19 np0005466012 systemd-udevd[247684]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:37:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:19.944 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[26e814fd-44cb-4918-b876-97394f8643c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:19.945 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap95da58c1-21 in ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:37:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:19.947 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap95da58c1-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:37:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:19.947 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bc113c8c-a91e-43f5-9a19-38765f4e461b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:19.948 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[78f42461-8efa-468a-a810-cf22cd94c5ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:19 np0005466012 NetworkManager[51207]: <info>  [1759408639.9587] device (tapd184278c-ae): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:37:19 np0005466012 systemd-machined[152114]: New machine qemu-75-instance-000000a1.
Oct  2 08:37:19 np0005466012 NetworkManager[51207]: <info>  [1759408639.9609] device (tapd184278c-ae): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:37:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:19.963 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[462e5720-7236-406e-ba0e-bdbe80941cc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:19 np0005466012 nova_compute[192063]: 2025-10-02 12:37:19.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:19.996 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[259d995a-9c1d-4da8-a9d0-25383d2daf86]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:20 np0005466012 nova_compute[192063]: 2025-10-02 12:37:19.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:37:20Z|00666|binding|INFO|Setting lport d184278c-aebf-44ae-b916-815cb5979416 ovn-installed in OVS
Oct  2 08:37:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:37:20Z|00667|binding|INFO|Setting lport d184278c-aebf-44ae-b916-815cb5979416 up in Southbound
Oct  2 08:37:20 np0005466012 systemd[1]: Started Virtual Machine qemu-75-instance-000000a1.
Oct  2 08:37:20 np0005466012 nova_compute[192063]: 2025-10-02 12:37:20.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.031 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[7e886659-31e5-41aa-92a1-c9ee21475224]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:20 np0005466012 NetworkManager[51207]: <info>  [1759408640.0383] manager: (tap95da58c1-20): new Veth device (/org/freedesktop/NetworkManager/Devices/311)
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.038 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[641bb8e4-823b-45dc-9198-530c4806b4e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.082 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[5589361f-7b39-458d-aeae-a4bfa488e298]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.084 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[1391d521-2813-449e-bd86-369234844464]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:20 np0005466012 NetworkManager[51207]: <info>  [1759408640.1085] device (tap95da58c1-20): carrier: link connected
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.112 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[7a64c368-0ebb-4bc9-ba67-fb5cd46459c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.126 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8abe2380-0069-469d-8083-9f03a72d834d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap95da58c1-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:75:f5:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 663372, 'reachable_time': 18362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247717, 'error': None, 'target': 'ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.138 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0e595e3a-b018-4b9d-a451-e23746a5502f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe75:f595'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 663372, 'tstamp': 663372}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247718, 'error': None, 'target': 'ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.159 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[989a94dc-6ae1-4d4e-ae56-a81d907a3c49]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap95da58c1-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:75:f5:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 663372, 'reachable_time': 18362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247719, 'error': None, 'target': 'ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.189 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[686c9786-e289-457e-9af3-05ef062b34f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.244 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[139c1a49-2ef0-4589-a5f0-bee4d031eb7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.246 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap95da58c1-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.246 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.246 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap95da58c1-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:20 np0005466012 kernel: tap95da58c1-20: entered promiscuous mode
Oct  2 08:37:20 np0005466012 NetworkManager[51207]: <info>  [1759408640.2485] manager: (tap95da58c1-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Oct  2 08:37:20 np0005466012 nova_compute[192063]: 2025-10-02 12:37:20.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.251 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap95da58c1-20, col_values=(('external_ids', {'iface-id': 'ad4e7082-9510-41a9-bc81-de2c66402e98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:37:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:37:20Z|00668|binding|INFO|Releasing lport ad4e7082-9510-41a9-bc81-de2c66402e98 from this chassis (sb_readonly=0)
Oct  2 08:37:20 np0005466012 nova_compute[192063]: 2025-10-02 12:37:20.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:20 np0005466012 nova_compute[192063]: 2025-10-02 12:37:20.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.264 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/95da58c1-265e-4dd9-ba00-692853005e46.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/95da58c1-265e-4dd9-ba00-692853005e46.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.264 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a555d6c3-e05a-4096-ae69-a4e31a8ac127]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.265 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-95da58c1-265e-4dd9-ba00-692853005e46
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/95da58c1-265e-4dd9-ba00-692853005e46.pid.haproxy
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 95da58c1-265e-4dd9-ba00-692853005e46
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:37:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:20.267 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46', 'env', 'PROCESS_TAG=haproxy-95da58c1-265e-4dd9-ba00-692853005e46', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/95da58c1-265e-4dd9-ba00-692853005e46.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:37:20 np0005466012 nova_compute[192063]: 2025-10-02 12:37:20.376 2 DEBUG nova.compute.manager [req-2228b6c0-015e-47cc-a855-e43c4d87b1b3 req-ea8fa20e-a18e-4de7-935b-88e920991748 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Received event network-vif-plugged-d184278c-aebf-44ae-b916-815cb5979416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:37:20 np0005466012 nova_compute[192063]: 2025-10-02 12:37:20.377 2 DEBUG oslo_concurrency.lockutils [req-2228b6c0-015e-47cc-a855-e43c4d87b1b3 req-ea8fa20e-a18e-4de7-935b-88e920991748 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a489cbb2-1400-41b4-9345-18186b74b900-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:20 np0005466012 nova_compute[192063]: 2025-10-02 12:37:20.377 2 DEBUG oslo_concurrency.lockutils [req-2228b6c0-015e-47cc-a855-e43c4d87b1b3 req-ea8fa20e-a18e-4de7-935b-88e920991748 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:20 np0005466012 nova_compute[192063]: 2025-10-02 12:37:20.377 2 DEBUG oslo_concurrency.lockutils [req-2228b6c0-015e-47cc-a855-e43c4d87b1b3 req-ea8fa20e-a18e-4de7-935b-88e920991748 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:20 np0005466012 nova_compute[192063]: 2025-10-02 12:37:20.378 2 DEBUG nova.compute.manager [req-2228b6c0-015e-47cc-a855-e43c4d87b1b3 req-ea8fa20e-a18e-4de7-935b-88e920991748 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Processing event network-vif-plugged-d184278c-aebf-44ae-b916-815cb5979416 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:37:20 np0005466012 podman[247751]: 2025-10-02 12:37:20.703184554 +0000 UTC m=+0.079017281 container create 8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:37:20 np0005466012 systemd[1]: Started libpod-conmon-8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6.scope.
Oct  2 08:37:20 np0005466012 podman[247751]: 2025-10-02 12:37:20.673191359 +0000 UTC m=+0.049024156 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:37:20 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:37:20 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ed19f5596101e6a98c738ba523b6cc8d422e6f63c34dc0d1278cd76ced56b95/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:37:20 np0005466012 podman[247751]: 2025-10-02 12:37:20.788112916 +0000 UTC m=+0.163945673 container init 8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:37:20 np0005466012 podman[247751]: 2025-10-02 12:37:20.79439006 +0000 UTC m=+0.170222797 container start 8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:37:20 np0005466012 neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46[247778]: [NOTICE]   (247803) : New worker (247808) forked
Oct  2 08:37:20 np0005466012 neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46[247778]: [NOTICE]   (247803) : Loading success.
Oct  2 08:37:20 np0005466012 podman[247766]: 2025-10-02 12:37:20.823874571 +0000 UTC m=+0.075750008 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  2 08:37:20 np0005466012 podman[247767]: 2025-10-02 12:37:20.831826493 +0000 UTC m=+0.082943029 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.110 2 DEBUG nova.network.neutron [req-056df2bf-2d6b-41dd-8137-5d696b47cfc0 req-0c3e8c40-9193-4303-a972-da5bc7fdf5ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Updated VIF entry in instance network info cache for port d184278c-aebf-44ae-b916-815cb5979416. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.111 2 DEBUG nova.network.neutron [req-056df2bf-2d6b-41dd-8137-5d696b47cfc0 req-0c3e8c40-9193-4303-a972-da5bc7fdf5ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Updating instance_info_cache with network_info: [{"id": "d184278c-aebf-44ae-b916-815cb5979416", "address": "fa:16:3e:15:94:15", "network": {"id": "95da58c1-265e-4dd9-ba00-692853005e46", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-603762842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7d693b90ba447196796435b74590f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd184278c-ae", "ovs_interfaceid": "d184278c-aebf-44ae-b916-815cb5979416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.133 2 DEBUG oslo_concurrency.lockutils [req-056df2bf-2d6b-41dd-8137-5d696b47cfc0 req-0c3e8c40-9193-4303-a972-da5bc7fdf5ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:21.246 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:62:4a 10.100.0.2 2001:db8:0:1:f816:3eff:fefb:624a 2001:db8::f816:3eff:fefb:624a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fefb:624a/64 2001:db8::f816:3eff:fefb:624a/64', 'neutron:device_id': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=546080ca-391c-439c-be48-88bb942119c9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e78bd1c4-7546-4ebe-a71b-a49e8c78f36c) old=Port_Binding(mac=['fa:16:3e:fb:62:4a 10.100.0.2 2001:db8::f816:3eff:fefb:624a'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fefb:624a/64', 'neutron:device_id': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:37:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:21.248 103246 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e78bd1c4-7546-4ebe-a71b-a49e8c78f36c in datapath 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 updated#033[00m
Oct  2 08:37:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:21.250 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:37:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:21.251 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a38ea4c4-6b62-4c91-ae0b-2aa7cc9b8d8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.541 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408641.5414374, a489cbb2-1400-41b4-9345-18186b74b900 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.542 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] VM Started (Lifecycle Event)#033[00m
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.544 2 DEBUG nova.compute.manager [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.547 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.550 2 INFO nova.virt.libvirt.driver [-] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Instance spawned successfully.#033[00m
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.550 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.571 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.574 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.575 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.575 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.576 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.576 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.577 2 DEBUG nova.virt.libvirt.driver [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
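The six "Found default" lines above record the libvirt driver filling in hardware image properties that neither the image nor the flavor set explicitly. A minimal, hypothetical sketch of that lookup — the property names and values mirror the log, but this is not Nova's actual implementation:

```python
# Driver-level defaults as seen in the log; an explicit image property wins.
DRIVER_DEFAULTS = {
    "hw_cdrom_bus": "sata",
    "hw_disk_bus": "virtio",
    "hw_input_bus": "usb",
    "hw_pointer_model": "usbtablet",
    "hw_video_model": "virtio",
    "hw_vif_model": "virtio",
}

def resolve_device_defaults(image_properties: dict) -> dict:
    """Return the effective value for each hw_* property: the image's
    own setting if present, otherwise the driver default."""
    return {
        prop: image_properties.get(prop, default)
        for prop, default in DRIVER_DEFAULTS.items()
    }

# An image that pins only the disk bus keeps defaults for everything else.
effective = resolve_device_defaults({"hw_disk_bus": "scsi"})
```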
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.581 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.612 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.613 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408641.5415928, a489cbb2-1400-41b4-9345-18186b74b900 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.613 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] VM Paused (Lifecycle Event)
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.642 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.646 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408641.546426, a489cbb2-1400-41b4-9345-18186b74b900 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.646 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] VM Resumed (Lifecycle Event)
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.662 2 INFO nova.compute.manager [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Took 7.52 seconds to spawn the instance on the hypervisor.
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.662 2 DEBUG nova.compute.manager [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.666 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.674 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.716 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] During sync_power_state the instance has a pending task (spawning). Skip.
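Both "pending task (spawning). Skip." lines show the power-state sync bailing out while a task is still in flight, even though the DB power_state (0) and the hypervisor power_state (1) disagree. A condensed, assumed version of that guard — not Nova's actual `_sync_instance_power_state`:

```python
def should_sync_power_state(task_state, db_power_state, vm_power_state):
    """Hypothetical condensation of the decision visible in the log:
    a pending task (e.g. 'spawning') means the states are expected to be
    in flux, so the sync is skipped; otherwise a DB/VM mismatch would
    trigger a corrective sync."""
    if task_state is not None:
        return False  # "During sync_power_state the instance has a pending task ... Skip."
    return db_power_state != vm_power_state
```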
Oct  2 08:37:21 np0005466012 nova_compute[192063]: 2025-10-02 12:37:21.788 2 INFO nova.compute.manager [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Took 8.10 seconds to build instance.
Oct  2 08:37:22 np0005466012 nova_compute[192063]: 2025-10-02 12:37:22.015 2 DEBUG oslo_concurrency.lockutils [None req-c5cb7a1d-593d-415d-9265-59decd89d832 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.445s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:37:22 np0005466012 nova_compute[192063]: 2025-10-02 12:37:22.642 2 DEBUG nova.compute.manager [req-7bc15741-3bc9-4365-bcae-9b0d17075c7c req-6912c3ce-9e70-456a-a309-fc01b94b31b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Received event network-vif-plugged-d184278c-aebf-44ae-b916-815cb5979416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:37:22 np0005466012 nova_compute[192063]: 2025-10-02 12:37:22.643 2 DEBUG oslo_concurrency.lockutils [req-7bc15741-3bc9-4365-bcae-9b0d17075c7c req-6912c3ce-9e70-456a-a309-fc01b94b31b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a489cbb2-1400-41b4-9345-18186b74b900-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:37:22 np0005466012 nova_compute[192063]: 2025-10-02 12:37:22.644 2 DEBUG oslo_concurrency.lockutils [req-7bc15741-3bc9-4365-bcae-9b0d17075c7c req-6912c3ce-9e70-456a-a309-fc01b94b31b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:37:22 np0005466012 nova_compute[192063]: 2025-10-02 12:37:22.644 2 DEBUG oslo_concurrency.lockutils [req-7bc15741-3bc9-4365-bcae-9b0d17075c7c req-6912c3ce-9e70-456a-a309-fc01b94b31b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:37:22 np0005466012 nova_compute[192063]: 2025-10-02 12:37:22.644 2 DEBUG nova.compute.manager [req-7bc15741-3bc9-4365-bcae-9b0d17075c7c req-6912c3ce-9e70-456a-a309-fc01b94b31b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] No waiting events found dispatching network-vif-plugged-d184278c-aebf-44ae-b916-815cb5979416 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:37:22 np0005466012 nova_compute[192063]: 2025-10-02 12:37:22.645 2 WARNING nova.compute.manager [req-7bc15741-3bc9-4365-bcae-9b0d17075c7c req-6912c3ce-9e70-456a-a309-fc01b94b31b9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Received unexpected event network-vif-plugged-d184278c-aebf-44ae-b916-815cb5979416 for instance with vm_state active and task_state None.
Oct  2 08:37:22 np0005466012 nova_compute[192063]: 2025-10-02 12:37:22.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:23 np0005466012 nova_compute[192063]: 2025-10-02 12:37:23.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:25 np0005466012 nova_compute[192063]: 2025-10-02 12:37:25.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:25 np0005466012 NetworkManager[51207]: <info>  [1759408645.4035] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/313)
Oct  2 08:37:25 np0005466012 NetworkManager[51207]: <info>  [1759408645.4062] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/314)
Oct  2 08:37:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:37:25Z|00669|binding|INFO|Releasing lport ad4e7082-9510-41a9-bc81-de2c66402e98 from this chassis (sb_readonly=0)
Oct  2 08:37:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:37:25Z|00670|binding|INFO|Releasing lport ad4e7082-9510-41a9-bc81-de2c66402e98 from this chassis (sb_readonly=0)
Oct  2 08:37:25 np0005466012 nova_compute[192063]: 2025-10-02 12:37:25.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:26.563 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:37:26 np0005466012 nova_compute[192063]: 2025-10-02 12:37:26.866 2 DEBUG nova.compute.manager [req-a474071e-c6ce-4fe2-b4f3-abbee5322c34 req-772ef9fa-f704-4094-b4ca-50f181b14a2e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Received event network-changed-d184278c-aebf-44ae-b916-815cb5979416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:37:26 np0005466012 nova_compute[192063]: 2025-10-02 12:37:26.868 2 DEBUG nova.compute.manager [req-a474071e-c6ce-4fe2-b4f3-abbee5322c34 req-772ef9fa-f704-4094-b4ca-50f181b14a2e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Refreshing instance network info cache due to event network-changed-d184278c-aebf-44ae-b916-815cb5979416. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:37:26 np0005466012 nova_compute[192063]: 2025-10-02 12:37:26.868 2 DEBUG oslo_concurrency.lockutils [req-a474071e-c6ce-4fe2-b4f3-abbee5322c34 req-772ef9fa-f704-4094-b4ca-50f181b14a2e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:37:26 np0005466012 nova_compute[192063]: 2025-10-02 12:37:26.869 2 DEBUG oslo_concurrency.lockutils [req-a474071e-c6ce-4fe2-b4f3-abbee5322c34 req-772ef9fa-f704-4094-b4ca-50f181b14a2e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:37:26 np0005466012 nova_compute[192063]: 2025-10-02 12:37:26.870 2 DEBUG nova.network.neutron [req-a474071e-c6ce-4fe2-b4f3-abbee5322c34 req-772ef9fa-f704-4094-b4ca-50f181b14a2e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Refreshing network info cache for port d184278c-aebf-44ae-b916-815cb5979416 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:37:27 np0005466012 nova_compute[192063]: 2025-10-02 12:37:27.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:28 np0005466012 nova_compute[192063]: 2025-10-02 12:37:28.101 2 DEBUG nova.network.neutron [req-a474071e-c6ce-4fe2-b4f3-abbee5322c34 req-772ef9fa-f704-4094-b4ca-50f181b14a2e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Updated VIF entry in instance network info cache for port d184278c-aebf-44ae-b916-815cb5979416. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:37:28 np0005466012 nova_compute[192063]: 2025-10-02 12:37:28.102 2 DEBUG nova.network.neutron [req-a474071e-c6ce-4fe2-b4f3-abbee5322c34 req-772ef9fa-f704-4094-b4ca-50f181b14a2e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Updating instance_info_cache with network_info: [{"id": "d184278c-aebf-44ae-b916-815cb5979416", "address": "fa:16:3e:15:94:15", "network": {"id": "95da58c1-265e-4dd9-ba00-692853005e46", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-603762842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7d693b90ba447196796435b74590f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd184278c-ae", "ovs_interfaceid": "d184278c-aebf-44ae-b916-815cb5979416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:37:28 np0005466012 nova_compute[192063]: 2025-10-02 12:37:28.133 2 DEBUG oslo_concurrency.lockutils [req-a474071e-c6ce-4fe2-b4f3-abbee5322c34 req-772ef9fa-f704-4094-b4ca-50f181b14a2e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
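The instance_info_cache update above embeds the full network_info JSON for the port. For reference, extracting the fixed and floating addresses from an entry of that shape — the structure is assumed from the logged payload, not from a documented schema:

```python
def extract_ips(network_info):
    """Collect (kind, address) pairs from a Nova network_info list shaped
    like the logged cache entry: vif -> network -> subnets -> ips."""
    ips = []
    for vif in network_info:
        for subnet in vif["network"]["subnets"]:
            for ip in subnet["ips"]:
                ips.append(("fixed", ip["address"]))
                for fip in ip.get("floating_ips", []):
                    ips.append(("floating", fip["address"]))
    return ips

# Trimmed-down sample mirroring the logged entry.
sample = [{"network": {"subnets": [{"ips": [
    {"address": "10.100.0.8",
     "floating_ips": [{"address": "192.168.122.177"}]}]}]}}]
```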
Oct  2 08:37:28 np0005466012 nova_compute[192063]: 2025-10-02 12:37:28.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:32 np0005466012 podman[247827]: 2025-10-02 12:37:32.149757023 +0000 UTC m=+0.056260996 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Oct  2 08:37:32 np0005466012 podman[247826]: 2025-10-02 12:37:32.158594019 +0000 UTC m=+0.065374491 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2)
Oct  2 08:37:32 np0005466012 nova_compute[192063]: 2025-10-02 12:37:32.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:33 np0005466012 nova_compute[192063]: 2025-10-02 12:37:33.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:34 np0005466012 nova_compute[192063]: 2025-10-02 12:37:34.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:37:34 np0005466012 nova_compute[192063]: 2025-10-02 12:37:34.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:37:34 np0005466012 ovn_controller[94284]: 2025-10-02T12:37:34Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:15:94:15 10.100.0.8
Oct  2 08:37:34 np0005466012 ovn_controller[94284]: 2025-10-02T12:37:34Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:94:15 10.100.0.8
Oct  2 08:37:36 np0005466012 podman[247887]: 2025-10-02 12:37:36.133080195 +0000 UTC m=+0.049835888 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:37:36 np0005466012 podman[247886]: 2025-10-02 12:37:36.14977892 +0000 UTC m=+0.070442812 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:37:37 np0005466012 nova_compute[192063]: 2025-10-02 12:37:37.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:38 np0005466012 nova_compute[192063]: 2025-10-02 12:37:38.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:39 np0005466012 nova_compute[192063]: 2025-10-02 12:37:39.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:37:41 np0005466012 nova_compute[192063]: 2025-10-02 12:37:41.968 2 DEBUG nova.compute.manager [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:37:42 np0005466012 nova_compute[192063]: 2025-10-02 12:37:42.035 2 INFO nova.compute.manager [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] instance snapshotting
Oct  2 08:37:42 np0005466012 nova_compute[192063]: 2025-10-02 12:37:42.444 2 INFO nova.virt.libvirt.driver [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Beginning live snapshot process
Oct  2 08:37:42 np0005466012 virtqemud[191783]: invalid argument: disk vda does not have an active block job
Oct  2 08:37:42 np0005466012 nova_compute[192063]: 2025-10-02 12:37:42.680 2 DEBUG oslo_concurrency.processutils [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:37:42 np0005466012 nova_compute[192063]: 2025-10-02 12:37:42.757 2 DEBUG oslo_concurrency.processutils [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk --force-share --output=json -f qcow2" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:37:42 np0005466012 nova_compute[192063]: 2025-10-02 12:37:42.758 2 DEBUG oslo_concurrency.processutils [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:37:42 np0005466012 nova_compute[192063]: 2025-10-02 12:37:42.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:37:42 np0005466012 nova_compute[192063]: 2025-10-02 12:37:42.851 2 DEBUG oslo_concurrency.processutils [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk --force-share --output=json -f qcow2" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:37:42 np0005466012 nova_compute[192063]: 2025-10-02 12:37:42.862 2 DEBUG oslo_concurrency.processutils [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:37:42 np0005466012 nova_compute[192063]: 2025-10-02 12:37:42.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:37:42 np0005466012 nova_compute[192063]: 2025-10-02 12:37:42.921 2 DEBUG oslo_concurrency.processutils [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:37:42 np0005466012 nova_compute[192063]: 2025-10-02 12:37:42.922 2 DEBUG oslo_concurrency.processutils [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmprw8jrf2d/099cec798dfd472d8b4804bf695d777a.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:37:42 np0005466012 nova_compute[192063]: 2025-10-02 12:37:42.954 2 DEBUG oslo_concurrency.processutils [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/snapshots/tmprw8jrf2d/099cec798dfd472d8b4804bf695d777a.delta 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
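The snapshot path above creates a qcow2 overlay (the `.delta` file) whose backing file is the shared `_base` image, sized to the instance disk (1073741824 bytes). The argument list can be assembled like this — the layout is taken from the logged command, and the helper name is illustrative:

```python
def build_overlay_cmd(backing_file, backing_fmt, overlay_path, size_bytes):
    """Rebuild the `qemu-img create` invocation from the log:
    a qcow2 delta on top of a raw base image."""
    return [
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={backing_file},backing_fmt={backing_fmt}",
        overlay_path, str(size_bytes),
    ]

cmd = build_overlay_cmd(
    "/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955",
    "raw",
    "/var/lib/nova/instances/snapshots/tmprw8jrf2d/099cec798dfd472d8b4804bf695d777a.delta",
    1073741824,
)
```

Passing `backing_fmt` explicitly matters: without it, newer qemu-img refuses to probe the backing image's format.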
Oct  2 08:37:42 np0005466012 nova_compute[192063]: 2025-10-02 12:37:42.955 2 INFO nova.virt.libvirt.driver [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Oct  2 08:37:43 np0005466012 nova_compute[192063]: 2025-10-02 12:37:43.005 2 DEBUG nova.virt.libvirt.guest [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:37:43 np0005466012 nova_compute[192063]: 2025-10-02 12:37:43.508 2 DEBUG nova.virt.libvirt.guest [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Oct  2 08:37:43 np0005466012 nova_compute[192063]: 2025-10-02 12:37:43.511 2 INFO nova.virt.libvirt.driver [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Oct  2 08:37:43 np0005466012 nova_compute[192063]: 2025-10-02 12:37:43.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:43 np0005466012 nova_compute[192063]: 2025-10-02 12:37:43.559 2 DEBUG nova.privsep.utils [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  2 08:37:43 np0005466012 nova_compute[192063]: 2025-10-02 12:37:43.559 2 DEBUG oslo_concurrency.processutils [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmprw8jrf2d/099cec798dfd472d8b4804bf695d777a.delta /var/lib/nova/instances/snapshots/tmprw8jrf2d/099cec798dfd472d8b4804bf695d777a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:44 np0005466012 nova_compute[192063]: 2025-10-02 12:37:44.222 2 DEBUG oslo_concurrency.processutils [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmprw8jrf2d/099cec798dfd472d8b4804bf695d777a.delta /var/lib/nova/instances/snapshots/tmprw8jrf2d/099cec798dfd472d8b4804bf695d777a" returned: 0 in 0.663s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:44 np0005466012 nova_compute[192063]: 2025-10-02 12:37:44.231 2 INFO nova.virt.libvirt.driver [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Snapshot extracted, beginning image upload#033[00m
Oct  2 08:37:44 np0005466012 nova_compute[192063]: 2025-10-02 12:37:44.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:45 np0005466012 nova_compute[192063]: 2025-10-02 12:37:45.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:45 np0005466012 nova_compute[192063]: 2025-10-02 12:37:45.850 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:45 np0005466012 nova_compute[192063]: 2025-10-02 12:37:45.850 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:45 np0005466012 nova_compute[192063]: 2025-10-02 12:37:45.851 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:45 np0005466012 nova_compute[192063]: 2025-10-02 12:37:45.852 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:37:45 np0005466012 nova_compute[192063]: 2025-10-02 12:37:45.921 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:45 np0005466012 nova_compute[192063]: 2025-10-02 12:37:45.985 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:45 np0005466012 nova_compute[192063]: 2025-10-02 12:37:45.986 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.047 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.223 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.225 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5527MB free_disk=73.16495895385742GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.225 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.225 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.315 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance a489cbb2-1400-41b4-9345-18186b74b900 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.316 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.316 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.367 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.383 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.407 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.407 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.865 2 INFO nova.virt.libvirt.driver [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Snapshot image upload complete#033[00m
Oct  2 08:37:46 np0005466012 nova_compute[192063]: 2025-10-02 12:37:46.866 2 INFO nova.compute.manager [None req-7bf4a3bc-908d-4453-8c1a-3f0a6199d069 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Took 4.81 seconds to snapshot the instance on the hypervisor.#033[00m
Oct  2 08:37:47 np0005466012 nova_compute[192063]: 2025-10-02 12:37:47.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:48 np0005466012 nova_compute[192063]: 2025-10-02 12:37:48.408 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:48 np0005466012 nova_compute[192063]: 2025-10-02 12:37:48.409 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:37:48 np0005466012 nova_compute[192063]: 2025-10-02 12:37:48.409 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:37:48 np0005466012 nova_compute[192063]: 2025-10-02 12:37:48.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:49 np0005466012 nova_compute[192063]: 2025-10-02 12:37:49.086 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:37:49 np0005466012 nova_compute[192063]: 2025-10-02 12:37:49.087 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:37:49 np0005466012 nova_compute[192063]: 2025-10-02 12:37:49.087 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:37:49 np0005466012 nova_compute[192063]: 2025-10-02 12:37:49.087 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid a489cbb2-1400-41b4-9345-18186b74b900 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:37:50 np0005466012 podman[247966]: 2025-10-02 12:37:50.159487021 +0000 UTC m=+0.078572648 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Oct  2 08:37:50 np0005466012 podman[247965]: 2025-10-02 12:37:50.16558336 +0000 UTC m=+0.082921548 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:37:51 np0005466012 nova_compute[192063]: 2025-10-02 12:37:51.119 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Updating instance_info_cache with network_info: [{"id": "d184278c-aebf-44ae-b916-815cb5979416", "address": "fa:16:3e:15:94:15", "network": {"id": "95da58c1-265e-4dd9-ba00-692853005e46", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-603762842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7d693b90ba447196796435b74590f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd184278c-ae", "ovs_interfaceid": "d184278c-aebf-44ae-b916-815cb5979416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:37:51 np0005466012 nova_compute[192063]: 2025-10-02 12:37:51.139 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:37:51 np0005466012 nova_compute[192063]: 2025-10-02 12:37:51.140 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:37:51 np0005466012 nova_compute[192063]: 2025-10-02 12:37:51.140 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:51 np0005466012 nova_compute[192063]: 2025-10-02 12:37:51.140 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:37:51 np0005466012 nova_compute[192063]: 2025-10-02 12:37:51.141 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:37:51 np0005466012 podman[248014]: 2025-10-02 12:37:51.143490726 +0000 UTC m=+0.056268066 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:37:51 np0005466012 podman[248013]: 2025-10-02 12:37:51.174446477 +0000 UTC m=+0.090446617 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 08:37:52 np0005466012 nova_compute[192063]: 2025-10-02 12:37:52.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:53 np0005466012 nova_compute[192063]: 2025-10-02 12:37:53.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:57 np0005466012 nova_compute[192063]: 2025-10-02 12:37:57.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:58 np0005466012 nova_compute[192063]: 2025-10-02 12:37:58.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.025 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "691219f9-8828-4b24-b920-b55514d93e3a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.025 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.040 2 DEBUG nova.compute.manager [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.142 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.143 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.150 2 DEBUG nova.virt.hardware [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.150 2 INFO nova.compute.claims [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.348 2 DEBUG nova.compute.provider_tree [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.370 2 DEBUG nova.scheduler.client.report [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.396 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.396 2 DEBUG nova.compute.manager [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.447 2 DEBUG nova.compute.manager [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.447 2 DEBUG nova.network.neutron [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.464 2 INFO nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.484 2 DEBUG nova.compute.manager [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.640 2 DEBUG nova.compute.manager [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.643 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.644 2 INFO nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Creating image(s)#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.645 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "/var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.646 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.647 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.663 2 DEBUG oslo_concurrency.processutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.696 2 DEBUG nova.policy [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.759 2 DEBUG oslo_concurrency.processutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.760 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.761 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.786 2 DEBUG oslo_concurrency.processutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.879 2 DEBUG oslo_concurrency.processutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.880 2 DEBUG oslo_concurrency.processutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.917 2 DEBUG oslo_concurrency.processutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.919 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.920 2 DEBUG oslo_concurrency.processutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:37:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:59.926 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:37:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:37:59.928 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.983 2 DEBUG oslo_concurrency.processutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.984 2 DEBUG nova.virt.disk.api [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:37:59 np0005466012 nova_compute[192063]: 2025-10-02 12:37:59.984 2 DEBUG oslo_concurrency.processutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:00 np0005466012 nova_compute[192063]: 2025-10-02 12:38:00.048 2 DEBUG oslo_concurrency.processutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:00 np0005466012 nova_compute[192063]: 2025-10-02 12:38:00.049 2 DEBUG nova.virt.disk.api [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Cannot resize image /var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:38:00 np0005466012 nova_compute[192063]: 2025-10-02 12:38:00.050 2 DEBUG nova.objects.instance [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid 691219f9-8828-4b24-b920-b55514d93e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:00 np0005466012 nova_compute[192063]: 2025-10-02 12:38:00.080 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:38:00 np0005466012 nova_compute[192063]: 2025-10-02 12:38:00.081 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Ensure instance console log exists: /var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:38:00 np0005466012 nova_compute[192063]: 2025-10-02 12:38:00.082 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:00 np0005466012 nova_compute[192063]: 2025-10-02 12:38:00.082 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:00 np0005466012 nova_compute[192063]: 2025-10-02 12:38:00.083 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:00 np0005466012 nova_compute[192063]: 2025-10-02 12:38:00.807 2 DEBUG nova.network.neutron [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Successfully created port: 440edd3a-154e-4e66-9b54-0719e50ef207 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:38:01 np0005466012 nova_compute[192063]: 2025-10-02 12:38:01.778 2 DEBUG nova.network.neutron [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Successfully updated port: 440edd3a-154e-4e66-9b54-0719e50ef207 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:38:01 np0005466012 nova_compute[192063]: 2025-10-02 12:38:01.799 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "refresh_cache-691219f9-8828-4b24-b920-b55514d93e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:01 np0005466012 nova_compute[192063]: 2025-10-02 12:38:01.799 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquired lock "refresh_cache-691219f9-8828-4b24-b920-b55514d93e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:01 np0005466012 nova_compute[192063]: 2025-10-02 12:38:01.800 2 DEBUG nova.network.neutron [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:38:01 np0005466012 nova_compute[192063]: 2025-10-02 12:38:01.866 2 DEBUG nova.compute.manager [req-80d42c52-d614-4068-8799-6b18fdec2a09 req-952bfd8b-c367-4c39-a105-982bca950b2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Received event network-changed-440edd3a-154e-4e66-9b54-0719e50ef207 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:01 np0005466012 nova_compute[192063]: 2025-10-02 12:38:01.866 2 DEBUG nova.compute.manager [req-80d42c52-d614-4068-8799-6b18fdec2a09 req-952bfd8b-c367-4c39-a105-982bca950b2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Refreshing instance network info cache due to event network-changed-440edd3a-154e-4e66-9b54-0719e50ef207. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:38:01 np0005466012 nova_compute[192063]: 2025-10-02 12:38:01.867 2 DEBUG oslo_concurrency.lockutils [req-80d42c52-d614-4068-8799-6b18fdec2a09 req-952bfd8b-c367-4c39-a105-982bca950b2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-691219f9-8828-4b24-b920-b55514d93e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:01 np0005466012 nova_compute[192063]: 2025-10-02 12:38:01.941 2 DEBUG nova.network.neutron [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:38:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:02.153 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:02.153 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:02.154 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:02 np0005466012 nova_compute[192063]: 2025-10-02 12:38:02.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:03 np0005466012 podman[248067]: 2025-10-02 12:38:03.1729333 +0000 UTC m=+0.078177256 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, name=ubi9-minimal, config_id=edpm, managed_by=edpm_ansible)
Oct  2 08:38:03 np0005466012 podman[248066]: 2025-10-02 12:38:03.18409036 +0000 UTC m=+0.087098474 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:38:03 np0005466012 nova_compute[192063]: 2025-10-02 12:38:03.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.101 2 DEBUG nova.network.neutron [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Updating instance_info_cache with network_info: [{"id": "440edd3a-154e-4e66-9b54-0719e50ef207", "address": "fa:16:3e:14:98:62", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap440edd3a-15", "ovs_interfaceid": "440edd3a-154e-4e66-9b54-0719e50ef207", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.120 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Releasing lock "refresh_cache-691219f9-8828-4b24-b920-b55514d93e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.120 2 DEBUG nova.compute.manager [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Instance network_info: |[{"id": "440edd3a-154e-4e66-9b54-0719e50ef207", "address": "fa:16:3e:14:98:62", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap440edd3a-15", "ovs_interfaceid": "440edd3a-154e-4e66-9b54-0719e50ef207", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.120 2 DEBUG oslo_concurrency.lockutils [req-80d42c52-d614-4068-8799-6b18fdec2a09 req-952bfd8b-c367-4c39-a105-982bca950b2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-691219f9-8828-4b24-b920-b55514d93e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.121 2 DEBUG nova.network.neutron [req-80d42c52-d614-4068-8799-6b18fdec2a09 req-952bfd8b-c367-4c39-a105-982bca950b2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Refreshing network info cache for port 440edd3a-154e-4e66-9b54-0719e50ef207 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.125 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Start _get_guest_xml network_info=[{"id": "440edd3a-154e-4e66-9b54-0719e50ef207", "address": "fa:16:3e:14:98:62", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap440edd3a-15", "ovs_interfaceid": "440edd3a-154e-4e66-9b54-0719e50ef207", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.131 2 WARNING nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.136 2 DEBUG nova.virt.libvirt.host [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.137 2 DEBUG nova.virt.libvirt.host [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.144 2 DEBUG nova.virt.libvirt.host [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.144 2 DEBUG nova.virt.libvirt.host [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.146 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.146 2 DEBUG nova.virt.hardware [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.147 2 DEBUG nova.virt.hardware [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.147 2 DEBUG nova.virt.hardware [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.147 2 DEBUG nova.virt.hardware [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.148 2 DEBUG nova.virt.hardware [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.148 2 DEBUG nova.virt.hardware [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.148 2 DEBUG nova.virt.hardware [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.149 2 DEBUG nova.virt.hardware [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.149 2 DEBUG nova.virt.hardware [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.149 2 DEBUG nova.virt.hardware [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.150 2 DEBUG nova.virt.hardware [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.155 2 DEBUG nova.virt.libvirt.vif [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:37:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-610040079',display_name='tempest-TestGettingAddress-server-610040079',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-610040079',id=165,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgra/wXYI+rCsG5upBBPiIDSxRkzAR1A6pFxaSU1LPFfL3D5RfEN0Sz4k+PeFJCJFhU6eEresOI7XeTo6tERj3riWEwLSsbwiPk4PW1j9Dz/nyAQSV9AMMgUGCskGAq1Q==',key_name='tempest-TestGettingAddress-228200713',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-wfibqf3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:37:59Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=691219f9-8828-4b24-b920-b55514d93e3a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "440edd3a-154e-4e66-9b54-0719e50ef207", "address": "fa:16:3e:14:98:62", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap440edd3a-15", "ovs_interfaceid": "440edd3a-154e-4e66-9b54-0719e50ef207", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.155 2 DEBUG nova.network.os_vif_util [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "440edd3a-154e-4e66-9b54-0719e50ef207", "address": "fa:16:3e:14:98:62", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap440edd3a-15", "ovs_interfaceid": "440edd3a-154e-4e66-9b54-0719e50ef207", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.157 2 DEBUG nova.network.os_vif_util [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:98:62,bridge_name='br-int',has_traffic_filtering=True,id=440edd3a-154e-4e66-9b54-0719e50ef207,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap440edd3a-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.158 2 DEBUG nova.objects.instance [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 691219f9-8828-4b24-b920-b55514d93e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.174 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  <uuid>691219f9-8828-4b24-b920-b55514d93e3a</uuid>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  <name>instance-000000a5</name>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestGettingAddress-server-610040079</nova:name>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:38:04</nova:creationTime>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:38:04 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:        <nova:user uuid="97ce9f1898484e0e9a1f7c84a9f0dfe3">tempest-TestGettingAddress-1355720650-project-member</nova:user>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:        <nova:project uuid="fd801958556f4c8aab047ecdef6b5ee8">tempest-TestGettingAddress-1355720650</nova:project>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:        <nova:port uuid="440edd3a-154e-4e66-9b54-0719e50ef207">
Oct  2 08:38:04 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe14:9862" ipVersion="6"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe14:9862" ipVersion="6"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <entry name="serial">691219f9-8828-4b24-b920-b55514d93e3a</entry>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <entry name="uuid">691219f9-8828-4b24-b920-b55514d93e3a</entry>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk.config"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:14:98:62"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <target dev="tap440edd3a-15"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/console.log" append="off"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:38:04 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:38:04 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:38:04 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:38:04 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.176 2 DEBUG nova.compute.manager [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Preparing to wait for external event network-vif-plugged-440edd3a-154e-4e66-9b54-0719e50ef207 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.176 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "691219f9-8828-4b24-b920-b55514d93e3a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.177 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.177 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.178 2 DEBUG nova.virt.libvirt.vif [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:37:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-610040079',display_name='tempest-TestGettingAddress-server-610040079',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-610040079',id=165,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgra/wXYI+rCsG5upBBPiIDSxRkzAR1A6pFxaSU1LPFfL3D5RfEN0Sz4k+PeFJCJFhU6eEresOI7XeTo6tERj3riWEwLSsbwiPk4PW1j9Dz/nyAQSV9AMMgUGCskGAq1Q==',key_name='tempest-TestGettingAddress-228200713',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-wfibqf3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:37:59Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=691219f9-8828-4b24-b920-b55514d93e3a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "440edd3a-154e-4e66-9b54-0719e50ef207", "address": "fa:16:3e:14:98:62", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap440edd3a-15", "ovs_interfaceid": "440edd3a-154e-4e66-9b54-0719e50ef207", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.178 2 DEBUG nova.network.os_vif_util [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "440edd3a-154e-4e66-9b54-0719e50ef207", "address": "fa:16:3e:14:98:62", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap440edd3a-15", "ovs_interfaceid": "440edd3a-154e-4e66-9b54-0719e50ef207", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.179 2 DEBUG nova.network.os_vif_util [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:98:62,bridge_name='br-int',has_traffic_filtering=True,id=440edd3a-154e-4e66-9b54-0719e50ef207,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap440edd3a-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.179 2 DEBUG os_vif [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:98:62,bridge_name='br-int',has_traffic_filtering=True,id=440edd3a-154e-4e66-9b54-0719e50ef207,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap440edd3a-15') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.180 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.181 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.184 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap440edd3a-15, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.184 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap440edd3a-15, col_values=(('external_ids', {'iface-id': '440edd3a-154e-4e66-9b54-0719e50ef207', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:14:98:62', 'vm-uuid': '691219f9-8828-4b24-b920-b55514d93e3a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:04 np0005466012 NetworkManager[51207]: <info>  [1759408684.1880] manager: (tap440edd3a-15): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.193 2 INFO os_vif [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:98:62,bridge_name='br-int',has_traffic_filtering=True,id=440edd3a-154e-4e66-9b54-0719e50ef207,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap440edd3a-15')#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.237 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.237 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.238 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:14:98:62, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.238 2 INFO nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Using config drive#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.669 2 INFO nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Creating config drive at /var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk.config#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.677 2 DEBUG oslo_concurrency.processutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp60toyru2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.804 2 DEBUG oslo_concurrency.processutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp60toyru2" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:38:04 np0005466012 kernel: tap440edd3a-15: entered promiscuous mode
Oct  2 08:38:04 np0005466012 NetworkManager[51207]: <info>  [1759408684.8598] manager: (tap440edd3a-15): new Tun device (/org/freedesktop/NetworkManager/Devices/316)
Oct  2 08:38:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:04Z|00671|binding|INFO|Claiming lport 440edd3a-154e-4e66-9b54-0719e50ef207 for this chassis.
Oct  2 08:38:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:04Z|00672|binding|INFO|440edd3a-154e-4e66-9b54-0719e50ef207: Claiming fa:16:3e:14:98:62 10.100.0.3 2001:db8:0:1:f816:3eff:fe14:9862 2001:db8::f816:3eff:fe14:9862
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:04.871 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:98:62 10.100.0.3 2001:db8:0:1:f816:3eff:fe14:9862 2001:db8::f816:3eff:fe14:9862'], port_security=['fa:16:3e:14:98:62 10.100.0.3 2001:db8:0:1:f816:3eff:fe14:9862 2001:db8::f816:3eff:fe14:9862'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8:0:1:f816:3eff:fe14:9862/64 2001:db8::f816:3eff:fe14:9862/64', 'neutron:device_id': '691219f9-8828-4b24-b920-b55514d93e3a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '38601fe0-d139-4a59-b46e-238283b5fcdd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=546080ca-391c-439c-be48-88bb942119c9, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=440edd3a-154e-4e66-9b54-0719e50ef207) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:38:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:04.872 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 440edd3a-154e-4e66-9b54-0719e50ef207 in datapath 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 bound to our chassis#033[00m
Oct  2 08:38:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:04.874 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6#033[00m
Oct  2 08:38:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:04Z|00673|binding|INFO|Setting lport 440edd3a-154e-4e66-9b54-0719e50ef207 ovn-installed in OVS
Oct  2 08:38:04 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:04Z|00674|binding|INFO|Setting lport 440edd3a-154e-4e66-9b54-0719e50ef207 up in Southbound
Oct  2 08:38:04 np0005466012 nova_compute[192063]: 2025-10-02 12:38:04.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:04.883 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d7853f5f-9c83-44e7-818e-ba975756eadc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:04.884 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1d7388dd-d1 in ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:38:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:04.887 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1d7388dd-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:38:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:04.887 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[753cc776-a6d2-4f7c-a659-05b53ec6fe6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:04.888 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2d88bc71-3f70-4d62-991e-74b7f9e51b1b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:04 np0005466012 systemd-udevd[248125]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:38:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:04.901 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[7c85dcb2-6dcc-4262-8f51-6bfdd8853c17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:04 np0005466012 systemd-machined[152114]: New machine qemu-76-instance-000000a5.
Oct  2 08:38:04 np0005466012 NetworkManager[51207]: <info>  [1759408684.9111] device (tap440edd3a-15): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:38:04 np0005466012 NetworkManager[51207]: <info>  [1759408684.9117] device (tap440edd3a-15): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:38:04 np0005466012 systemd[1]: Started Virtual Machine qemu-76-instance-000000a5.
Oct  2 08:38:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:04.927 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[174af3b0-8d67-43eb-807b-1b00cd6a24f9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:04.962 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[09535cf2-bc1e-44a5-b462-ff3fd4624d1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:04 np0005466012 NetworkManager[51207]: <info>  [1759408684.9703] manager: (tap1d7388dd-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/317)
Oct  2 08:38:04 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:04.969 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7749ea20-f67f-403b-a687-f9ee50115771]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.007 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[161abfd4-5c57-4646-a3cb-771568a474fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.010 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[2d732135-e5f8-4e5b-ac90-194ebd0b4a46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:05 np0005466012 NetworkManager[51207]: <info>  [1759408685.0355] device (tap1d7388dd-d0): carrier: link connected
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.039 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[bf3a42d2-1f28-4635-8aeb-dac37395e0ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.057 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0c1355c5-315a-4c7c-ac5b-bc5c2bac58a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d7388dd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:62:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 667864, 'reachable_time': 25404, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248157, 'error': None, 'target': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.072 2 DEBUG nova.compute.manager [req-aad06405-b5fb-4cb2-955c-f7b6b43f8e7f req-528e742d-04c6-497d-ae37-f4729f4243d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Received event network-vif-plugged-440edd3a-154e-4e66-9b54-0719e50ef207 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.072 2 DEBUG oslo_concurrency.lockutils [req-aad06405-b5fb-4cb2-955c-f7b6b43f8e7f req-528e742d-04c6-497d-ae37-f4729f4243d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "691219f9-8828-4b24-b920-b55514d93e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.072 2 DEBUG oslo_concurrency.lockutils [req-aad06405-b5fb-4cb2-955c-f7b6b43f8e7f req-528e742d-04c6-497d-ae37-f4729f4243d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.073 2 DEBUG oslo_concurrency.lockutils [req-aad06405-b5fb-4cb2-955c-f7b6b43f8e7f req-528e742d-04c6-497d-ae37-f4729f4243d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.073 2 DEBUG nova.compute.manager [req-aad06405-b5fb-4cb2-955c-f7b6b43f8e7f req-528e742d-04c6-497d-ae37-f4729f4243d7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Processing event network-vif-plugged-440edd3a-154e-4e66-9b54-0719e50ef207 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.079 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd176ce-e995-4af1-bb75-566e68f9b6a5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefb:624a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 667864, 'tstamp': 667864}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248158, 'error': None, 'target': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.094 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fb2c4559-7e32-4a0d-832d-3998fec97152]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d7388dd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:62:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 667864, 'reachable_time': 25404, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248159, 'error': None, 'target': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.128 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[66ea04fc-2e5d-4a99-8a0d-60b51c3ff8e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.187 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a243c430-46b0-4294-b046-e3b714007c99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.188 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d7388dd-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.189 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.189 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d7388dd-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:05 np0005466012 NetworkManager[51207]: <info>  [1759408685.1919] manager: (tap1d7388dd-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/318)
Oct  2 08:38:05 np0005466012 kernel: tap1d7388dd-d0: entered promiscuous mode
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.195 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1d7388dd-d0, col_values=(('external_ids', {'iface-id': 'e78bd1c4-7546-4ebe-a71b-a49e8c78f36c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:38:05 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:05Z|00675|binding|INFO|Releasing lport e78bd1c4-7546-4ebe-a71b-a49e8c78f36c from this chassis (sb_readonly=0)
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.209 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.209 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[738bf5db-01bd-4386-b5ab-98de054412c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.210 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6.pid.haproxy
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.211 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'env', 'PROCESS_TAG=haproxy-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  2 08:38:05 np0005466012 podman[248198]: 2025-10-02 12:38:05.598275956 +0000 UTC m=+0.065042251 container create 66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  2 08:38:05 np0005466012 systemd[1]: Started libpod-conmon-66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5.scope.
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.636 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408685.6357217, 691219f9-8828-4b24-b920-b55514d93e3a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.636 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] VM Started (Lifecycle Event)
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.640 2 DEBUG nova.compute.manager [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:38:05 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.645 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.648 2 INFO nova.virt.libvirt.driver [-] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Instance spawned successfully.
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.648 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:38:05 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2fa0d37679a61fa969982cab58b71cff27239354389d6977e9d3e9226d990e6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:38:05 np0005466012 podman[248198]: 2025-10-02 12:38:05.664408475 +0000 UTC m=+0.131174800 container init 66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 08:38:05 np0005466012 podman[248198]: 2025-10-02 12:38:05.573793065 +0000 UTC m=+0.040559390 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:38:05 np0005466012 podman[248198]: 2025-10-02 12:38:05.669633901 +0000 UTC m=+0.136400216 container start 66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.674 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.680 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.685 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.686 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.686 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.687 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.687 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.687 2 DEBUG nova.virt.libvirt.driver [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:38:05 np0005466012 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[248214]: [NOTICE]   (248218) : New worker (248220) forked
Oct  2 08:38:05 np0005466012 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[248214]: [NOTICE]   (248218) : Loading success.
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.715 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.716 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408685.635805, 691219f9-8828-4b24-b920-b55514d93e3a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.716 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] VM Paused (Lifecycle Event)
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.747 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.751 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408685.643889, 691219f9-8828-4b24-b920-b55514d93e3a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.752 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] VM Resumed (Lifecycle Event)
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.777 2 INFO nova.compute.manager [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Took 6.14 seconds to spawn the instance on the hypervisor.
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.778 2 DEBUG nova.compute.manager [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.780 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.789 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.832 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.879 2 INFO nova.compute.manager [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Took 6.78 seconds to build instance.
Oct  2 08:38:05 np0005466012 nova_compute[192063]: 2025-10-02 12:38:05.896 2 DEBUG oslo_concurrency.lockutils [None req-04ac60bc-0d59-453d-bf77-a22d6c151cca 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:38:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:05.930 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:38:07 np0005466012 podman[248230]: 2025-10-02 12:38:07.162832984 +0000 UTC m=+0.063331643 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:38:07 np0005466012 podman[248229]: 2025-10-02 12:38:07.182677866 +0000 UTC m=+0.080193432 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:38:07 np0005466012 nova_compute[192063]: 2025-10-02 12:38:07.190 2 DEBUG nova.compute.manager [req-063a2a3e-374d-4b07-8c0f-1eb628101bf9 req-f914c7ce-095a-4420-9b19-34ab69b01066 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Received event network-vif-plugged-440edd3a-154e-4e66-9b54-0719e50ef207 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:38:07 np0005466012 nova_compute[192063]: 2025-10-02 12:38:07.190 2 DEBUG oslo_concurrency.lockutils [req-063a2a3e-374d-4b07-8c0f-1eb628101bf9 req-f914c7ce-095a-4420-9b19-34ab69b01066 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "691219f9-8828-4b24-b920-b55514d93e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:38:07 np0005466012 nova_compute[192063]: 2025-10-02 12:38:07.191 2 DEBUG oslo_concurrency.lockutils [req-063a2a3e-374d-4b07-8c0f-1eb628101bf9 req-f914c7ce-095a-4420-9b19-34ab69b01066 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:38:07 np0005466012 nova_compute[192063]: 2025-10-02 12:38:07.191 2 DEBUG oslo_concurrency.lockutils [req-063a2a3e-374d-4b07-8c0f-1eb628101bf9 req-f914c7ce-095a-4420-9b19-34ab69b01066 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:38:07 np0005466012 nova_compute[192063]: 2025-10-02 12:38:07.191 2 DEBUG nova.compute.manager [req-063a2a3e-374d-4b07-8c0f-1eb628101bf9 req-f914c7ce-095a-4420-9b19-34ab69b01066 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] No waiting events found dispatching network-vif-plugged-440edd3a-154e-4e66-9b54-0719e50ef207 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:38:07 np0005466012 nova_compute[192063]: 2025-10-02 12:38:07.191 2 WARNING nova.compute.manager [req-063a2a3e-374d-4b07-8c0f-1eb628101bf9 req-f914c7ce-095a-4420-9b19-34ab69b01066 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Received unexpected event network-vif-plugged-440edd3a-154e-4e66-9b54-0719e50ef207 for instance with vm_state active and task_state None.
Oct  2 08:38:08 np0005466012 nova_compute[192063]: 2025-10-02 12:38:08.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:08 np0005466012 nova_compute[192063]: 2025-10-02 12:38:08.100 2 DEBUG nova.network.neutron [req-80d42c52-d614-4068-8799-6b18fdec2a09 req-952bfd8b-c367-4c39-a105-982bca950b2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Updated VIF entry in instance network info cache for port 440edd3a-154e-4e66-9b54-0719e50ef207. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:38:08 np0005466012 nova_compute[192063]: 2025-10-02 12:38:08.101 2 DEBUG nova.network.neutron [req-80d42c52-d614-4068-8799-6b18fdec2a09 req-952bfd8b-c367-4c39-a105-982bca950b2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Updating instance_info_cache with network_info: [{"id": "440edd3a-154e-4e66-9b54-0719e50ef207", "address": "fa:16:3e:14:98:62", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap440edd3a-15", "ovs_interfaceid": "440edd3a-154e-4e66-9b54-0719e50ef207", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:08 np0005466012 nova_compute[192063]: 2025-10-02 12:38:08.115 2 DEBUG oslo_concurrency.lockutils [req-80d42c52-d614-4068-8799-6b18fdec2a09 req-952bfd8b-c367-4c39-a105-982bca950b2d 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-691219f9-8828-4b24-b920-b55514d93e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:38:09 np0005466012 nova_compute[192063]: 2025-10-02 12:38:09.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:12 np0005466012 nova_compute[192063]: 2025-10-02 12:38:12.250 2 DEBUG nova.compute.manager [req-845d8f16-0927-4954-87f5-a66d964936a9 req-2378ad28-5a1a-4dd3-9c8a-d7823f56e78b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Received event network-changed-440edd3a-154e-4e66-9b54-0719e50ef207 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:12 np0005466012 nova_compute[192063]: 2025-10-02 12:38:12.250 2 DEBUG nova.compute.manager [req-845d8f16-0927-4954-87f5-a66d964936a9 req-2378ad28-5a1a-4dd3-9c8a-d7823f56e78b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Refreshing instance network info cache due to event network-changed-440edd3a-154e-4e66-9b54-0719e50ef207. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:38:12 np0005466012 nova_compute[192063]: 2025-10-02 12:38:12.250 2 DEBUG oslo_concurrency.lockutils [req-845d8f16-0927-4954-87f5-a66d964936a9 req-2378ad28-5a1a-4dd3-9c8a-d7823f56e78b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-691219f9-8828-4b24-b920-b55514d93e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:12 np0005466012 nova_compute[192063]: 2025-10-02 12:38:12.251 2 DEBUG oslo_concurrency.lockutils [req-845d8f16-0927-4954-87f5-a66d964936a9 req-2378ad28-5a1a-4dd3-9c8a-d7823f56e78b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-691219f9-8828-4b24-b920-b55514d93e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:12 np0005466012 nova_compute[192063]: 2025-10-02 12:38:12.251 2 DEBUG nova.network.neutron [req-845d8f16-0927-4954-87f5-a66d964936a9 req-2378ad28-5a1a-4dd3-9c8a-d7823f56e78b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Refreshing network info cache for port 440edd3a-154e-4e66-9b54-0719e50ef207 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:38:13 np0005466012 nova_compute[192063]: 2025-10-02 12:38:13.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:14 np0005466012 nova_compute[192063]: 2025-10-02 12:38:14.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:14 np0005466012 nova_compute[192063]: 2025-10-02 12:38:14.541 2 DEBUG nova.network.neutron [req-845d8f16-0927-4954-87f5-a66d964936a9 req-2378ad28-5a1a-4dd3-9c8a-d7823f56e78b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Updated VIF entry in instance network info cache for port 440edd3a-154e-4e66-9b54-0719e50ef207. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:38:14 np0005466012 nova_compute[192063]: 2025-10-02 12:38:14.542 2 DEBUG nova.network.neutron [req-845d8f16-0927-4954-87f5-a66d964936a9 req-2378ad28-5a1a-4dd3-9c8a-d7823f56e78b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Updating instance_info_cache with network_info: [{"id": "440edd3a-154e-4e66-9b54-0719e50ef207", "address": "fa:16:3e:14:98:62", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap440edd3a-15", "ovs_interfaceid": "440edd3a-154e-4e66-9b54-0719e50ef207", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:14 np0005466012 nova_compute[192063]: 2025-10-02 12:38:14.565 2 DEBUG oslo_concurrency.lockutils [req-845d8f16-0927-4954-87f5-a66d964936a9 req-2378ad28-5a1a-4dd3-9c8a-d7823f56e78b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-691219f9-8828-4b24-b920-b55514d93e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:38:18 np0005466012 nova_compute[192063]: 2025-10-02 12:38:18.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:18 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:18Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:14:98:62 10.100.0.3
Oct  2 08:38:18 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:18Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:14:98:62 10.100.0.3
Oct  2 08:38:19 np0005466012 nova_compute[192063]: 2025-10-02 12:38:19.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:21 np0005466012 podman[248281]: 2025-10-02 12:38:21.130613588 +0000 UTC m=+0.045354943 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:38:21 np0005466012 podman[248282]: 2025-10-02 12:38:21.168004238 +0000 UTC m=+0.079246026 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:38:21 np0005466012 podman[248332]: 2025-10-02 12:38:21.255555114 +0000 UTC m=+0.062281003 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:38:21 np0005466012 podman[248333]: 2025-10-02 12:38:21.257940491 +0000 UTC m=+0.063194390 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm)
Oct  2 08:38:23 np0005466012 nova_compute[192063]: 2025-10-02 12:38:23.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:24 np0005466012 nova_compute[192063]: 2025-10-02 12:38:24.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:28 np0005466012 nova_compute[192063]: 2025-10-02 12:38:28.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:29 np0005466012 nova_compute[192063]: 2025-10-02 12:38:29.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:30Z|00676|binding|INFO|Releasing lport ad4e7082-9510-41a9-bc81-de2c66402e98 from this chassis (sb_readonly=0)
Oct  2 08:38:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:30Z|00677|binding|INFO|Releasing lport e78bd1c4-7546-4ebe-a71b-a49e8c78f36c from this chassis (sb_readonly=0)
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.532 2 DEBUG nova.compute.manager [req-9ad7b486-9a10-4e68-8d6d-e996f3a74bc2 req-0890b6e3-33e4-4a01-a0f9-f68068e6f870 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Received event network-changed-440edd3a-154e-4e66-9b54-0719e50ef207 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.533 2 DEBUG nova.compute.manager [req-9ad7b486-9a10-4e68-8d6d-e996f3a74bc2 req-0890b6e3-33e4-4a01-a0f9-f68068e6f870 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Refreshing instance network info cache due to event network-changed-440edd3a-154e-4e66-9b54-0719e50ef207. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.533 2 DEBUG oslo_concurrency.lockutils [req-9ad7b486-9a10-4e68-8d6d-e996f3a74bc2 req-0890b6e3-33e4-4a01-a0f9-f68068e6f870 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-691219f9-8828-4b24-b920-b55514d93e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.534 2 DEBUG oslo_concurrency.lockutils [req-9ad7b486-9a10-4e68-8d6d-e996f3a74bc2 req-0890b6e3-33e4-4a01-a0f9-f68068e6f870 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-691219f9-8828-4b24-b920-b55514d93e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.534 2 DEBUG nova.network.neutron [req-9ad7b486-9a10-4e68-8d6d-e996f3a74bc2 req-0890b6e3-33e4-4a01-a0f9-f68068e6f870 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Refreshing network info cache for port 440edd3a-154e-4e66-9b54-0719e50ef207 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.619 2 DEBUG oslo_concurrency.lockutils [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "691219f9-8828-4b24-b920-b55514d93e3a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.619 2 DEBUG oslo_concurrency.lockutils [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.619 2 DEBUG oslo_concurrency.lockutils [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "691219f9-8828-4b24-b920-b55514d93e3a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.619 2 DEBUG oslo_concurrency.lockutils [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.620 2 DEBUG oslo_concurrency.lockutils [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.630 2 INFO nova.compute.manager [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Terminating instance#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.640 2 DEBUG nova.compute.manager [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:38:30 np0005466012 kernel: tap440edd3a-15 (unregistering): left promiscuous mode
Oct  2 08:38:30 np0005466012 NetworkManager[51207]: <info>  [1759408710.6635] device (tap440edd3a-15): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:38:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:30Z|00678|binding|INFO|Releasing lport 440edd3a-154e-4e66-9b54-0719e50ef207 from this chassis (sb_readonly=0)
Oct  2 08:38:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:30Z|00679|binding|INFO|Setting lport 440edd3a-154e-4e66-9b54-0719e50ef207 down in Southbound
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:30Z|00680|binding|INFO|Removing iface tap440edd3a-15 ovn-installed in OVS
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:30.676 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:98:62 10.100.0.3 2001:db8:0:1:f816:3eff:fe14:9862 2001:db8::f816:3eff:fe14:9862'], port_security=['fa:16:3e:14:98:62 10.100.0.3 2001:db8:0:1:f816:3eff:fe14:9862 2001:db8::f816:3eff:fe14:9862'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8:0:1:f816:3eff:fe14:9862/64 2001:db8::f816:3eff:fe14:9862/64', 'neutron:device_id': '691219f9-8828-4b24-b920-b55514d93e3a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '38601fe0-d139-4a59-b46e-238283b5fcdd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=546080ca-391c-439c-be48-88bb942119c9, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=440edd3a-154e-4e66-9b54-0719e50ef207) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:38:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:30.678 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 440edd3a-154e-4e66-9b54-0719e50ef207 in datapath 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 unbound from our chassis#033[00m
Oct  2 08:38:30 np0005466012 systemd[1]: Starting dnf makecache...
Oct  2 08:38:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:30.681 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:30.682 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3ea4beb3-bf8c-4cb3-a0de-f24d9adcc27f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:30.683 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 namespace which is not needed anymore#033[00m
Oct  2 08:38:30 np0005466012 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a5.scope: Deactivated successfully.
Oct  2 08:38:30 np0005466012 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a5.scope: Consumed 13.518s CPU time.
Oct  2 08:38:30 np0005466012 systemd-machined[152114]: Machine qemu-76-instance-000000a5 terminated.
Oct  2 08:38:30 np0005466012 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[248214]: [NOTICE]   (248218) : haproxy version is 2.8.14-c23fe91
Oct  2 08:38:30 np0005466012 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[248214]: [NOTICE]   (248218) : path to executable is /usr/sbin/haproxy
Oct  2 08:38:30 np0005466012 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[248214]: [WARNING]  (248218) : Exiting Master process...
Oct  2 08:38:30 np0005466012 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[248214]: [ALERT]    (248218) : Current worker (248220) exited with code 143 (Terminated)
Oct  2 08:38:30 np0005466012 neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6[248214]: [WARNING]  (248218) : All workers exited. Exiting... (0)
Oct  2 08:38:30 np0005466012 systemd[1]: libpod-66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5.scope: Deactivated successfully.
Oct  2 08:38:30 np0005466012 podman[248397]: 2025-10-02 12:38:30.829407183 +0000 UTC m=+0.045025194 container died 66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:38:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5-userdata-shm.mount: Deactivated successfully.
Oct  2 08:38:30 np0005466012 systemd[1]: var-lib-containers-storage-overlay-b2fa0d37679a61fa969982cab58b71cff27239354389d6977e9d3e9226d990e6-merged.mount: Deactivated successfully.
Oct  2 08:38:30 np0005466012 podman[248397]: 2025-10-02 12:38:30.881285787 +0000 UTC m=+0.096903798 container cleanup 66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:38:30 np0005466012 systemd[1]: libpod-conmon-66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5.scope: Deactivated successfully.
Oct  2 08:38:30 np0005466012 dnf[248372]: Metadata cache refreshed recently.
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.913 2 INFO nova.virt.libvirt.driver [-] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Instance destroyed successfully.#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.913 2 DEBUG nova.objects.instance [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'resources' on Instance uuid 691219f9-8828-4b24-b920-b55514d93e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:30 np0005466012 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct  2 08:38:30 np0005466012 systemd[1]: Finished dnf makecache.
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.944 2 DEBUG nova.virt.libvirt.vif [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:37:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-610040079',display_name='tempest-TestGettingAddress-server-610040079',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-610040079',id=165,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgra/wXYI+rCsG5upBBPiIDSxRkzAR1A6pFxaSU1LPFfL3D5RfEN0Sz4k+PeFJCJFhU6eEresOI7XeTo6tERj3riWEwLSsbwiPk4PW1j9Dz/nyAQSV9AMMgUGCskGAq1Q==',key_name='tempest-TestGettingAddress-228200713',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:38:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-wfibqf3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:38:05Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=691219f9-8828-4b24-b920-b55514d93e3a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "440edd3a-154e-4e66-9b54-0719e50ef207", "address": "fa:16:3e:14:98:62", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap440edd3a-15", "ovs_interfaceid": "440edd3a-154e-4e66-9b54-0719e50ef207", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.945 2 DEBUG nova.network.os_vif_util [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "440edd3a-154e-4e66-9b54-0719e50ef207", "address": "fa:16:3e:14:98:62", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap440edd3a-15", "ovs_interfaceid": "440edd3a-154e-4e66-9b54-0719e50ef207", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:30 np0005466012 podman[248442]: 2025-10-02 12:38:30.945438192 +0000 UTC m=+0.042479833 container remove 66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.947 2 DEBUG nova.network.os_vif_util [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:14:98:62,bridge_name='br-int',has_traffic_filtering=True,id=440edd3a-154e-4e66-9b54-0719e50ef207,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap440edd3a-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.947 2 DEBUG os_vif [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:14:98:62,bridge_name='br-int',has_traffic_filtering=True,id=440edd3a-154e-4e66-9b54-0719e50ef207,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap440edd3a-15') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.950 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap440edd3a-15, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:30.951 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0713dbef-e82f-4fa4-9bba-3eb96f8737d7]: (4, ('Thu Oct  2 12:38:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 (66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5)\n66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5\nThu Oct  2 12:38:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 (66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5)\n66bef1452274f08b8e33b8575f9295bd29ed0e08d4cbcc454203a4bf711f8fc5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:30.953 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[756cb62d-fb8d-4b59-9a28-deca68389844]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:30.954 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d7388dd-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:30 np0005466012 kernel: tap1d7388dd-d0: left promiscuous mode
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.998 2 INFO os_vif [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:14:98:62,bridge_name='br-int',has_traffic_filtering=True,id=440edd3a-154e-4e66-9b54-0719e50ef207,network=Network(1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap440edd3a-15')#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.998 2 INFO nova.virt.libvirt.driver [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Deleting instance files /var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a_del#033[00m
Oct  2 08:38:30 np0005466012 nova_compute[192063]: 2025-10-02 12:38:30.999 2 INFO nova.virt.libvirt.driver [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Deletion of /var/lib/nova/instances/691219f9-8828-4b24-b920-b55514d93e3a_del complete#033[00m
Oct  2 08:38:30 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:30.999 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0035b54f-096f-408c-a68c-e9e8c0551be7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:31 np0005466012 nova_compute[192063]: 2025-10-02 12:38:31.011 2 DEBUG nova.compute.manager [req-3943906a-ecaf-4f52-81bc-199c80eacee3 req-6871f378-480d-4800-9482-50e7a4a63b2a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Received event network-vif-unplugged-440edd3a-154e-4e66-9b54-0719e50ef207 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:31 np0005466012 nova_compute[192063]: 2025-10-02 12:38:31.012 2 DEBUG oslo_concurrency.lockutils [req-3943906a-ecaf-4f52-81bc-199c80eacee3 req-6871f378-480d-4800-9482-50e7a4a63b2a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "691219f9-8828-4b24-b920-b55514d93e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:31 np0005466012 nova_compute[192063]: 2025-10-02 12:38:31.012 2 DEBUG oslo_concurrency.lockutils [req-3943906a-ecaf-4f52-81bc-199c80eacee3 req-6871f378-480d-4800-9482-50e7a4a63b2a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:31 np0005466012 nova_compute[192063]: 2025-10-02 12:38:31.012 2 DEBUG oslo_concurrency.lockutils [req-3943906a-ecaf-4f52-81bc-199c80eacee3 req-6871f378-480d-4800-9482-50e7a4a63b2a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:31 np0005466012 nova_compute[192063]: 2025-10-02 12:38:31.012 2 DEBUG nova.compute.manager [req-3943906a-ecaf-4f52-81bc-199c80eacee3 req-6871f378-480d-4800-9482-50e7a4a63b2a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] No waiting events found dispatching network-vif-unplugged-440edd3a-154e-4e66-9b54-0719e50ef207 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:38:31 np0005466012 nova_compute[192063]: 2025-10-02 12:38:31.013 2 DEBUG nova.compute.manager [req-3943906a-ecaf-4f52-81bc-199c80eacee3 req-6871f378-480d-4800-9482-50e7a4a63b2a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Received event network-vif-unplugged-440edd3a-154e-4e66-9b54-0719e50ef207 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:38:31 np0005466012 nova_compute[192063]: 2025-10-02 12:38:31.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:31.036 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[79348cb5-d849-43c9-a764-0317a50329ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:31.037 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f6186883-bc49-4668-adc0-7901a6330983]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:31.051 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d3fc08-6926-4883-90aa-c388865ebb2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 667856, 'reachable_time': 44423, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248462, 'error': None, 'target': 'ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:31.053 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:38:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:31.053 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[270ac481-e33b-4de1-90a4-9c481c8b7bcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:31 np0005466012 systemd[1]: run-netns-ovnmeta\x2d1d7388dd\x2dd8ef\x2d404d\x2d8bb8\x2d6f3d3ab763b6.mount: Deactivated successfully.
Oct  2 08:38:31 np0005466012 nova_compute[192063]: 2025-10-02 12:38:31.068 2 INFO nova.compute.manager [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:38:31 np0005466012 nova_compute[192063]: 2025-10-02 12:38:31.069 2 DEBUG oslo.service.loopingcall [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:38:31 np0005466012 nova_compute[192063]: 2025-10-02 12:38:31.069 2 DEBUG nova.compute.manager [-] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:38:31 np0005466012 nova_compute[192063]: 2025-10-02 12:38:31.069 2 DEBUG nova.network.neutron [-] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:38:32 np0005466012 nova_compute[192063]: 2025-10-02 12:38:32.724 2 DEBUG nova.network.neutron [-] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:32 np0005466012 nova_compute[192063]: 2025-10-02 12:38:32.741 2 INFO nova.compute.manager [-] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Took 1.67 seconds to deallocate network for instance.#033[00m
Oct  2 08:38:32 np0005466012 nova_compute[192063]: 2025-10-02 12:38:32.788 2 DEBUG nova.compute.manager [req-4cf52de2-31ff-4018-a80c-3676bbe20d7a req-e42b7ea1-5181-486a-a8ee-8702523cda34 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Received event network-vif-deleted-440edd3a-154e-4e66-9b54-0719e50ef207 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:32 np0005466012 nova_compute[192063]: 2025-10-02 12:38:32.827 2 DEBUG oslo_concurrency.lockutils [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:32 np0005466012 nova_compute[192063]: 2025-10-02 12:38:32.827 2 DEBUG oslo_concurrency.lockutils [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:32 np0005466012 nova_compute[192063]: 2025-10-02 12:38:32.914 2 DEBUG nova.compute.provider_tree [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:38:32 np0005466012 nova_compute[192063]: 2025-10-02 12:38:32.929 2 DEBUG nova.scheduler.client.report [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:38:32 np0005466012 nova_compute[192063]: 2025-10-02 12:38:32.949 2 DEBUG oslo_concurrency.lockutils [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:32 np0005466012 nova_compute[192063]: 2025-10-02 12:38:32.969 2 INFO nova.scheduler.client.report [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Deleted allocations for instance 691219f9-8828-4b24-b920-b55514d93e3a#033[00m
Oct  2 08:38:33 np0005466012 nova_compute[192063]: 2025-10-02 12:38:33.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:33 np0005466012 nova_compute[192063]: 2025-10-02 12:38:33.054 2 DEBUG oslo_concurrency.lockutils [None req-432a9cfa-0b28-4b6c-84f9-84d790ecbb41 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.435s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:33 np0005466012 nova_compute[192063]: 2025-10-02 12:38:33.117 2 DEBUG nova.compute.manager [req-0978041e-1ca6-4025-be16-d24c54093a7b req-3ccc62da-32b0-40c0-8a13-5c34a6117a85 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Received event network-vif-plugged-440edd3a-154e-4e66-9b54-0719e50ef207 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:33 np0005466012 nova_compute[192063]: 2025-10-02 12:38:33.117 2 DEBUG oslo_concurrency.lockutils [req-0978041e-1ca6-4025-be16-d24c54093a7b req-3ccc62da-32b0-40c0-8a13-5c34a6117a85 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "691219f9-8828-4b24-b920-b55514d93e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:33 np0005466012 nova_compute[192063]: 2025-10-02 12:38:33.118 2 DEBUG oslo_concurrency.lockutils [req-0978041e-1ca6-4025-be16-d24c54093a7b req-3ccc62da-32b0-40c0-8a13-5c34a6117a85 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:33 np0005466012 nova_compute[192063]: 2025-10-02 12:38:33.118 2 DEBUG oslo_concurrency.lockutils [req-0978041e-1ca6-4025-be16-d24c54093a7b req-3ccc62da-32b0-40c0-8a13-5c34a6117a85 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "691219f9-8828-4b24-b920-b55514d93e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:33 np0005466012 nova_compute[192063]: 2025-10-02 12:38:33.118 2 DEBUG nova.compute.manager [req-0978041e-1ca6-4025-be16-d24c54093a7b req-3ccc62da-32b0-40c0-8a13-5c34a6117a85 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] No waiting events found dispatching network-vif-plugged-440edd3a-154e-4e66-9b54-0719e50ef207 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:38:33 np0005466012 nova_compute[192063]: 2025-10-02 12:38:33.119 2 WARNING nova.compute.manager [req-0978041e-1ca6-4025-be16-d24c54093a7b req-3ccc62da-32b0-40c0-8a13-5c34a6117a85 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Received unexpected event network-vif-plugged-440edd3a-154e-4e66-9b54-0719e50ef207 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:38:33 np0005466012 nova_compute[192063]: 2025-10-02 12:38:33.149 2 DEBUG nova.network.neutron [req-9ad7b486-9a10-4e68-8d6d-e996f3a74bc2 req-0890b6e3-33e4-4a01-a0f9-f68068e6f870 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Updated VIF entry in instance network info cache for port 440edd3a-154e-4e66-9b54-0719e50ef207. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:38:33 np0005466012 nova_compute[192063]: 2025-10-02 12:38:33.150 2 DEBUG nova.network.neutron [req-9ad7b486-9a10-4e68-8d6d-e996f3a74bc2 req-0890b6e3-33e4-4a01-a0f9-f68068e6f870 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Updating instance_info_cache with network_info: [{"id": "440edd3a-154e-4e66-9b54-0719e50ef207", "address": "fa:16:3e:14:98:62", "network": {"id": "1d7388dd-d8ef-404d-8bb8-6f3d3ab763b6", "bridge": "br-int", "label": "tempest-network-smoke--1033345898", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9862", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap440edd3a-15", "ovs_interfaceid": "440edd3a-154e-4e66-9b54-0719e50ef207", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:38:33 np0005466012 nova_compute[192063]: 2025-10-02 12:38:33.179 2 DEBUG oslo_concurrency.lockutils [req-9ad7b486-9a10-4e68-8d6d-e996f3a74bc2 req-0890b6e3-33e4-4a01-a0f9-f68068e6f870 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-691219f9-8828-4b24-b920-b55514d93e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:38:34 np0005466012 podman[248463]: 2025-10-02 12:38:34.190318787 +0000 UTC m=+0.092500284 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:38:34 np0005466012 podman[248464]: 2025-10-02 12:38:34.191447458 +0000 UTC m=+0.094857849 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible)
Oct  2 08:38:34 np0005466012 nova_compute[192063]: 2025-10-02 12:38:34.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:34 np0005466012 nova_compute[192063]: 2025-10-02 12:38:34.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:35 np0005466012 nova_compute[192063]: 2025-10-02 12:38:35.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:35 np0005466012 nova_compute[192063]: 2025-10-02 12:38:35.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:38 np0005466012 nova_compute[192063]: 2025-10-02 12:38:38.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:38 np0005466012 podman[248503]: 2025-10-02 12:38:38.158083645 +0000 UTC m=+0.072832927 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:38:38 np0005466012 podman[248502]: 2025-10-02 12:38:38.184653695 +0000 UTC m=+0.092419743 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:38:39 np0005466012 nova_compute[192063]: 2025-10-02 12:38:39.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:40 np0005466012 nova_compute[192063]: 2025-10-02 12:38:40.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:40 np0005466012 nova_compute[192063]: 2025-10-02 12:38:40.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:42 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:42Z|00681|binding|INFO|Releasing lport ad4e7082-9510-41a9-bc81-de2c66402e98 from this chassis (sb_readonly=0)
Oct  2 08:38:42 np0005466012 nova_compute[192063]: 2025-10-02 12:38:42.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:43 np0005466012 nova_compute[192063]: 2025-10-02 12:38:43.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:43 np0005466012 nova_compute[192063]: 2025-10-02 12:38:43.824 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:43 np0005466012 nova_compute[192063]: 2025-10-02 12:38:43.886 2 DEBUG nova.compute.manager [req-7df531c6-28c2-404d-9316-2bf799b153e5 req-f06a7a5d-cca4-46d4-8907-71c57d8d55e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Received event network-changed-d184278c-aebf-44ae-b916-815cb5979416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:38:43 np0005466012 nova_compute[192063]: 2025-10-02 12:38:43.887 2 DEBUG nova.compute.manager [req-7df531c6-28c2-404d-9316-2bf799b153e5 req-f06a7a5d-cca4-46d4-8907-71c57d8d55e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Refreshing instance network info cache due to event network-changed-d184278c-aebf-44ae-b916-815cb5979416. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:38:43 np0005466012 nova_compute[192063]: 2025-10-02 12:38:43.887 2 DEBUG oslo_concurrency.lockutils [req-7df531c6-28c2-404d-9316-2bf799b153e5 req-f06a7a5d-cca4-46d4-8907-71c57d8d55e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:38:43 np0005466012 nova_compute[192063]: 2025-10-02 12:38:43.888 2 DEBUG oslo_concurrency.lockutils [req-7df531c6-28c2-404d-9316-2bf799b153e5 req-f06a7a5d-cca4-46d4-8907-71c57d8d55e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:38:43 np0005466012 nova_compute[192063]: 2025-10-02 12:38:43.888 2 DEBUG nova.network.neutron [req-7df531c6-28c2-404d-9316-2bf799b153e5 req-f06a7a5d-cca4-46d4-8907-71c57d8d55e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Refreshing network info cache for port d184278c-aebf-44ae-b916-815cb5979416 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:38:43 np0005466012 nova_compute[192063]: 2025-10-02 12:38:43.989 2 DEBUG oslo_concurrency.lockutils [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Acquiring lock "a489cbb2-1400-41b4-9345-18186b74b900" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:43 np0005466012 nova_compute[192063]: 2025-10-02 12:38:43.989 2 DEBUG oslo_concurrency.lockutils [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:43 np0005466012 nova_compute[192063]: 2025-10-02 12:38:43.990 2 DEBUG oslo_concurrency.lockutils [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Acquiring lock "a489cbb2-1400-41b4-9345-18186b74b900-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:38:43 np0005466012 nova_compute[192063]: 2025-10-02 12:38:43.990 2 DEBUG oslo_concurrency.lockutils [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:38:43 np0005466012 nova_compute[192063]: 2025-10-02 12:38:43.990 2 DEBUG oslo_concurrency.lockutils [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.005 2 INFO nova.compute.manager [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Terminating instance#033[00m
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.015 2 DEBUG nova.compute.manager [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:38:44 np0005466012 kernel: tapd184278c-ae (unregistering): left promiscuous mode
Oct  2 08:38:44 np0005466012 NetworkManager[51207]: <info>  [1759408724.0491] device (tapd184278c-ae): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:38:44 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:44Z|00682|binding|INFO|Releasing lport d184278c-aebf-44ae-b916-815cb5979416 from this chassis (sb_readonly=0)
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:44 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:44Z|00683|binding|INFO|Setting lport d184278c-aebf-44ae-b916-815cb5979416 down in Southbound
Oct  2 08:38:44 np0005466012 ovn_controller[94284]: 2025-10-02T12:38:44Z|00684|binding|INFO|Removing iface tapd184278c-ae ovn-installed in OVS
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.068 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:94:15 10.100.0.8'], port_security=['fa:16:3e:15:94:15 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a489cbb2-1400-41b4-9345-18186b74b900', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-95da58c1-265e-4dd9-ba00-692853005e46', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7d693b90ba447196796435b74590f6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '35c2ff63-16f8-4b9e-8320-2301129fdf30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2037b9ce-d2e9-4c7b-b130-56e2abc95360, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=d184278c-aebf-44ae-b916-815cb5979416) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.069 103246 INFO neutron.agent.ovn.metadata.agent [-] Port d184278c-aebf-44ae-b916-815cb5979416 in datapath 95da58c1-265e-4dd9-ba00-692853005e46 unbound from our chassis#033[00m
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.070 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 95da58c1-265e-4dd9-ba00-692853005e46, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.071 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff283f9-b40a-4961-807a-63a8959bbced]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.072 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46 namespace which is not needed anymore#033[00m
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:44 np0005466012 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a1.scope: Deactivated successfully.
Oct  2 08:38:44 np0005466012 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a1.scope: Consumed 16.791s CPU time.
Oct  2 08:38:44 np0005466012 systemd-machined[152114]: Machine qemu-75-instance-000000a1 terminated.
Oct  2 08:38:44 np0005466012 neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46[247778]: [NOTICE]   (247803) : haproxy version is 2.8.14-c23fe91
Oct  2 08:38:44 np0005466012 neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46[247778]: [NOTICE]   (247803) : path to executable is /usr/sbin/haproxy
Oct  2 08:38:44 np0005466012 neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46[247778]: [WARNING]  (247803) : Exiting Master process...
Oct  2 08:38:44 np0005466012 neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46[247778]: [ALERT]    (247803) : Current worker (247808) exited with code 143 (Terminated)
Oct  2 08:38:44 np0005466012 neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46[247778]: [WARNING]  (247803) : All workers exited. Exiting... (0)
Oct  2 08:38:44 np0005466012 systemd[1]: libpod-8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6.scope: Deactivated successfully.
Oct  2 08:38:44 np0005466012 conmon[247778]: conmon 8c60d0375388c85bd972 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6.scope/container/memory.events
Oct  2 08:38:44 np0005466012 podman[248572]: 2025-10-02 12:38:44.203124644 +0000 UTC m=+0.040202239 container died 8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:38:44 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6-userdata-shm.mount: Deactivated successfully.
Oct  2 08:38:44 np0005466012 systemd[1]: var-lib-containers-storage-overlay-6ed19f5596101e6a98c738ba523b6cc8d422e6f63c34dc0d1278cd76ced56b95-merged.mount: Deactivated successfully.
Oct  2 08:38:44 np0005466012 podman[248572]: 2025-10-02 12:38:44.234146087 +0000 UTC m=+0.071223682 container cleanup 8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:38:44 np0005466012 systemd[1]: libpod-conmon-8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6.scope: Deactivated successfully.
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.284 2 INFO nova.virt.libvirt.driver [-] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Instance destroyed successfully.#033[00m
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.284 2 DEBUG nova.objects.instance [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lazy-loading 'resources' on Instance uuid a489cbb2-1400-41b4-9345-18186b74b900 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:38:44 np0005466012 podman[248605]: 2025-10-02 12:38:44.29606707 +0000 UTC m=+0.039862450 container remove 8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.302 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[00849f93-eeb1-4fde-aee5-a0756a8369d3]: (4, ('Thu Oct  2 12:38:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46 (8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6)\n8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6\nThu Oct  2 12:38:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46 (8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6)\n8c60d0375388c85bd972d4c72a7e875a5432ff89645ebabc5594499c03b0cbd6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.303 2 DEBUG nova.virt.libvirt.vif [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1471171623',display_name='tempest-TestSnapshotPattern-server-1471171623',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1471171623',id=161,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzJGGGUE+Xks9+aY5SjFk2n2DGAnXfOBhkbeNeuAVWQ/dQZsUYNFa4aU04DL6V5Ahv7YBoVwhzJt5xloq0NtgboR41kXTeWdHADR0n2ucoHL3yxU4d4gs2dS5flZPM85w==',key_name='tempest-TestSnapshotPattern-331136498',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:37:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f7d693b90ba447196796435b74590f6',ramdisk_id='',reservation_id='r-9t63tjfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSnapshotPattern-1950942920',owner_user_name='tempest-TestSnapshotPattern-1950942920-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:37:46Z,user_data=None,user_id='6d07868c23de4edc9018d8964b43d954',uuid=a489cbb2-1400-41b4-9345-18186b74b900,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d184278c-aebf-44ae-b916-815cb5979416", "address": "fa:16:3e:15:94:15", "network": {"id": "95da58c1-265e-4dd9-ba00-692853005e46", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-603762842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7d693b90ba447196796435b74590f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd184278c-ae", "ovs_interfaceid": "d184278c-aebf-44ae-b916-815cb5979416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.304 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee870c2-260c-49af-a459-7c3bd41bd30f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.304 2 DEBUG nova.network.os_vif_util [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Converting VIF {"id": "d184278c-aebf-44ae-b916-815cb5979416", "address": "fa:16:3e:15:94:15", "network": {"id": "95da58c1-265e-4dd9-ba00-692853005e46", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-603762842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7d693b90ba447196796435b74590f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd184278c-ae", "ovs_interfaceid": "d184278c-aebf-44ae-b916-815cb5979416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.305 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap95da58c1-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.305 2 DEBUG nova.network.os_vif_util [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:94:15,bridge_name='br-int',has_traffic_filtering=True,id=d184278c-aebf-44ae-b916-815cb5979416,network=Network(95da58c1-265e-4dd9-ba00-692853005e46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd184278c-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.306 2 DEBUG os_vif [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:94:15,bridge_name='br-int',has_traffic_filtering=True,id=d184278c-aebf-44ae-b916-815cb5979416,network=Network(95da58c1-265e-4dd9-ba00-692853005e46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd184278c-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:38:44 np0005466012 kernel: tap95da58c1-20: left promiscuous mode
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.309 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd184278c-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.325 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6c925494-6974-413c-ad4c-874374eed8c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.327 2 INFO os_vif [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:94:15,bridge_name='br-int',has_traffic_filtering=True,id=d184278c-aebf-44ae-b916-815cb5979416,network=Network(95da58c1-265e-4dd9-ba00-692853005e46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd184278c-ae')
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.328 2 INFO nova.virt.libvirt.driver [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Deleting instance files /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900_del
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.329 2 INFO nova.virt.libvirt.driver [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Deletion of /var/lib/nova/instances/a489cbb2-1400-41b4-9345-18186b74b900_del complete
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.355 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d94eedc6-5d25-4d6f-b51f-83283939a786]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.357 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[77b5b64f-f10f-4195-b6bf-3938e3b7efcc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.372 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[709bdf5a-549f-4b1e-9034-c0c1b81a30f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 663363, 'reachable_time': 32866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248632, 'error': None, 'target': 'ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:44 np0005466012 systemd[1]: run-netns-ovnmeta\x2d95da58c1\x2d265e\x2d4dd9\x2dba00\x2d692853005e46.mount: Deactivated successfully.
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.377 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-95da58c1-265e-4dd9-ba00-692853005e46 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.377 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[6a6a0702-d3a9-44cf-b0b5-653298a24768]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.410 2 INFO nova.compute.manager [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Took 0.39 seconds to destroy the instance on the hypervisor.
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.410 2 DEBUG oslo.service.loopingcall [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.411 2 DEBUG nova.compute.manager [-] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.411 2 DEBUG nova.network.neutron [-] [instance: a489cbb2-1400-41b4-9345-18186b74b900] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.547 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:38:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:44.548 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:44 np0005466012 nova_compute[192063]: 2025-10-02 12:38:44.992 2 DEBUG nova.network.neutron [-] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.011 2 INFO nova.compute.manager [-] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Took 0.60 seconds to deallocate network for instance.
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.073 2 DEBUG nova.compute.manager [req-c6789816-2d5f-446a-853d-0edb8d432904 req-42c9fcb8-c0d9-4761-a647-23f54ba96947 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Received event network-vif-deleted-d184278c-aebf-44ae-b916-815cb5979416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.080 2 DEBUG oslo_concurrency.lockutils [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.080 2 DEBUG oslo_concurrency.lockutils [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.150 2 DEBUG nova.compute.provider_tree [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.160 2 DEBUG nova.network.neutron [req-7df531c6-28c2-404d-9316-2bf799b153e5 req-f06a7a5d-cca4-46d4-8907-71c57d8d55e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Updated VIF entry in instance network info cache for port d184278c-aebf-44ae-b916-815cb5979416. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.160 2 DEBUG nova.network.neutron [req-7df531c6-28c2-404d-9316-2bf799b153e5 req-f06a7a5d-cca4-46d4-8907-71c57d8d55e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Updating instance_info_cache with network_info: [{"id": "d184278c-aebf-44ae-b916-815cb5979416", "address": "fa:16:3e:15:94:15", "network": {"id": "95da58c1-265e-4dd9-ba00-692853005e46", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-603762842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7d693b90ba447196796435b74590f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd184278c-ae", "ovs_interfaceid": "d184278c-aebf-44ae-b916-815cb5979416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.162 2 DEBUG nova.scheduler.client.report [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.179 2 DEBUG oslo_concurrency.lockutils [req-7df531c6-28c2-404d-9316-2bf799b153e5 req-f06a7a5d-cca4-46d4-8907-71c57d8d55e9 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-a489cbb2-1400-41b4-9345-18186b74b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.181 2 DEBUG oslo_concurrency.lockutils [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.200 2 INFO nova.scheduler.client.report [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Deleted allocations for instance a489cbb2-1400-41b4-9345-18186b74b900
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.274 2 DEBUG oslo_concurrency.lockutils [None req-197fad4a-4333-426d-9afc-91eccd285152 6d07868c23de4edc9018d8964b43d954 8f7d693b90ba447196796435b74590f6 - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.848 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.849 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.849 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.849 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.911 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408710.9105523, 691219f9-8828-4b24-b920-b55514d93e3a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.911 2 INFO nova.compute.manager [-] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] VM Stopped (Lifecycle Event)
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.940 2 DEBUG nova.compute.manager [None req-f381ff53-59eb-4fa5-98f5-1366f63b96ae - - - - - -] [instance: 691219f9-8828-4b24-b920-b55514d93e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.972 2 DEBUG nova.compute.manager [req-c0983450-716f-4073-8a06-4e0a8485336f req-8e612021-c1f9-49c8-bffb-cc76bce919a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Received event network-vif-unplugged-d184278c-aebf-44ae-b916-815cb5979416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.973 2 DEBUG oslo_concurrency.lockutils [req-c0983450-716f-4073-8a06-4e0a8485336f req-8e612021-c1f9-49c8-bffb-cc76bce919a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a489cbb2-1400-41b4-9345-18186b74b900-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.973 2 DEBUG oslo_concurrency.lockutils [req-c0983450-716f-4073-8a06-4e0a8485336f req-8e612021-c1f9-49c8-bffb-cc76bce919a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.973 2 DEBUG oslo_concurrency.lockutils [req-c0983450-716f-4073-8a06-4e0a8485336f req-8e612021-c1f9-49c8-bffb-cc76bce919a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.973 2 DEBUG nova.compute.manager [req-c0983450-716f-4073-8a06-4e0a8485336f req-8e612021-c1f9-49c8-bffb-cc76bce919a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] No waiting events found dispatching network-vif-unplugged-d184278c-aebf-44ae-b916-815cb5979416 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.974 2 WARNING nova.compute.manager [req-c0983450-716f-4073-8a06-4e0a8485336f req-8e612021-c1f9-49c8-bffb-cc76bce919a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Received unexpected event network-vif-unplugged-d184278c-aebf-44ae-b916-815cb5979416 for instance with vm_state deleted and task_state None.
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.974 2 DEBUG nova.compute.manager [req-c0983450-716f-4073-8a06-4e0a8485336f req-8e612021-c1f9-49c8-bffb-cc76bce919a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Received event network-vif-plugged-d184278c-aebf-44ae-b916-815cb5979416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.974 2 DEBUG oslo_concurrency.lockutils [req-c0983450-716f-4073-8a06-4e0a8485336f req-8e612021-c1f9-49c8-bffb-cc76bce919a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "a489cbb2-1400-41b4-9345-18186b74b900-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.974 2 DEBUG oslo_concurrency.lockutils [req-c0983450-716f-4073-8a06-4e0a8485336f req-8e612021-c1f9-49c8-bffb-cc76bce919a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.974 2 DEBUG oslo_concurrency.lockutils [req-c0983450-716f-4073-8a06-4e0a8485336f req-8e612021-c1f9-49c8-bffb-cc76bce919a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "a489cbb2-1400-41b4-9345-18186b74b900-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.975 2 DEBUG nova.compute.manager [req-c0983450-716f-4073-8a06-4e0a8485336f req-8e612021-c1f9-49c8-bffb-cc76bce919a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] No waiting events found dispatching network-vif-plugged-d184278c-aebf-44ae-b916-815cb5979416 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:38:45 np0005466012 nova_compute[192063]: 2025-10-02 12:38:45.975 2 WARNING nova.compute.manager [req-c0983450-716f-4073-8a06-4e0a8485336f req-8e612021-c1f9-49c8-bffb-cc76bce919a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Received unexpected event network-vif-plugged-d184278c-aebf-44ae-b916-815cb5979416 for instance with vm_state deleted and task_state None.
Oct  2 08:38:46 np0005466012 nova_compute[192063]: 2025-10-02 12:38:46.002 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 08:38:46 np0005466012 nova_compute[192063]: 2025-10-02 12:38:46.003 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5726MB free_disk=73.24264907836914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  2 08:38:46 np0005466012 nova_compute[192063]: 2025-10-02 12:38:46.003 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:38:46 np0005466012 nova_compute[192063]: 2025-10-02 12:38:46.004 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:38:46 np0005466012 nova_compute[192063]: 2025-10-02 12:38:46.047 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 08:38:46 np0005466012 nova_compute[192063]: 2025-10-02 12:38:46.047 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:38:46 np0005466012 nova_compute[192063]: 2025-10-02 12:38:46.064 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:38:46 np0005466012 nova_compute[192063]: 2025-10-02 12:38:46.075 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:38:46 np0005466012 nova_compute[192063]: 2025-10-02 12:38:46.091 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:38:46 np0005466012 nova_compute[192063]: 2025-10-02 12:38:46.091 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:38:47 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:47.551 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:38:48 np0005466012 nova_compute[192063]: 2025-10-02 12:38:48.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:38:48 np0005466012 nova_compute[192063]: 2025-10-02 12:38:48.093 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:38:48 np0005466012 nova_compute[192063]: 2025-10-02 12:38:48.093 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:38:48 np0005466012 nova_compute[192063]: 2025-10-02 12:38:48.093 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 08:38:48 np0005466012 nova_compute[192063]: 2025-10-02 12:38:48.107 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  2 08:38:48 np0005466012 nova_compute[192063]: 2025-10-02 12:38:48.825 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:48 np0005466012 nova_compute[192063]: 2025-10-02 12:38:48.826 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:38:49 np0005466012 nova_compute[192063]: 2025-10-02 12:38:49.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:49 np0005466012 nova_compute[192063]: 2025-10-02 12:38:49.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:50 np0005466012 nova_compute[192063]: 2025-10-02 12:38:50.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:50 np0005466012 nova_compute[192063]: 2025-10-02 12:38:50.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:50 np0005466012 nova_compute[192063]: 2025-10-02 12:38:50.819 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:38:52 np0005466012 podman[248641]: 2025-10-02 12:38:52.157742903 +0000 UTC m=+0.058599581 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:38:52 np0005466012 podman[248642]: 2025-10-02 12:38:52.157887077 +0000 UTC m=+0.061292766 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:38:52 np0005466012 podman[248640]: 2025-10-02 12:38:52.160424808 +0000 UTC m=+0.068490367 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:38:52 np0005466012 podman[248643]: 2025-10-02 12:38:52.196592954 +0000 UTC m=+0.100013504 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:38:53 np0005466012 nova_compute[192063]: 2025-10-02 12:38:53.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:54 np0005466012 nova_compute[192063]: 2025-10-02 12:38:54.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:55.254 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:a1:21 10.100.0.2 2001:db8::f816:3eff:fe2b:a121'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2b:a121/64', 'neutron:device_id': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fce3bea-36c3-4b1e-bdee-b694cf8990ad, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ad07d234-3bc8-429a-8834-7a9ae3274be2) old=Port_Binding(mac=['fa:16:3e:2b:a1:21 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:38:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:55.256 103246 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ad07d234-3bc8-429a-8834-7a9ae3274be2 in datapath b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 updated#033[00m
Oct  2 08:38:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:55.256 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:38:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:55.258 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2d26ac55-2420-4811-92f4-74c739f57bbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:38:58 np0005466012 nova_compute[192063]: 2025-10-02 12:38:58.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:59 np0005466012 nova_compute[192063]: 2025-10-02 12:38:59.277 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408724.2751994, a489cbb2-1400-41b4-9345-18186b74b900 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:38:59 np0005466012 nova_compute[192063]: 2025-10-02 12:38:59.277 2 INFO nova.compute.manager [-] [instance: a489cbb2-1400-41b4-9345-18186b74b900] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:38:59 np0005466012 nova_compute[192063]: 2025-10-02 12:38:59.303 2 DEBUG nova.compute.manager [None req-6ae50caa-0583-42ae-80bd-7909bcc3a641 - - - - - -] [instance: a489cbb2-1400-41b4-9345-18186b74b900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:38:59 np0005466012 nova_compute[192063]: 2025-10-02 12:38:59.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:38:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:59.574 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:a1:21 10.100.0.2 2001:db8:0:1:f816:3eff:fe2b:a121 2001:db8::f816:3eff:fe2b:a121'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe2b:a121/64 2001:db8::f816:3eff:fe2b:a121/64', 'neutron:device_id': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fce3bea-36c3-4b1e-bdee-b694cf8990ad, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ad07d234-3bc8-429a-8834-7a9ae3274be2) old=Port_Binding(mac=['fa:16:3e:2b:a1:21 10.100.0.2 2001:db8::f816:3eff:fe2b:a121'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2b:a121/64', 'neutron:device_id': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:38:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:59.576 103246 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ad07d234-3bc8-429a-8834-7a9ae3274be2 in datapath b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 updated#033[00m
Oct  2 08:38:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:59.577 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:38:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:38:59.578 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f572dcb3-0ea3-42fc-b04e-7bc6d9745c81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:02.154 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:02.154 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:02.155 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:03 np0005466012 nova_compute[192063]: 2025-10-02 12:39:03.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:04 np0005466012 nova_compute[192063]: 2025-10-02 12:39:04.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:05 np0005466012 podman[248731]: 2025-10-02 12:39:05.15629641 +0000 UTC m=+0.070675007 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., distribution-scope=public)
Oct  2 08:39:05 np0005466012 podman[248730]: 2025-10-02 12:39:05.172574523 +0000 UTC m=+0.086646411 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.255 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "78166c3e-137c-497b-bb30-a8825a6d4968" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.256 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.277 2 DEBUG nova.compute.manager [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.402 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.403 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.410 2 DEBUG nova.virt.hardware [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.411 2 INFO nova.compute.claims [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.541 2 DEBUG nova.compute.provider_tree [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.564 2 DEBUG nova.scheduler.client.report [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.595 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.596 2 DEBUG nova.compute.manager [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.704 2 DEBUG nova.compute.manager [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.705 2 DEBUG nova.network.neutron [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.739 2 INFO nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.882 2 DEBUG nova.compute.manager [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.897 2 DEBUG nova.policy [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.989 2 DEBUG nova.compute.manager [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.990 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.990 2 INFO nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Creating image(s)#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.991 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "/var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.991 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:05 np0005466012 nova_compute[192063]: 2025-10-02 12:39:05.992 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "/var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.003 2 DEBUG oslo_concurrency.processutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.100 2 DEBUG oslo_concurrency.processutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.101 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.101 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.114 2 DEBUG oslo_concurrency.processutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.169 2 DEBUG oslo_concurrency.processutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.170 2 DEBUG oslo_concurrency.processutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.226 2 DEBUG oslo_concurrency.processutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.228 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.229 2 DEBUG oslo_concurrency.processutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.315 2 DEBUG oslo_concurrency.processutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.317 2 DEBUG nova.virt.disk.api [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.319 2 DEBUG oslo_concurrency.processutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.375 2 DEBUG oslo_concurrency.processutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.377 2 DEBUG nova.virt.disk.api [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Cannot resize image /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.378 2 DEBUG nova.objects.instance [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid 78166c3e-137c-497b-bb30-a8825a6d4968 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.392 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.392 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Ensure instance console log exists: /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.393 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.393 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:06 np0005466012 nova_compute[192063]: 2025-10-02 12:39:06.393 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:07 np0005466012 nova_compute[192063]: 2025-10-02 12:39:07.242 2 DEBUG nova.network.neutron [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Successfully created port: fe50e34b-11c9-412a-bcca-f81589b7c561 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:39:08 np0005466012 nova_compute[192063]: 2025-10-02 12:39:08.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:08 np0005466012 nova_compute[192063]: 2025-10-02 12:39:08.052 2 DEBUG nova.network.neutron [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Successfully updated port: fe50e34b-11c9-412a-bcca-f81589b7c561 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:39:08 np0005466012 nova_compute[192063]: 2025-10-02 12:39:08.067 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:39:08 np0005466012 nova_compute[192063]: 2025-10-02 12:39:08.067 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquired lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:39:08 np0005466012 nova_compute[192063]: 2025-10-02 12:39:08.067 2 DEBUG nova.network.neutron [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:39:08 np0005466012 nova_compute[192063]: 2025-10-02 12:39:08.158 2 DEBUG nova.compute.manager [req-6b9ffe1b-a061-4747-927b-b1f0b23d44b1 req-e015e876-bd23-481d-9595-6464984e7f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Received event network-changed-fe50e34b-11c9-412a-bcca-f81589b7c561 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:08 np0005466012 nova_compute[192063]: 2025-10-02 12:39:08.158 2 DEBUG nova.compute.manager [req-6b9ffe1b-a061-4747-927b-b1f0b23d44b1 req-e015e876-bd23-481d-9595-6464984e7f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Refreshing instance network info cache due to event network-changed-fe50e34b-11c9-412a-bcca-f81589b7c561. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:39:08 np0005466012 nova_compute[192063]: 2025-10-02 12:39:08.159 2 DEBUG oslo_concurrency.lockutils [req-6b9ffe1b-a061-4747-927b-b1f0b23d44b1 req-e015e876-bd23-481d-9595-6464984e7f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:39:08 np0005466012 nova_compute[192063]: 2025-10-02 12:39:08.245 2 DEBUG nova.network.neutron [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:39:09 np0005466012 podman[248788]: 2025-10-02 12:39:09.141857835 +0000 UTC m=+0.054711814 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:39:09 np0005466012 podman[248787]: 2025-10-02 12:39:09.141804253 +0000 UTC m=+0.055061363 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, config_id=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.696 2 DEBUG nova.network.neutron [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Updating instance_info_cache with network_info: [{"id": "fe50e34b-11c9-412a-bcca-f81589b7c561", "address": "fa:16:3e:fe:ee:8e", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe50e34b-11", "ovs_interfaceid": "fe50e34b-11c9-412a-bcca-f81589b7c561", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.719 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Releasing lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.720 2 DEBUG nova.compute.manager [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Instance network_info: |[{"id": "fe50e34b-11c9-412a-bcca-f81589b7c561", "address": "fa:16:3e:fe:ee:8e", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe50e34b-11", "ovs_interfaceid": "fe50e34b-11c9-412a-bcca-f81589b7c561", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.720 2 DEBUG oslo_concurrency.lockutils [req-6b9ffe1b-a061-4747-927b-b1f0b23d44b1 req-e015e876-bd23-481d-9595-6464984e7f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.721 2 DEBUG nova.network.neutron [req-6b9ffe1b-a061-4747-927b-b1f0b23d44b1 req-e015e876-bd23-481d-9595-6464984e7f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Refreshing network info cache for port fe50e34b-11c9-412a-bcca-f81589b7c561 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.724 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Start _get_guest_xml network_info=[{"id": "fe50e34b-11c9-412a-bcca-f81589b7c561", "address": "fa:16:3e:fe:ee:8e", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe50e34b-11", "ovs_interfaceid": "fe50e34b-11c9-412a-bcca-f81589b7c561", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.728 2 WARNING nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.733 2 DEBUG nova.virt.libvirt.host [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.734 2 DEBUG nova.virt.libvirt.host [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.737 2 DEBUG nova.virt.libvirt.host [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.738 2 DEBUG nova.virt.libvirt.host [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.739 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.739 2 DEBUG nova.virt.hardware [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.740 2 DEBUG nova.virt.hardware [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.740 2 DEBUG nova.virt.hardware [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.740 2 DEBUG nova.virt.hardware [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.741 2 DEBUG nova.virt.hardware [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.741 2 DEBUG nova.virt.hardware [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.741 2 DEBUG nova.virt.hardware [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.741 2 DEBUG nova.virt.hardware [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.742 2 DEBUG nova.virt.hardware [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.742 2 DEBUG nova.virt.hardware [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.742 2 DEBUG nova.virt.hardware [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.746 2 DEBUG nova.virt.libvirt.vif [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:39:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1308610491',display_name='tempest-TestGettingAddress-server-1308610491',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1308610491',id=167,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIRceitbRKs6XXbf7kadsFogQdqZxrZp17i/4hhUcf2GwYzZeOwXhP4JbD0KEwVqLcpeLnMlwZ2D2mCuHBJR8KfWga2sLnw7wFx6+c4GjZ3JqHI31K0krXjHmfSh85q/Fw==',key_name='tempest-TestGettingAddress-568346474',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-zz6ab3lb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:39:05Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=78166c3e-137c-497b-bb30-a8825a6d4968,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe50e34b-11c9-412a-bcca-f81589b7c561", "address": "fa:16:3e:fe:ee:8e", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe50e34b-11", "ovs_interfaceid": "fe50e34b-11c9-412a-bcca-f81589b7c561", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.746 2 DEBUG nova.network.os_vif_util [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "fe50e34b-11c9-412a-bcca-f81589b7c561", "address": "fa:16:3e:fe:ee:8e", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe50e34b-11", "ovs_interfaceid": "fe50e34b-11c9-412a-bcca-f81589b7c561", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.747 2 DEBUG nova.network.os_vif_util [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:ee:8e,bridge_name='br-int',has_traffic_filtering=True,id=fe50e34b-11c9-412a-bcca-f81589b7c561,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe50e34b-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.748 2 DEBUG nova.objects.instance [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 78166c3e-137c-497b-bb30-a8825a6d4968 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.762 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  <uuid>78166c3e-137c-497b-bb30-a8825a6d4968</uuid>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  <name>instance-000000a7</name>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestGettingAddress-server-1308610491</nova:name>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:39:09</nova:creationTime>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:39:09 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:        <nova:user uuid="97ce9f1898484e0e9a1f7c84a9f0dfe3">tempest-TestGettingAddress-1355720650-project-member</nova:user>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:        <nova:project uuid="fd801958556f4c8aab047ecdef6b5ee8">tempest-TestGettingAddress-1355720650</nova:project>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:        <nova:port uuid="fe50e34b-11c9-412a-bcca-f81589b7c561">
Oct  2 08:39:09 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fefe:ee8e" ipVersion="6"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fefe:ee8e" ipVersion="6"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <entry name="serial">78166c3e-137c-497b-bb30-a8825a6d4968</entry>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <entry name="uuid">78166c3e-137c-497b-bb30-a8825a6d4968</entry>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk.config"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:fe:ee:8e"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <target dev="tapfe50e34b-11"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/console.log" append="off"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:39:09 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:39:09 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:39:09 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:39:09 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.764 2 DEBUG nova.compute.manager [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Preparing to wait for external event network-vif-plugged-fe50e34b-11c9-412a-bcca-f81589b7c561 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.764 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.764 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.765 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.765 2 DEBUG nova.virt.libvirt.vif [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:39:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1308610491',display_name='tempest-TestGettingAddress-server-1308610491',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1308610491',id=167,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIRceitbRKs6XXbf7kadsFogQdqZxrZp17i/4hhUcf2GwYzZeOwXhP4JbD0KEwVqLcpeLnMlwZ2D2mCuHBJR8KfWga2sLnw7wFx6+c4GjZ3JqHI31K0krXjHmfSh85q/Fw==',key_name='tempest-TestGettingAddress-568346474',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-zz6ab3lb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:39:05Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=78166c3e-137c-497b-bb30-a8825a6d4968,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe50e34b-11c9-412a-bcca-f81589b7c561", "address": "fa:16:3e:fe:ee:8e", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe50e34b-11", "ovs_interfaceid": "fe50e34b-11c9-412a-bcca-f81589b7c561", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.766 2 DEBUG nova.network.os_vif_util [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "fe50e34b-11c9-412a-bcca-f81589b7c561", "address": "fa:16:3e:fe:ee:8e", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe50e34b-11", "ovs_interfaceid": "fe50e34b-11c9-412a-bcca-f81589b7c561", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.766 2 DEBUG nova.network.os_vif_util [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:ee:8e,bridge_name='br-int',has_traffic_filtering=True,id=fe50e34b-11c9-412a-bcca-f81589b7c561,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe50e34b-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.766 2 DEBUG os_vif [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:ee:8e,bridge_name='br-int',has_traffic_filtering=True,id=fe50e34b-11c9-412a-bcca-f81589b7c561,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe50e34b-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.767 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.768 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.771 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe50e34b-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.771 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe50e34b-11, col_values=(('external_ids', {'iface-id': 'fe50e34b-11c9-412a-bcca-f81589b7c561', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fe:ee:8e', 'vm-uuid': '78166c3e-137c-497b-bb30-a8825a6d4968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:09 np0005466012 NetworkManager[51207]: <info>  [1759408749.7732] manager: (tapfe50e34b-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.780 2 INFO os_vif [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:ee:8e,bridge_name='br-int',has_traffic_filtering=True,id=fe50e34b-11c9-412a-bcca-f81589b7c561,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe50e34b-11')#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.826 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.827 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.827 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] No VIF found with MAC fa:16:3e:fe:ee:8e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:39:09 np0005466012 nova_compute[192063]: 2025-10-02 12:39:09.827 2 INFO nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Using config drive#033[00m
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.389 2 INFO nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Creating config drive at /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk.config#033[00m
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.398 2 DEBUG oslo_concurrency.processutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdw_prtxc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.540 2 DEBUG oslo_concurrency.processutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdw_prtxc" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:10 np0005466012 kernel: tapfe50e34b-11: entered promiscuous mode
Oct  2 08:39:10 np0005466012 NetworkManager[51207]: <info>  [1759408750.6074] manager: (tapfe50e34b-11): new Tun device (/org/freedesktop/NetworkManager/Devices/320)
Oct  2 08:39:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:39:10Z|00685|binding|INFO|Claiming lport fe50e34b-11c9-412a-bcca-f81589b7c561 for this chassis.
Oct  2 08:39:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:39:10Z|00686|binding|INFO|fe50e34b-11c9-412a-bcca-f81589b7c561: Claiming fa:16:3e:fe:ee:8e 10.100.0.9 2001:db8:0:1:f816:3eff:fefe:ee8e 2001:db8::f816:3eff:fefe:ee8e
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.629 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:ee:8e 10.100.0.9 2001:db8:0:1:f816:3eff:fefe:ee8e 2001:db8::f816:3eff:fefe:ee8e'], port_security=['fa:16:3e:fe:ee:8e 10.100.0.9 2001:db8:0:1:f816:3eff:fefe:ee8e 2001:db8::f816:3eff:fefe:ee8e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8:0:1:f816:3eff:fefe:ee8e/64 2001:db8::f816:3eff:fefe:ee8e/64', 'neutron:device_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '28c32917-1b1e-49b6-8086-1dbf8a5f4a7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fce3bea-36c3-4b1e-bdee-b694cf8990ad, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=fe50e34b-11c9-412a-bcca-f81589b7c561) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.631 103246 INFO neutron.agent.ovn.metadata.agent [-] Port fe50e34b-11c9-412a-bcca-f81589b7c561 in datapath b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 bound to our chassis#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.632 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.651 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c5db7dfa-54e7-42df-9fda-41879c54907a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.653 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb9d6d69e-01 in ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:39:10 np0005466012 systemd-machined[152114]: New machine qemu-77-instance-000000a7.
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.656 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb9d6d69e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.656 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[abbe85c3-c3af-43e9-843b-4e734014060f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.657 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6751b98a-7e22-4124-bd97-31bca24ec50a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 systemd-udevd[248850]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.669 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[6d61c265-e4c9-4e19-9039-0b1a40c3611e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 NetworkManager[51207]: <info>  [1759408750.6737] device (tapfe50e34b-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:39:10 np0005466012 NetworkManager[51207]: <info>  [1759408750.6749] device (tapfe50e34b-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:39:10 np0005466012 systemd[1]: Started Virtual Machine qemu-77-instance-000000a7.
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:39:10Z|00687|binding|INFO|Setting lport fe50e34b-11c9-412a-bcca-f81589b7c561 ovn-installed in OVS
Oct  2 08:39:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:39:10Z|00688|binding|INFO|Setting lport fe50e34b-11c9-412a-bcca-f81589b7c561 up in Southbound
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.710 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0f7463fb-76fb-48f6-86a4-5f0b027d3821]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.743 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d752f5-b5e0-4bed-b933-c5a97c50d8af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 systemd-udevd[248854]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:39:10 np0005466012 NetworkManager[51207]: <info>  [1759408750.7526] manager: (tapb9d6d69e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/321)
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.753 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f63499a4-b9e9-473a-8880-1a19ca10c27a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.789 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[2c170d29-f085-4295-98d1-85f98af3b438]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.793 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[879d5fb1-f26f-4bed-9583-c90f7bf08052]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 NetworkManager[51207]: <info>  [1759408750.8200] device (tapb9d6d69e-00): carrier: link connected
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.826 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[83f65c89-ecab-4b1b-9996-b8a3bbd6d138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.844 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c35a2b6a-d2ca-4199-abe1-5c58002a2f5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9d6d69e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:a1:21'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 212], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 674443, 'reachable_time': 39861, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248882, 'error': None, 'target': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.861 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[af5f1949-cfe8-473e-b906-84cff329963f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2b:a121'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 674443, 'tstamp': 674443}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248883, 'error': None, 'target': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.880 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[470ce198-b03d-472f-a025-cd4d642060fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9d6d69e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:a1:21'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 212], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 674443, 'reachable_time': 39861, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248884, 'error': None, 'target': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.910 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[07de7a5b-13a5-4352-9278-ad3d9af5138d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.978 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[188ab49c-d728-4446-9e51-12c1f4edff95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.979 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9d6d69e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.979 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.980 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb9d6d69e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:10 np0005466012 kernel: tapb9d6d69e-00: entered promiscuous mode
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:10 np0005466012 NetworkManager[51207]: <info>  [1759408750.9840] manager: (tapb9d6d69e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.984 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb9d6d69e-00, col_values=(('external_ids', {'iface-id': 'ad07d234-3bc8-429a-8834-7a9ae3274be2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:10 np0005466012 ovn_controller[94284]: 2025-10-02T12:39:10Z|00689|binding|INFO|Releasing lport ad07d234-3bc8-429a-8834-7a9ae3274be2 from this chassis (sb_readonly=0)
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.987 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.988 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2a2969f0-35a8-4bba-88fe-322da3417536]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.989 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177.pid.haproxy
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  2 08:39:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:10.990 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'env', 'PROCESS_TAG=haproxy-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.993 2 DEBUG nova.compute.manager [req-76822efc-97d6-43b0-b58c-ee1623af13b0 req-957eb3c4-e7b9-44d3-abfc-7bcc81baec45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Received event network-vif-plugged-fe50e34b-11c9-412a-bcca-f81589b7c561 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.994 2 DEBUG oslo_concurrency.lockutils [req-76822efc-97d6-43b0-b58c-ee1623af13b0 req-957eb3c4-e7b9-44d3-abfc-7bcc81baec45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.994 2 DEBUG oslo_concurrency.lockutils [req-76822efc-97d6-43b0-b58c-ee1623af13b0 req-957eb3c4-e7b9-44d3-abfc-7bcc81baec45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.995 2 DEBUG oslo_concurrency.lockutils [req-76822efc-97d6-43b0-b58c-ee1623af13b0 req-957eb3c4-e7b9-44d3-abfc-7bcc81baec45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.995 2 DEBUG nova.compute.manager [req-76822efc-97d6-43b0-b58c-ee1623af13b0 req-957eb3c4-e7b9-44d3-abfc-7bcc81baec45 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Processing event network-vif-plugged-fe50e34b-11c9-412a-bcca-f81589b7c561 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:39:10 np0005466012 nova_compute[192063]: 2025-10-02 12:39:10.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:39:11 np0005466012 podman[248917]: 2025-10-02 12:39:11.34793887 +0000 UTC m=+0.048299874 container create 3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:39:11 np0005466012 systemd[1]: Started libpod-conmon-3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7.scope.
Oct  2 08:39:11 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:39:11 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb95aa6c4f0257167a3d5a6a7e921a7df0d4422fccc532141f10c06ccf1843b0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:39:11 np0005466012 podman[248917]: 2025-10-02 12:39:11.324471758 +0000 UTC m=+0.024832762 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:39:11 np0005466012 podman[248917]: 2025-10-02 12:39:11.421834956 +0000 UTC m=+0.122195960 container init 3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:39:11 np0005466012 podman[248917]: 2025-10-02 12:39:11.428165963 +0000 UTC m=+0.128526967 container start 3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:39:11 np0005466012 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[248938]: [NOTICE]   (248942) : New worker (248944) forked
Oct  2 08:39:11 np0005466012 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[248938]: [NOTICE]   (248942) : Loading success.
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.786 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408751.7858305, 78166c3e-137c-497b-bb30-a8825a6d4968 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.786 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] VM Started (Lifecycle Event)
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.789 2 DEBUG nova.compute.manager [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.793 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.797 2 INFO nova.virt.libvirt.driver [-] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Instance spawned successfully.
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.798 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.873 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.877 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.899 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.900 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408751.786002, 78166c3e-137c-497b-bb30-a8825a6d4968 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.900 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] VM Paused (Lifecycle Event)
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.941 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.943 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.944 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.944 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.944 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.945 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.945 2 DEBUG nova.virt.libvirt.driver [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.951 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408751.7933753, 78166c3e-137c-497b-bb30-a8825a6d4968 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.952 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] VM Resumed (Lifecycle Event)
Oct  2 08:39:11 np0005466012 nova_compute[192063]: 2025-10-02 12:39:11.998 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:39:12 np0005466012 nova_compute[192063]: 2025-10-02 12:39:12.002 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  2 08:39:12 np0005466012 nova_compute[192063]: 2025-10-02 12:39:12.027 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  2 08:39:12 np0005466012 nova_compute[192063]: 2025-10-02 12:39:12.053 2 INFO nova.compute.manager [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Took 6.06 seconds to spawn the instance on the hypervisor.
Oct  2 08:39:12 np0005466012 nova_compute[192063]: 2025-10-02 12:39:12.057 2 DEBUG nova.compute.manager [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:39:12 np0005466012 nova_compute[192063]: 2025-10-02 12:39:12.059 2 DEBUG nova.network.neutron [req-6b9ffe1b-a061-4747-927b-b1f0b23d44b1 req-e015e876-bd23-481d-9595-6464984e7f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Updated VIF entry in instance network info cache for port fe50e34b-11c9-412a-bcca-f81589b7c561. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:39:12 np0005466012 nova_compute[192063]: 2025-10-02 12:39:12.059 2 DEBUG nova.network.neutron [req-6b9ffe1b-a061-4747-927b-b1f0b23d44b1 req-e015e876-bd23-481d-9595-6464984e7f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Updating instance_info_cache with network_info: [{"id": "fe50e34b-11c9-412a-bcca-f81589b7c561", "address": "fa:16:3e:fe:ee:8e", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe50e34b-11", "ovs_interfaceid": "fe50e34b-11c9-412a-bcca-f81589b7c561", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:39:12 np0005466012 nova_compute[192063]: 2025-10-02 12:39:12.085 2 DEBUG oslo_concurrency.lockutils [req-6b9ffe1b-a061-4747-927b-b1f0b23d44b1 req-e015e876-bd23-481d-9595-6464984e7f18 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:39:12 np0005466012 nova_compute[192063]: 2025-10-02 12:39:12.167 2 INFO nova.compute.manager [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Took 6.81 seconds to build instance.
Oct  2 08:39:12 np0005466012 nova_compute[192063]: 2025-10-02 12:39:12.196 2 DEBUG oslo_concurrency.lockutils [None req-c881ff57-76fa-46fd-b267-be4409147915 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:39:13 np0005466012 nova_compute[192063]: 2025-10-02 12:39:13.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:39:13 np0005466012 nova_compute[192063]: 2025-10-02 12:39:13.069 2 DEBUG nova.compute.manager [req-f4b974f3-0a7d-4468-93e2-0d427726be8e req-84c26a57-a520-4f71-b8f8-de76db1951c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Received event network-vif-plugged-fe50e34b-11c9-412a-bcca-f81589b7c561 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:39:13 np0005466012 nova_compute[192063]: 2025-10-02 12:39:13.070 2 DEBUG oslo_concurrency.lockutils [req-f4b974f3-0a7d-4468-93e2-0d427726be8e req-84c26a57-a520-4f71-b8f8-de76db1951c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:39:13 np0005466012 nova_compute[192063]: 2025-10-02 12:39:13.070 2 DEBUG oslo_concurrency.lockutils [req-f4b974f3-0a7d-4468-93e2-0d427726be8e req-84c26a57-a520-4f71-b8f8-de76db1951c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:39:13 np0005466012 nova_compute[192063]: 2025-10-02 12:39:13.070 2 DEBUG oslo_concurrency.lockutils [req-f4b974f3-0a7d-4468-93e2-0d427726be8e req-84c26a57-a520-4f71-b8f8-de76db1951c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:39:13 np0005466012 nova_compute[192063]: 2025-10-02 12:39:13.071 2 DEBUG nova.compute.manager [req-f4b974f3-0a7d-4468-93e2-0d427726be8e req-84c26a57-a520-4f71-b8f8-de76db1951c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] No waiting events found dispatching network-vif-plugged-fe50e34b-11c9-412a-bcca-f81589b7c561 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:39:13 np0005466012 nova_compute[192063]: 2025-10-02 12:39:13.071 2 WARNING nova.compute.manager [req-f4b974f3-0a7d-4468-93e2-0d427726be8e req-84c26a57-a520-4f71-b8f8-de76db1951c6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Received unexpected event network-vif-plugged-fe50e34b-11c9-412a-bcca-f81589b7c561 for instance with vm_state active and task_state None.
Oct  2 08:39:14 np0005466012 nova_compute[192063]: 2025-10-02 12:39:14.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.929 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'name': 'tempest-TestGettingAddress-server-1308610491', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a7', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'hostId': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.930 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.945 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.946 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80107bfe-e4a7-4572-b7c9-43c08baec328', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-vda', 'timestamp': '2025-10-02T12:39:16.930594', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cf3ac800-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.606976699, 'message_signature': '7956883dbbc0abad5bc1202aa40443c1148390ab7581536c16c95ff57893d87b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-sda', 'timestamp': '2025-10-02T12:39:16.930594', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cf3ae038-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.606976699, 'message_signature': 'f5609664e89add78fe1a1e268be1dd52a4a2fb386201d8a7d03478da67f9258c'}]}, 'timestamp': '2025-10-02 12:39:16.946816', '_unique_id': '2433a18fa9564188a287a2a95ca487ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.948 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.950 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.950 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.950 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1308610491>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1308610491>]
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.951 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.975 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.976 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 78166c3e-137c-497b-bb30-a8825a6d4968: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.976 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.978 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 78166c3e-137c-497b-bb30-a8825a6d4968 / tapfe50e34b-11 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.979 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc7a2c6a-c2a0-4a7c-b2e9-e3dc98418e27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-000000a7-78166c3e-137c-497b-bb30-a8825a6d4968-tapfe50e34b-11', 'timestamp': '2025-10-02T12:39:16.976302', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'tapfe50e34b-11', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:ee:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe50e34b-11'}, 'message_id': 'cf3fdf7a-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.65266735, 'message_signature': '389afadc975b9924c803f1a9fad741a611427ac10ddf524dc8df1c9fbea0cc4b'}]}, 'timestamp': '2025-10-02 12:39:16.979490', '_unique_id': '2fda505a2e9e428b9c440bf3045af59d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.980 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.981 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.981 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/cpu volume: 5010000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b670da7a-9638-4f52-a81d-6a25e1c80ff2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5010000000, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'timestamp': '2025-10-02T12:39:16.981418', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'cf4038f8-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.652066874, 'message_signature': '9a029c3643276101f144e2fd5076c424609ce677ae8ec552356e4e95f99184ea'}]}, 'timestamp': '2025-10-02 12:39:16.981746', '_unique_id': '5ebd8786a1194c269b530218e06552ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.982 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.983 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.983 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbc0aa94-8d56-4a3c-9ec8-81f507958f5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-000000a7-78166c3e-137c-497b-bb30-a8825a6d4968-tapfe50e34b-11', 'timestamp': '2025-10-02T12:39:16.983277', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'tapfe50e34b-11', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:ee:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe50e34b-11'}, 'message_id': 'cf408196-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.65266735, 'message_signature': '39821836a1e2c89a8e22ac71dedff2226d28df9875be04f66eb6fe59bec6d774'}]}, 'timestamp': '2025-10-02 12:39:16.983589', '_unique_id': 'e310a6bd571642f59e76ad94b660dd6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.984 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:16.985 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.013 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.013 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cf13c05-4bdc-4e50-94e4-abe1bcf7a4ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-vda', 'timestamp': '2025-10-02T12:39:16.985184', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cf45179c-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.661551197, 'message_signature': '2536028d8854f5d52c841a2a0769166577572e6fdfe210e3cf5749c3c9b8f1bd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-sda', 'timestamp': '2025-10-02T12:39:16.985184', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cf45284a-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.661551197, 'message_signature': 'ea6630df6dd05e62167dd167df136119a39abc5e6631cf25b125611dd049fc17'}]}, 'timestamp': '2025-10-02 12:39:17.014107', '_unique_id': '74cae2f6659a49a895e845b85e43960c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.014 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.015 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.015 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbbd207b-b7bc-4de2-8632-456824dcc257', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-000000a7-78166c3e-137c-497b-bb30-a8825a6d4968-tapfe50e34b-11', 'timestamp': '2025-10-02T12:39:17.015782', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'tapfe50e34b-11', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:ee:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe50e34b-11'}, 'message_id': 'cf4579da-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.65266735, 'message_signature': 'bf9729bbafa5776827b271a1efb7f825560866fbf54d3604c7e112fdce3a9004'}]}, 'timestamp': '2025-10-02 12:39:17.016224', '_unique_id': 'e00a5f53dc5248179b279180fadf04e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.017 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.018 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.018 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa05f8cf-6b59-4861-aa71-7e8b1fbf4f76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-000000a7-78166c3e-137c-497b-bb30-a8825a6d4968-tapfe50e34b-11', 'timestamp': '2025-10-02T12:39:17.018297', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'tapfe50e34b-11', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:ee:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe50e34b-11'}, 'message_id': 'cf45dbe6-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.65266735, 'message_signature': 'f7d670455e92bcacc5fdc74944a5e1aae923c77e5ed9a43926ad547f5dcf2302'}]}, 'timestamp': '2025-10-02 12:39:17.018764', '_unique_id': '0ada6a0d37d0480eb3c0dedd7926d350'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.019 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.020 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.020 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.021 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1308610491>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1308610491>]
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.021 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.021 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4457de22-6ad5-4616-8a18-65507dd6da52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-000000a7-78166c3e-137c-497b-bb30-a8825a6d4968-tapfe50e34b-11', 'timestamp': '2025-10-02T12:39:17.021382', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'tapfe50e34b-11', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:ee:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe50e34b-11'}, 'message_id': 'cf4655c6-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.65266735, 'message_signature': 'ec1122e2209f3dbb5426f12280bcf19840af68c320feb136a0b655a2f6a70b84'}]}, 'timestamp': '2025-10-02 12:39:17.021877', '_unique_id': 'bfad35ba99c444b6a83d0af6bd27f476'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.023 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.023 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.read.latency volume: 369950711 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.024 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.read.latency volume: 628748 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'feb92fdc-85be-4a90-baf4-4dfafbfc01d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 369950711, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-vda', 'timestamp': '2025-10-02T12:39:17.023897', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cf46b656-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.661551197, 'message_signature': 'e50e33519396fa6cd6e272436790ad0892d19556942135883dfe3ca56e659e2c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 628748, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-sda', 'timestamp': '2025-10-02T12:39:17.023897', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cf46c5a6-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.661551197, 'message_signature': 'c6fb29adcc72ec8526dd441793b4835444067e6c836c7a65a1e489277cbb3593'}]}, 'timestamp': '2025-10-02 12:39:17.024716', '_unique_id': 'ae8c9796de7141a997861278aef8ac67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.025 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.026 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.026 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.027 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c09c88b7-e8e9-4059-b0cc-3508c490a625', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-vda', 'timestamp': '2025-10-02T12:39:17.026791', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cf472794-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.661551197, 'message_signature': '8ba87005bfb1760dd1454f44ddb034e206baf4ddedfe7d19a1d684209cf132be'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 
'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-sda', 'timestamp': '2025-10-02T12:39:17.026791', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cf4736b2-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.661551197, 'message_signature': '069302c933b3dc58b0e040948d635004b8203d07734731c5402151f0baf7fc37'}]}, 'timestamp': '2025-10-02 12:39:17.027593', '_unique_id': '47d1042cfc8849ffa3c45ea6d37fe437'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.028 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.029 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.029 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.030 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae363244-8737-40df-8494-d9022146400e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-vda', 'timestamp': '2025-10-02T12:39:17.029823', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cf479ed6-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.661551197, 'message_signature': '9c003cb26a2ee2b547cd004e8f32a6cecf017d0487cfe999becd1d52d2cbccf5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-sda', 'timestamp': '2025-10-02T12:39:17.029823', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cf47ae44-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.661551197, 'message_signature': '79ee211f416603831ec887ed42692d684ac9c87a2ba63aad4bc3c1b540b18156'}]}, 'timestamp': '2025-10-02 12:39:17.030650', '_unique_id': 'a09c0098013c41cfb4ed8508e0c1f3ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.031 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.032 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.033 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.033 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a6ac693-8a82-488e-8d1b-c4d11310abbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-vda', 'timestamp': '2025-10-02T12:39:17.033066', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cf481d66-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.606976699, 'message_signature': '28d97aaa2ad5eb09af1fd56a3655aab10dcc7e0f9d126da012bb0a0a2fe08015'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-sda', 'timestamp': '2025-10-02T12:39:17.033066', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cf482dba-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.606976699, 'message_signature': 'e76c6663a6a68718dacf235b731a2d106f0a84d5f68514e6dffeb66207a0fbd7'}]}, 'timestamp': '2025-10-02 12:39:17.033915', '_unique_id': 'c306752b3c0445dcaaefff1da8505518'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.034 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.035 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.036 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.036 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1308610491>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1308610491>]
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.036 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.036 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cce6b295-3849-4dde-863c-f4d254c7f61c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-000000a7-78166c3e-137c-497b-bb30-a8825a6d4968-tapfe50e34b-11', 'timestamp': '2025-10-02T12:39:17.036541', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'tapfe50e34b-11', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:ee:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe50e34b-11'}, 'message_id': 'cf48a59c-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.65266735, 'message_signature': '406f4c4fe01a3b235c68e6204dba3b27e428736bc338cafbdd3450deca884f63'}]}, 'timestamp': '2025-10-02 12:39:17.037004', '_unique_id': '2e6baadd1cdf4f21af5bf1a77690fb2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.037 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.038 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.039 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48345b33-48fe-47d3-aea4-54f604e2b861', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-000000a7-78166c3e-137c-497b-bb30-a8825a6d4968-tapfe50e34b-11', 'timestamp': '2025-10-02T12:39:17.039047', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'tapfe50e34b-11', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:ee:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe50e34b-11'}, 'message_id': 'cf49065e-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.65266735, 'message_signature': 'cafec3d56b034ef2722a8b7c5fa3a799a566806146e48941186feb0c41593de7'}]}, 'timestamp': '2025-10-02 12:39:17.039483', '_unique_id': '0bf1cc02cf714659b6e820325a33f6f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.040 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.041 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.041 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.041 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1308610491>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1308610491>]
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.042 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.042 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.042 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6624afc7-40fc-45f7-9059-2417e9794fc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-vda', 'timestamp': '2025-10-02T12:39:17.042185', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cf498174-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.606976699, 'message_signature': '7e815676adaf068cd2f2c666b86da043cc8b16d75eb97579470c3c7c7e716b18'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-sda', 'timestamp': '2025-10-02T12:39:17.042185', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cf499722-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.606976699, 'message_signature': '606e2e776066e16ed13e7632c469205c71113ebd8aaf799398594ab6c4a043bc'}]}, 'timestamp': '2025-10-02 12:39:17.043172', '_unique_id': 'ead4833f5a8d404da4237a91dcf58fda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.044 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.045 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.045 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd79aaa1e-fd03-4dea-bd37-24454c77e8cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-vda', 'timestamp': '2025-10-02T12:39:17.045424', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cf49ffc8-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.661551197, 'message_signature': '4bb0b50c8bf4f8a3edf4af6f0002d50da56bcd9d8806912d116cfe5ded02e0a6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': 
None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-sda', 'timestamp': '2025-10-02T12:39:17.045424', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cf4a103a-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.661551197, 'message_signature': '4e8b965aa1f257d159278e3c5ce2a4213ac6a75af7a3b1559139cb4805e6820c'}]}, 'timestamp': '2025-10-02 12:39:17.046266', '_unique_id': 'e6e00b75b6d4472eb00ad64a7c389ebd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.047 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.048 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.048 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e18520dc-d2ce-4bea-97fc-c47c707060cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-000000a7-78166c3e-137c-497b-bb30-a8825a6d4968-tapfe50e34b-11', 'timestamp': '2025-10-02T12:39:17.048350', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'tapfe50e34b-11', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:ee:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe50e34b-11'}, 'message_id': 'cf4a720a-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.65266735, 'message_signature': 'fddc3080df99547d943b474456d18b40ae9e67873a05095af5ec9b5f34849aff'}]}, 'timestamp': '2025-10-02 12:39:17.048817', '_unique_id': 'fc5a962a814b4011837a0b9c1b36299e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.049 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.050 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.051 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '285261e5-1247-4df3-87ce-082f1fe5fafd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-000000a7-78166c3e-137c-497b-bb30-a8825a6d4968-tapfe50e34b-11', 'timestamp': '2025-10-02T12:39:17.051102', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'tapfe50e34b-11', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:ee:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe50e34b-11'}, 'message_id': 'cf4add58-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.65266735, 'message_signature': '444072dda9f4e19006ecee13e384dcdbb12c598112f2a2c766f47d50ae428528'}]}, 'timestamp': '2025-10-02 12:39:17.051547', '_unique_id': '82b9ec4136e74a8a9b3957238290c682'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.052 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.053 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.053 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.054 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44c0e6d2-f006-4e35-b5a6-0f5839f781c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-vda', 'timestamp': '2025-10-02T12:39:17.053604', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cf4b4068-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.661551197, 'message_signature': 'b5fd270947a25ee11407b607be4283a26c797572ac26f41328357b45fe7f2c14'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': '78166c3e-137c-497b-bb30-a8825a6d4968-sda', 'timestamp': '2025-10-02T12:39:17.053604', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'instance-000000a7', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cf4b4fd6-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.661551197, 'message_signature': 'fd1e56cef774e07c4342d98b2b95d75b4475febc6d966fa8cf2bb5a4d16af0f3'}]}, 'timestamp': '2025-10-02 12:39:17.054449', '_unique_id': 'd370b2d47dd642d3976fc7fc99e95822'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.055 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.056 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.056 12 DEBUG ceilometer.compute.pollsters [-] 78166c3e-137c-497b-bb30-a8825a6d4968/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08cb01d3-428a-4f13-8bf7-9386c668e717', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ce9f1898484e0e9a1f7c84a9f0dfe3', 'user_name': None, 'project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'project_name': None, 'resource_id': 'instance-000000a7-78166c3e-137c-497b-bb30-a8825a6d4968-tapfe50e34b-11', 'timestamp': '2025-10-02T12:39:17.056498', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1308610491', 'name': 'tapfe50e34b-11', 'instance_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'instance_type': 'm1.nano', 'host': 'f631367acece05b8e2eca60f2f5b12a0adbee2370870f40c27ea0062', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:ee:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe50e34b-11'}, 'message_id': 'cf4bb0c0-9f8c-11f0-b6ee-fa163e01ba27', 'monotonic_time': 6750.65266735, 'message_signature': '1c7e6450c980f53a94d264a4ed4af1aa5ea19f4c582bbaf10bcbe9509638daab'}]}, 'timestamp': '2025-10-02 12:39:17.056949', '_unique_id': '9c378224682c462493c25dbfd5b95b6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:39:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:39:17.058 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:39:17 np0005466012 NetworkManager[51207]: <info>  [1759408757.1629] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Oct  2 08:39:17 np0005466012 NetworkManager[51207]: <info>  [1759408757.1647] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/324)
Oct  2 08:39:17 np0005466012 nova_compute[192063]: 2025-10-02 12:39:17.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:17 np0005466012 nova_compute[192063]: 2025-10-02 12:39:17.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:17 np0005466012 ovn_controller[94284]: 2025-10-02T12:39:17Z|00690|binding|INFO|Releasing lport ad07d234-3bc8-429a-8834-7a9ae3274be2 from this chassis (sb_readonly=0)
Oct  2 08:39:17 np0005466012 nova_compute[192063]: 2025-10-02 12:39:17.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:17 np0005466012 nova_compute[192063]: 2025-10-02 12:39:17.480 2 DEBUG nova.compute.manager [req-e266dce9-fb6f-4814-ac52-8b4d6ba6cf77 req-792be387-232b-4ba4-bf35-99b157da4308 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Received event network-changed-fe50e34b-11c9-412a-bcca-f81589b7c561 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:39:17 np0005466012 nova_compute[192063]: 2025-10-02 12:39:17.481 2 DEBUG nova.compute.manager [req-e266dce9-fb6f-4814-ac52-8b4d6ba6cf77 req-792be387-232b-4ba4-bf35-99b157da4308 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Refreshing instance network info cache due to event network-changed-fe50e34b-11c9-412a-bcca-f81589b7c561. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:39:17 np0005466012 nova_compute[192063]: 2025-10-02 12:39:17.481 2 DEBUG oslo_concurrency.lockutils [req-e266dce9-fb6f-4814-ac52-8b4d6ba6cf77 req-792be387-232b-4ba4-bf35-99b157da4308 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:39:17 np0005466012 nova_compute[192063]: 2025-10-02 12:39:17.481 2 DEBUG oslo_concurrency.lockutils [req-e266dce9-fb6f-4814-ac52-8b4d6ba6cf77 req-792be387-232b-4ba4-bf35-99b157da4308 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:39:17 np0005466012 nova_compute[192063]: 2025-10-02 12:39:17.482 2 DEBUG nova.network.neutron [req-e266dce9-fb6f-4814-ac52-8b4d6ba6cf77 req-792be387-232b-4ba4-bf35-99b157da4308 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Refreshing network info cache for port fe50e34b-11c9-412a-bcca-f81589b7c561 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:39:18 np0005466012 nova_compute[192063]: 2025-10-02 12:39:18.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:19 np0005466012 nova_compute[192063]: 2025-10-02 12:39:19.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:21 np0005466012 nova_compute[192063]: 2025-10-02 12:39:21.183 2 DEBUG nova.network.neutron [req-e266dce9-fb6f-4814-ac52-8b4d6ba6cf77 req-792be387-232b-4ba4-bf35-99b157da4308 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Updated VIF entry in instance network info cache for port fe50e34b-11c9-412a-bcca-f81589b7c561. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:39:21 np0005466012 nova_compute[192063]: 2025-10-02 12:39:21.184 2 DEBUG nova.network.neutron [req-e266dce9-fb6f-4814-ac52-8b4d6ba6cf77 req-792be387-232b-4ba4-bf35-99b157da4308 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Updating instance_info_cache with network_info: [{"id": "fe50e34b-11c9-412a-bcca-f81589b7c561", "address": "fa:16:3e:fe:ee:8e", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe50e34b-11", "ovs_interfaceid": "fe50e34b-11c9-412a-bcca-f81589b7c561", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:39:21 np0005466012 nova_compute[192063]: 2025-10-02 12:39:21.201 2 DEBUG oslo_concurrency.lockutils [req-e266dce9-fb6f-4814-ac52-8b4d6ba6cf77 req-792be387-232b-4ba4-bf35-99b157da4308 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:39:23 np0005466012 nova_compute[192063]: 2025-10-02 12:39:23.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:23 np0005466012 podman[248971]: 2025-10-02 12:39:23.159856444 +0000 UTC m=+0.059391043 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:39:23 np0005466012 podman[248969]: 2025-10-02 12:39:23.163745643 +0000 UTC m=+0.071104110 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:39:23 np0005466012 podman[248970]: 2025-10-02 12:39:23.167661641 +0000 UTC m=+0.071330345 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:39:23 np0005466012 podman[248972]: 2025-10-02 12:39:23.197774549 +0000 UTC m=+0.093778949 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:39:23 np0005466012 nova_compute[192063]: 2025-10-02 12:39:23.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:23.502 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:39:23 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:23.504 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:39:23 np0005466012 ovn_controller[94284]: 2025-10-02T12:39:23Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fe:ee:8e 10.100.0.9
Oct  2 08:39:23 np0005466012 ovn_controller[94284]: 2025-10-02T12:39:23Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fe:ee:8e 10.100.0.9
Oct  2 08:39:24 np0005466012 nova_compute[192063]: 2025-10-02 12:39:24.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:39:26.507 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:39:28 np0005466012 nova_compute[192063]: 2025-10-02 12:39:28.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:29 np0005466012 nova_compute[192063]: 2025-10-02 12:39:29.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:30 np0005466012 ovn_controller[94284]: 2025-10-02T12:39:30Z|00691|binding|INFO|Releasing lport ad07d234-3bc8-429a-8834-7a9ae3274be2 from this chassis (sb_readonly=0)
Oct  2 08:39:30 np0005466012 nova_compute[192063]: 2025-10-02 12:39:30.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:32 np0005466012 nova_compute[192063]: 2025-10-02 12:39:32.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:33 np0005466012 nova_compute[192063]: 2025-10-02 12:39:33.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:34 np0005466012 nova_compute[192063]: 2025-10-02 12:39:34.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:34 np0005466012 nova_compute[192063]: 2025-10-02 12:39:34.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:36 np0005466012 podman[249054]: 2025-10-02 12:39:36.165867318 +0000 UTC m=+0.078305179 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:39:36 np0005466012 podman[249055]: 2025-10-02 12:39:36.189074894 +0000 UTC m=+0.086492647 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  2 08:39:36 np0005466012 nova_compute[192063]: 2025-10-02 12:39:36.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:38 np0005466012 nova_compute[192063]: 2025-10-02 12:39:38.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:38 np0005466012 nova_compute[192063]: 2025-10-02 12:39:38.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:39 np0005466012 nova_compute[192063]: 2025-10-02 12:39:39.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:39 np0005466012 nova_compute[192063]: 2025-10-02 12:39:39.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:40 np0005466012 podman[249096]: 2025-10-02 12:39:40.146531574 +0000 UTC m=+0.051872158 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:39:40 np0005466012 podman[249095]: 2025-10-02 12:39:40.16706102 +0000 UTC m=+0.066997722 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:39:43 np0005466012 nova_compute[192063]: 2025-10-02 12:39:43.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:43 np0005466012 nova_compute[192063]: 2025-10-02 12:39:43.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:44 np0005466012 nova_compute[192063]: 2025-10-02 12:39:44.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:45 np0005466012 nova_compute[192063]: 2025-10-02 12:39:45.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:46 np0005466012 nova_compute[192063]: 2025-10-02 12:39:46.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:48 np0005466012 nova_compute[192063]: 2025-10-02 12:39:48.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:48 np0005466012 nova_compute[192063]: 2025-10-02 12:39:48.518 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:48 np0005466012 nova_compute[192063]: 2025-10-02 12:39:48.518 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:48 np0005466012 nova_compute[192063]: 2025-10-02 12:39:48.518 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:48 np0005466012 nova_compute[192063]: 2025-10-02 12:39:48.519 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:39:48 np0005466012 nova_compute[192063]: 2025-10-02 12:39:48.719 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:48 np0005466012 nova_compute[192063]: 2025-10-02 12:39:48.775 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:48 np0005466012 nova_compute[192063]: 2025-10-02 12:39:48.776 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:39:48 np0005466012 nova_compute[192063]: 2025-10-02 12:39:48.832 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:39:48 np0005466012 nova_compute[192063]: 2025-10-02 12:39:48.975 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:39:48 np0005466012 nova_compute[192063]: 2025-10-02 12:39:48.976 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5555MB free_disk=73.21363067626953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:39:48 np0005466012 nova_compute[192063]: 2025-10-02 12:39:48.977 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:39:48 np0005466012 nova_compute[192063]: 2025-10-02 12:39:48.977 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:39:49 np0005466012 nova_compute[192063]: 2025-10-02 12:39:49.069 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 78166c3e-137c-497b-bb30-a8825a6d4968 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:39:49 np0005466012 nova_compute[192063]: 2025-10-02 12:39:49.070 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:39:49 np0005466012 nova_compute[192063]: 2025-10-02 12:39:49.070 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:39:49 np0005466012 nova_compute[192063]: 2025-10-02 12:39:49.125 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:39:49 np0005466012 nova_compute[192063]: 2025-10-02 12:39:49.140 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:39:49 np0005466012 nova_compute[192063]: 2025-10-02 12:39:49.164 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:39:49 np0005466012 nova_compute[192063]: 2025-10-02 12:39:49.164 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:39:49 np0005466012 nova_compute[192063]: 2025-10-02 12:39:49.165 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:49 np0005466012 nova_compute[192063]: 2025-10-02 12:39:49.165 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:39:49 np0005466012 nova_compute[192063]: 2025-10-02 12:39:49.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:51 np0005466012 nova_compute[192063]: 2025-10-02 12:39:51.179 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:51 np0005466012 nova_compute[192063]: 2025-10-02 12:39:51.180 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:39:51 np0005466012 nova_compute[192063]: 2025-10-02 12:39:51.180 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:39:51 np0005466012 nova_compute[192063]: 2025-10-02 12:39:51.483 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:39:51 np0005466012 nova_compute[192063]: 2025-10-02 12:39:51.484 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:39:51 np0005466012 nova_compute[192063]: 2025-10-02 12:39:51.484 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:39:51 np0005466012 nova_compute[192063]: 2025-10-02 12:39:51.484 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 78166c3e-137c-497b-bb30-a8825a6d4968 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:39:53 np0005466012 nova_compute[192063]: 2025-10-02 12:39:53.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:54 np0005466012 podman[249146]: 2025-10-02 12:39:54.148594701 +0000 UTC m=+0.063322179 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:39:54 np0005466012 podman[249147]: 2025-10-02 12:39:54.180502178 +0000 UTC m=+0.080516483 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:39:54 np0005466012 podman[249148]: 2025-10-02 12:39:54.181150776 +0000 UTC m=+0.090122033 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:39:54 np0005466012 podman[249145]: 2025-10-02 12:39:54.207819196 +0000 UTC m=+0.113883151 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:39:54 np0005466012 nova_compute[192063]: 2025-10-02 12:39:54.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:55 np0005466012 nova_compute[192063]: 2025-10-02 12:39:55.388 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Updating instance_info_cache with network_info: [{"id": "fe50e34b-11c9-412a-bcca-f81589b7c561", "address": "fa:16:3e:fe:ee:8e", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe50e34b-11", "ovs_interfaceid": "fe50e34b-11c9-412a-bcca-f81589b7c561", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:39:56 np0005466012 nova_compute[192063]: 2025-10-02 12:39:56.588 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:39:56 np0005466012 nova_compute[192063]: 2025-10-02 12:39:56.588 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:39:56 np0005466012 nova_compute[192063]: 2025-10-02 12:39:56.589 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:56 np0005466012 nova_compute[192063]: 2025-10-02 12:39:56.589 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:39:56 np0005466012 nova_compute[192063]: 2025-10-02 12:39:56.590 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:39:58 np0005466012 nova_compute[192063]: 2025-10-02 12:39:58.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:39:59 np0005466012 nova_compute[192063]: 2025-10-02 12:39:59.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:02.155 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:02.156 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:02.157 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:03 np0005466012 nova_compute[192063]: 2025-10-02 12:40:03.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:04 np0005466012 nova_compute[192063]: 2025-10-02 12:40:04.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:05.570 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:40:05 np0005466012 nova_compute[192063]: 2025-10-02 12:40:05.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:05.571 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:40:07 np0005466012 podman[249233]: 2025-10-02 12:40:07.13782298 +0000 UTC m=+0.054585524 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:40:07 np0005466012 podman[249234]: 2025-10-02 12:40:07.169608723 +0000 UTC m=+0.080026959 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  2 08:40:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:07.572 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:40:08 np0005466012 nova_compute[192063]: 2025-10-02 12:40:08.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:09 np0005466012 nova_compute[192063]: 2025-10-02 12:40:09.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:11 np0005466012 podman[249274]: 2025-10-02 12:40:11.147886198 +0000 UTC m=+0.058350530 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  2 08:40:11 np0005466012 podman[249275]: 2025-10-02 12:40:11.161337925 +0000 UTC m=+0.061458227 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:40:12 np0005466012 nova_compute[192063]: 2025-10-02 12:40:12.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:12 np0005466012 nova_compute[192063]: 2025-10-02 12:40:12.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:40:12 np0005466012 nova_compute[192063]: 2025-10-02 12:40:12.927 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:40:13 np0005466012 nova_compute[192063]: 2025-10-02 12:40:13.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:14 np0005466012 nova_compute[192063]: 2025-10-02 12:40:14.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:18 np0005466012 nova_compute[192063]: 2025-10-02 12:40:18.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:19 np0005466012 nova_compute[192063]: 2025-10-02 12:40:19.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.425 2 DEBUG nova.compute.manager [req-6557b62f-88f2-402d-9487-55577f39c053 req-bf26efab-c971-4666-8c94-934422eaf3a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Received event network-changed-fe50e34b-11c9-412a-bcca-f81589b7c561 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.426 2 DEBUG nova.compute.manager [req-6557b62f-88f2-402d-9487-55577f39c053 req-bf26efab-c971-4666-8c94-934422eaf3a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Refreshing instance network info cache due to event network-changed-fe50e34b-11c9-412a-bcca-f81589b7c561. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.426 2 DEBUG oslo_concurrency.lockutils [req-6557b62f-88f2-402d-9487-55577f39c053 req-bf26efab-c971-4666-8c94-934422eaf3a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.427 2 DEBUG oslo_concurrency.lockutils [req-6557b62f-88f2-402d-9487-55577f39c053 req-bf26efab-c971-4666-8c94-934422eaf3a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.427 2 DEBUG nova.network.neutron [req-6557b62f-88f2-402d-9487-55577f39c053 req-bf26efab-c971-4666-8c94-934422eaf3a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Refreshing network info cache for port fe50e34b-11c9-412a-bcca-f81589b7c561 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.562 2 DEBUG oslo_concurrency.lockutils [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "78166c3e-137c-497b-bb30-a8825a6d4968" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.563 2 DEBUG oslo_concurrency.lockutils [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.563 2 DEBUG oslo_concurrency.lockutils [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.564 2 DEBUG oslo_concurrency.lockutils [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.564 2 DEBUG oslo_concurrency.lockutils [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.590 2 INFO nova.compute.manager [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Terminating instance#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.611 2 DEBUG nova.compute.manager [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:40:22 np0005466012 kernel: tapfe50e34b-11 (unregistering): left promiscuous mode
Oct  2 08:40:22 np0005466012 NetworkManager[51207]: <info>  [1759408822.6430] device (tapfe50e34b-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:22 np0005466012 ovn_controller[94284]: 2025-10-02T12:40:22Z|00692|binding|INFO|Releasing lport fe50e34b-11c9-412a-bcca-f81589b7c561 from this chassis (sb_readonly=0)
Oct  2 08:40:22 np0005466012 ovn_controller[94284]: 2025-10-02T12:40:22Z|00693|binding|INFO|Setting lport fe50e34b-11c9-412a-bcca-f81589b7c561 down in Southbound
Oct  2 08:40:22 np0005466012 ovn_controller[94284]: 2025-10-02T12:40:22Z|00694|binding|INFO|Removing iface tapfe50e34b-11 ovn-installed in OVS
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:22.673 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:ee:8e 10.100.0.9 2001:db8:0:1:f816:3eff:fefe:ee8e 2001:db8::f816:3eff:fefe:ee8e'], port_security=['fa:16:3e:fe:ee:8e 10.100.0.9 2001:db8:0:1:f816:3eff:fefe:ee8e 2001:db8::f816:3eff:fefe:ee8e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8:0:1:f816:3eff:fefe:ee8e/64 2001:db8::f816:3eff:fefe:ee8e/64', 'neutron:device_id': '78166c3e-137c-497b-bb30-a8825a6d4968', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '28c32917-1b1e-49b6-8086-1dbf8a5f4a7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fce3bea-36c3-4b1e-bdee-b694cf8990ad, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=fe50e34b-11c9-412a-bcca-f81589b7c561) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:40:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:22.675 103246 INFO neutron.agent.ovn.metadata.agent [-] Port fe50e34b-11c9-412a-bcca-f81589b7c561 in datapath b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 unbound from our chassis#033[00m
Oct  2 08:40:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:22.678 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:40:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:22.679 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[47ca5661-0403-4ff3-8370-871050076946]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:22 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:22.680 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 namespace which is not needed anymore#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:22 np0005466012 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a7.scope: Deactivated successfully.
Oct  2 08:40:22 np0005466012 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a7.scope: Consumed 16.095s CPU time.
Oct  2 08:40:22 np0005466012 systemd-machined[152114]: Machine qemu-77-instance-000000a7 terminated.
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.888 2 INFO nova.virt.libvirt.driver [-] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Instance destroyed successfully.#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.888 2 DEBUG nova.objects.instance [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lazy-loading 'resources' on Instance uuid 78166c3e-137c-497b-bb30-a8825a6d4968 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.913 2 DEBUG nova.virt.libvirt.vif [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:39:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1308610491',display_name='tempest-TestGettingAddress-server-1308610491',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1308610491',id=167,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIRceitbRKs6XXbf7kadsFogQdqZxrZp17i/4hhUcf2GwYzZeOwXhP4JbD0KEwVqLcpeLnMlwZ2D2mCuHBJR8KfWga2sLnw7wFx6+c4GjZ3JqHI31K0krXjHmfSh85q/Fw==',key_name='tempest-TestGettingAddress-568346474',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:39:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd801958556f4c8aab047ecdef6b5ee8',ramdisk_id='',reservation_id='r-zz6ab3lb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1355720650',owner_user_name='tempest-TestGettingAddress-1355720650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:39:12Z,user_data=None,user_id='97ce9f1898484e0e9a1f7c84a9f0dfe3',uuid=78166c3e-137c-497b-bb30-a8825a6d4968,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe50e34b-11c9-412a-bcca-f81589b7c561", "address": "fa:16:3e:fe:ee:8e", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe50e34b-11", "ovs_interfaceid": "fe50e34b-11c9-412a-bcca-f81589b7c561", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.914 2 DEBUG nova.network.os_vif_util [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converting VIF {"id": "fe50e34b-11c9-412a-bcca-f81589b7c561", "address": "fa:16:3e:fe:ee:8e", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe50e34b-11", "ovs_interfaceid": "fe50e34b-11c9-412a-bcca-f81589b7c561", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.915 2 DEBUG nova.network.os_vif_util [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fe:ee:8e,bridge_name='br-int',has_traffic_filtering=True,id=fe50e34b-11c9-412a-bcca-f81589b7c561,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe50e34b-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.916 2 DEBUG os_vif [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fe:ee:8e,bridge_name='br-int',has_traffic_filtering=True,id=fe50e34b-11c9-412a-bcca-f81589b7c561,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe50e34b-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.919 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe50e34b-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.927 2 INFO os_vif [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fe:ee:8e,bridge_name='br-int',has_traffic_filtering=True,id=fe50e34b-11c9-412a-bcca-f81589b7c561,network=Network(b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe50e34b-11')#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.928 2 INFO nova.virt.libvirt.driver [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Deleting instance files /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968_del#033[00m
Oct  2 08:40:22 np0005466012 nova_compute[192063]: 2025-10-02 12:40:22.929 2 INFO nova.virt.libvirt.driver [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Deletion of /var/lib/nova/instances/78166c3e-137c-497b-bb30-a8825a6d4968_del complete#033[00m
Oct  2 08:40:23 np0005466012 nova_compute[192063]: 2025-10-02 12:40:23.046 2 INFO nova.compute.manager [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:40:23 np0005466012 nova_compute[192063]: 2025-10-02 12:40:23.046 2 DEBUG oslo.service.loopingcall [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:40:23 np0005466012 nova_compute[192063]: 2025-10-02 12:40:23.047 2 DEBUG nova.compute.manager [-] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:40:23 np0005466012 nova_compute[192063]: 2025-10-02 12:40:23.047 2 DEBUG nova.network.neutron [-] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:40:23 np0005466012 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[248938]: [NOTICE]   (248942) : haproxy version is 2.8.14-c23fe91
Oct  2 08:40:23 np0005466012 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[248938]: [NOTICE]   (248942) : path to executable is /usr/sbin/haproxy
Oct  2 08:40:23 np0005466012 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[248938]: [WARNING]  (248942) : Exiting Master process...
Oct  2 08:40:23 np0005466012 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[248938]: [WARNING]  (248942) : Exiting Master process...
Oct  2 08:40:23 np0005466012 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[248938]: [ALERT]    (248942) : Current worker (248944) exited with code 143 (Terminated)
Oct  2 08:40:23 np0005466012 neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177[248938]: [WARNING]  (248942) : All workers exited. Exiting... (0)
Oct  2 08:40:23 np0005466012 systemd[1]: libpod-3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7.scope: Deactivated successfully.
Oct  2 08:40:23 np0005466012 podman[249340]: 2025-10-02 12:40:23.147441154 +0000 UTC m=+0.349908400 container died 3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:40:23 np0005466012 nova_compute[192063]: 2025-10-02 12:40:23.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:23 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7-userdata-shm.mount: Deactivated successfully.
Oct  2 08:40:23 np0005466012 systemd[1]: var-lib-containers-storage-overlay-eb95aa6c4f0257167a3d5a6a7e921a7df0d4422fccc532141f10c06ccf1843b0-merged.mount: Deactivated successfully.
Oct  2 08:40:24 np0005466012 podman[249340]: 2025-10-02 12:40:24.119246633 +0000 UTC m=+1.321713869 container cleanup 3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:40:24 np0005466012 systemd[1]: libpod-conmon-3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7.scope: Deactivated successfully.
Oct  2 08:40:24 np0005466012 podman[249389]: 2025-10-02 12:40:24.64917847 +0000 UTC m=+0.489095201 container remove 3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:40:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:24.657 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[83dc308a-8312-4a46-9bfd-5096f8fbfaec]: (4, ('Thu Oct  2 12:40:22 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 (3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7)\n3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7\nThu Oct  2 12:40:24 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 (3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7)\n3f1429da519e231ea8f0870a0ccbe155f7bed0ec4ce4fc99300773cfdcd9ede7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:24.659 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[90688438-a643-4903-a8af-93ca9cb53068]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:24.660 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9d6d69e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:40:24 np0005466012 nova_compute[192063]: 2025-10-02 12:40:24.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:24 np0005466012 kernel: tapb9d6d69e-00: left promiscuous mode
Oct  2 08:40:24 np0005466012 nova_compute[192063]: 2025-10-02 12:40:24.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:24.678 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[342bb640-529f-4871-83a6-40b9919198cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:24.704 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[704b4142-5e3d-4c73-8792-718eb46cce23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:24.706 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb96548-9483-495b-84ce-a247123e92fa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:24.729 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[041a2282-4d47-47f4-9ecd-aa0e94c13f31]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 674434, 'reachable_time': 40015, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249425, 'error': None, 'target': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:24 np0005466012 systemd[1]: run-netns-ovnmeta\x2db9d6d69e\x2d0327\x2d4bcf\x2db8a6\x2db2cf69a4d177.mount: Deactivated successfully.
Oct  2 08:40:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:24.733 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:40:24 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:24.734 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[7750f364-6bce-4d7d-86bc-674b1b548bcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:24 np0005466012 podman[249406]: 2025-10-02 12:40:24.771699762 +0000 UTC m=+0.068442894 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:40:24 np0005466012 podman[249404]: 2025-10-02 12:40:24.781206699 +0000 UTC m=+0.078068945 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator 
team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct  2 08:40:24 np0005466012 podman[249408]: 2025-10-02 12:40:24.808747782 +0000 UTC m=+0.101645256 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:40:24 np0005466012 podman[249407]: 2025-10-02 12:40:24.810316517 +0000 UTC m=+0.095162824 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:40:25 np0005466012 nova_compute[192063]: 2025-10-02 12:40:25.553 2 DEBUG nova.compute.manager [req-6076a7a3-e056-400a-afde-c9644c4c4ee3 req-f3dd9c64-0922-4447-93cc-ddb0d41be280 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Received event network-vif-unplugged-fe50e34b-11c9-412a-bcca-f81589b7c561 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:40:25 np0005466012 nova_compute[192063]: 2025-10-02 12:40:25.553 2 DEBUG oslo_concurrency.lockutils [req-6076a7a3-e056-400a-afde-c9644c4c4ee3 req-f3dd9c64-0922-4447-93cc-ddb0d41be280 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:25 np0005466012 nova_compute[192063]: 2025-10-02 12:40:25.554 2 DEBUG oslo_concurrency.lockutils [req-6076a7a3-e056-400a-afde-c9644c4c4ee3 req-f3dd9c64-0922-4447-93cc-ddb0d41be280 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:25 np0005466012 nova_compute[192063]: 2025-10-02 12:40:25.554 2 DEBUG oslo_concurrency.lockutils [req-6076a7a3-e056-400a-afde-c9644c4c4ee3 req-f3dd9c64-0922-4447-93cc-ddb0d41be280 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:25 np0005466012 nova_compute[192063]: 2025-10-02 12:40:25.555 2 DEBUG nova.compute.manager [req-6076a7a3-e056-400a-afde-c9644c4c4ee3 req-f3dd9c64-0922-4447-93cc-ddb0d41be280 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] No waiting events found dispatching network-vif-unplugged-fe50e34b-11c9-412a-bcca-f81589b7c561 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:40:25 np0005466012 nova_compute[192063]: 2025-10-02 12:40:25.555 2 DEBUG nova.compute.manager [req-6076a7a3-e056-400a-afde-c9644c4c4ee3 req-f3dd9c64-0922-4447-93cc-ddb0d41be280 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Received event network-vif-unplugged-fe50e34b-11c9-412a-bcca-f81589b7c561 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:40:26 np0005466012 nova_compute[192063]: 2025-10-02 12:40:26.633 2 DEBUG nova.network.neutron [-] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:40:26 np0005466012 nova_compute[192063]: 2025-10-02 12:40:26.688 2 INFO nova.compute.manager [-] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Took 3.64 seconds to deallocate network for instance.#033[00m
Oct  2 08:40:26 np0005466012 nova_compute[192063]: 2025-10-02 12:40:26.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:26 np0005466012 nova_compute[192063]: 2025-10-02 12:40:26.831 2 DEBUG oslo_concurrency.lockutils [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:26 np0005466012 nova_compute[192063]: 2025-10-02 12:40:26.832 2 DEBUG oslo_concurrency.lockutils [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:26 np0005466012 nova_compute[192063]: 2025-10-02 12:40:26.961 2 DEBUG nova.compute.provider_tree [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:40:26 np0005466012 nova_compute[192063]: 2025-10-02 12:40:26.991 2 DEBUG nova.scheduler.client.report [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.058 2 DEBUG oslo_concurrency.lockutils [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.102 2 INFO nova.scheduler.client.report [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Deleted allocations for instance 78166c3e-137c-497b-bb30-a8825a6d4968#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.163 2 DEBUG nova.network.neutron [req-6557b62f-88f2-402d-9487-55577f39c053 req-bf26efab-c971-4666-8c94-934422eaf3a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Updated VIF entry in instance network info cache for port fe50e34b-11c9-412a-bcca-f81589b7c561. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.164 2 DEBUG nova.network.neutron [req-6557b62f-88f2-402d-9487-55577f39c053 req-bf26efab-c971-4666-8c94-934422eaf3a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Updating instance_info_cache with network_info: [{"id": "fe50e34b-11c9-412a-bcca-f81589b7c561", "address": "fa:16:3e:fe:ee:8e", "network": {"id": "b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177", "bridge": "br-int", "label": "tempest-network-smoke--1389422686", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:ee8e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fd801958556f4c8aab047ecdef6b5ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe50e34b-11", "ovs_interfaceid": "fe50e34b-11c9-412a-bcca-f81589b7c561", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.243 2 DEBUG oslo_concurrency.lockutils [req-6557b62f-88f2-402d-9487-55577f39c053 req-bf26efab-c971-4666-8c94-934422eaf3a5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-78166c3e-137c-497b-bb30-a8825a6d4968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.297 2 DEBUG oslo_concurrency.lockutils [None req-e05c83a5-1323-4419-b84b-ee09552c18b5 97ce9f1898484e0e9a1f7c84a9f0dfe3 fd801958556f4c8aab047ecdef6b5ee8 - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.678 2 DEBUG nova.compute.manager [req-22a5e844-01ce-431f-ac7d-f81b0a905eda req-e5d8c0a6-6f32-42f8-abb0-0e3919e0a3f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Received event network-vif-plugged-fe50e34b-11c9-412a-bcca-f81589b7c561 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.679 2 DEBUG oslo_concurrency.lockutils [req-22a5e844-01ce-431f-ac7d-f81b0a905eda req-e5d8c0a6-6f32-42f8-abb0-0e3919e0a3f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.680 2 DEBUG oslo_concurrency.lockutils [req-22a5e844-01ce-431f-ac7d-f81b0a905eda req-e5d8c0a6-6f32-42f8-abb0-0e3919e0a3f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.681 2 DEBUG oslo_concurrency.lockutils [req-22a5e844-01ce-431f-ac7d-f81b0a905eda req-e5d8c0a6-6f32-42f8-abb0-0e3919e0a3f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "78166c3e-137c-497b-bb30-a8825a6d4968-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.681 2 DEBUG nova.compute.manager [req-22a5e844-01ce-431f-ac7d-f81b0a905eda req-e5d8c0a6-6f32-42f8-abb0-0e3919e0a3f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] No waiting events found dispatching network-vif-plugged-fe50e34b-11c9-412a-bcca-f81589b7c561 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.682 2 WARNING nova.compute.manager [req-22a5e844-01ce-431f-ac7d-f81b0a905eda req-e5d8c0a6-6f32-42f8-abb0-0e3919e0a3f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Received unexpected event network-vif-plugged-fe50e34b-11c9-412a-bcca-f81589b7c561 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.682 2 DEBUG nova.compute.manager [req-22a5e844-01ce-431f-ac7d-f81b0a905eda req-e5d8c0a6-6f32-42f8-abb0-0e3919e0a3f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Received event network-vif-deleted-fe50e34b-11c9-412a-bcca-f81589b7c561 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.683 2 INFO nova.compute.manager [req-22a5e844-01ce-431f-ac7d-f81b0a905eda req-e5d8c0a6-6f32-42f8-abb0-0e3919e0a3f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Neutron deleted interface fe50e34b-11c9-412a-bcca-f81589b7c561; detaching it from the instance and deleting it from the info cache#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.683 2 DEBUG nova.network.neutron [req-22a5e844-01ce-431f-ac7d-f81b0a905eda req-e5d8c0a6-6f32-42f8-abb0-0e3919e0a3f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.687 2 DEBUG nova.compute.manager [req-22a5e844-01ce-431f-ac7d-f81b0a905eda req-e5d8c0a6-6f32-42f8-abb0-0e3919e0a3f2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Detach interface failed, port_id=fe50e34b-11c9-412a-bcca-f81589b7c561, reason: Instance 78166c3e-137c-497b-bb30-a8825a6d4968 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  2 08:40:27 np0005466012 nova_compute[192063]: 2025-10-02 12:40:27.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:28 np0005466012 nova_compute[192063]: 2025-10-02 12:40:28.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:32 np0005466012 nova_compute[192063]: 2025-10-02 12:40:32.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:33 np0005466012 nova_compute[192063]: 2025-10-02 12:40:33.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:34 np0005466012 nova_compute[192063]: 2025-10-02 12:40:34.856 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:36 np0005466012 nova_compute[192063]: 2025-10-02 12:40:36.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:37 np0005466012 nova_compute[192063]: 2025-10-02 12:40:37.887 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408822.884869, 78166c3e-137c-497b-bb30-a8825a6d4968 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:40:37 np0005466012 nova_compute[192063]: 2025-10-02 12:40:37.887 2 INFO nova.compute.manager [-] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:40:37 np0005466012 nova_compute[192063]: 2025-10-02 12:40:37.915 2 DEBUG nova.compute.manager [None req-8a76ef8a-f113-4465-8f28-4a1475fb1e91 - - - - - -] [instance: 78166c3e-137c-497b-bb30-a8825a6d4968] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:40:37 np0005466012 nova_compute[192063]: 2025-10-02 12:40:37.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:38 np0005466012 podman[249494]: 2025-10-02 12:40:38.14465665 +0000 UTC m=+0.060317935 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:40:38 np0005466012 podman[249495]: 2025-10-02 12:40:38.145077872 +0000 UTC m=+0.057760984 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, maintainer=Red Hat, Inc.)
Oct  2 08:40:38 np0005466012 nova_compute[192063]: 2025-10-02 12:40:38.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:38.399 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:a1:21 10.100.0.2 2001:db8::f816:3eff:fe2b:a121'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2b:a121/64', 'neutron:device_id': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fce3bea-36c3-4b1e-bdee-b694cf8990ad, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ad07d234-3bc8-429a-8834-7a9ae3274be2) old=Port_Binding(mac=['fa:16:3e:2b:a1:21 10.100.0.2 2001:db8:0:1:f816:3eff:fe2b:a121 2001:db8::f816:3eff:fe2b:a121'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe2b:a121/64 2001:db8::f816:3eff:fe2b:a121/64', 'neutron:device_id': 'ovnmeta-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:40:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:38.401 103246 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ad07d234-3bc8-429a-8834-7a9ae3274be2 in datapath b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177 updated#033[00m
Oct  2 08:40:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:38.401 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9d6d69e-0327-4bcf-b8a6-b2cf69a4d177, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:40:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:38.402 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[dd974cc4-a337-4902-90ea-3b613c5dc730]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:40:39 np0005466012 nova_compute[192063]: 2025-10-02 12:40:39.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:40 np0005466012 nova_compute[192063]: 2025-10-02 12:40:40.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:40 np0005466012 nova_compute[192063]: 2025-10-02 12:40:40.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:42 np0005466012 podman[249537]: 2025-10-02 12:40:42.139864924 +0000 UTC m=+0.059239986 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid)
Oct  2 08:40:42 np0005466012 podman[249538]: 2025-10-02 12:40:42.161044998 +0000 UTC m=+0.078550588 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:40:42 np0005466012 nova_compute[192063]: 2025-10-02 12:40:42.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:43 np0005466012 nova_compute[192063]: 2025-10-02 12:40:43.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:44 np0005466012 nova_compute[192063]: 2025-10-02 12:40:44.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:46 np0005466012 nova_compute[192063]: 2025-10-02 12:40:46.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:47 np0005466012 nova_compute[192063]: 2025-10-02 12:40:47.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:48 np0005466012 nova_compute[192063]: 2025-10-02 12:40:48.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:48 np0005466012 nova_compute[192063]: 2025-10-02 12:40:48.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:48 np0005466012 nova_compute[192063]: 2025-10-02 12:40:48.882 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:48 np0005466012 nova_compute[192063]: 2025-10-02 12:40:48.883 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:48 np0005466012 nova_compute[192063]: 2025-10-02 12:40:48.883 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:48 np0005466012 nova_compute[192063]: 2025-10-02 12:40:48.884 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:40:49 np0005466012 nova_compute[192063]: 2025-10-02 12:40:49.038 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:40:49 np0005466012 nova_compute[192063]: 2025-10-02 12:40:49.039 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5737MB free_disk=73.24261093139648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:40:49 np0005466012 nova_compute[192063]: 2025-10-02 12:40:49.039 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:40:49 np0005466012 nova_compute[192063]: 2025-10-02 12:40:49.040 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:40:49 np0005466012 nova_compute[192063]: 2025-10-02 12:40:49.193 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:40:49 np0005466012 nova_compute[192063]: 2025-10-02 12:40:49.194 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:40:49 np0005466012 nova_compute[192063]: 2025-10-02 12:40:49.397 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:40:49 np0005466012 nova_compute[192063]: 2025-10-02 12:40:49.471 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:40:49 np0005466012 nova_compute[192063]: 2025-10-02 12:40:49.527 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:40:49 np0005466012 nova_compute[192063]: 2025-10-02 12:40:49.528 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.488s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:40:51 np0005466012 nova_compute[192063]: 2025-10-02 12:40:51.529 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:51 np0005466012 nova_compute[192063]: 2025-10-02 12:40:51.530 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:40:51 np0005466012 nova_compute[192063]: 2025-10-02 12:40:51.530 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:40:51 np0005466012 nova_compute[192063]: 2025-10-02 12:40:51.574 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:40:52 np0005466012 nova_compute[192063]: 2025-10-02 12:40:52.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:52 np0005466012 nova_compute[192063]: 2025-10-02 12:40:52.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:52 np0005466012 nova_compute[192063]: 2025-10-02 12:40:52.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:40:52 np0005466012 nova_compute[192063]: 2025-10-02 12:40:52.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:53 np0005466012 nova_compute[192063]: 2025-10-02 12:40:53.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:55 np0005466012 podman[249581]: 2025-10-02 12:40:55.151186562 +0000 UTC m=+0.066820909 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Oct  2 08:40:55 np0005466012 podman[249583]: 2025-10-02 12:40:55.151322205 +0000 UTC m=+0.046984770 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:40:55 np0005466012 podman[249582]: 2025-10-02 12:40:55.152936941 +0000 UTC m=+0.055383717 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  2 08:40:55 np0005466012 podman[249589]: 2025-10-02 12:40:55.179452855 +0000 UTC m=+0.078128455 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller)
Oct  2 08:40:55 np0005466012 nova_compute[192063]: 2025-10-02 12:40:55.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:40:57 np0005466012 nova_compute[192063]: 2025-10-02 12:40:57.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:58 np0005466012 nova_compute[192063]: 2025-10-02 12:40:58.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:40:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:59.118 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:9b:5a 10.100.0.2 2001:db8::f816:3eff:fec1:9b5a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec1:9b5a/64', 'neutron:device_id': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4f50473-7465-4325-8b4d-bb57fca0162f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d90df5bc-8770-4be5-937c-0abfe33bbe11, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f69cd95a-5b20-4a47-8acc-7e190d1dac4c) old=Port_Binding(mac=['fa:16:3e:c1:9b:5a 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c4f50473-7465-4325-8b4d-bb57fca0162f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4f50473-7465-4325-8b4d-bb57fca0162f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd801958556f4c8aab047ecdef6b5ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:40:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:59.119 103246 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f69cd95a-5b20-4a47-8acc-7e190d1dac4c in datapath c4f50473-7465-4325-8b4d-bb57fca0162f updated#033[00m
Oct  2 08:40:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:59.120 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c4f50473-7465-4325-8b4d-bb57fca0162f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:40:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:40:59.121 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3d496fdf-912e-4635-9df9-ba4141a27aa0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:02.156 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:02.157 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:02.157 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:03 np0005466012 nova_compute[192063]: 2025-10-02 12:41:03.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:03 np0005466012 nova_compute[192063]: 2025-10-02 12:41:03.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:08 np0005466012 nova_compute[192063]: 2025-10-02 12:41:08.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:08 np0005466012 nova_compute[192063]: 2025-10-02 12:41:08.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:09 np0005466012 podman[249668]: 2025-10-02 12:41:09.152513751 +0000 UTC m=+0.068114294 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:41:09 np0005466012 podman[249669]: 2025-10-02 12:41:09.152641095 +0000 UTC m=+0.054820451 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., container_name=openstack_network_exporter)
Oct  2 08:41:13 np0005466012 nova_compute[192063]: 2025-10-02 12:41:13.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:13 np0005466012 podman[249710]: 2025-10-02 12:41:13.133044321 +0000 UTC m=+0.051268391 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:41:13 np0005466012 podman[249709]: 2025-10-02 12:41:13.161008516 +0000 UTC m=+0.072170028 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:41:13 np0005466012 nova_compute[192063]: 2025-10-02 12:41:13.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:41:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:41:18 np0005466012 nova_compute[192063]: 2025-10-02 12:41:18.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:18 np0005466012 nova_compute[192063]: 2025-10-02 12:41:18.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:19.424 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:41:19 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:19.424 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:41:19 np0005466012 nova_compute[192063]: 2025-10-02 12:41:19.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:41:21Z|00695|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  2 08:41:23 np0005466012 nova_compute[192063]: 2025-10-02 12:41:23.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:23 np0005466012 nova_compute[192063]: 2025-10-02 12:41:23.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:25.426 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:26 np0005466012 podman[249752]: 2025-10-02 12:41:26.172644465 +0000 UTC m=+0.078402743 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  2 08:41:26 np0005466012 podman[249753]: 2025-10-02 12:41:26.179912389 +0000 UTC m=+0.082335204 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:41:26 np0005466012 podman[249751]: 2025-10-02 12:41:26.180195068 +0000 UTC m=+0.089814475 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:41:26 np0005466012 podman[249754]: 2025-10-02 12:41:26.231478538 +0000 UTC m=+0.125810755 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:41:28 np0005466012 nova_compute[192063]: 2025-10-02 12:41:28.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:28 np0005466012 nova_compute[192063]: 2025-10-02 12:41:28.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:31 np0005466012 nova_compute[192063]: 2025-10-02 12:41:31.858 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "01ce1a65-2bfb-487a-9053-ddc724f94f57" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:31 np0005466012 nova_compute[192063]: 2025-10-02 12:41:31.860 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:31 np0005466012 nova_compute[192063]: 2025-10-02 12:41:31.892 2 DEBUG nova.compute.manager [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.116 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.117 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.125 2 DEBUG nova.virt.hardware [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.125 2 INFO nova.compute.claims [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.344 2 DEBUG nova.compute.provider_tree [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.371 2 DEBUG nova.scheduler.client.report [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.434 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.435 2 DEBUG nova.compute.manager [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.537 2 DEBUG nova.compute.manager [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.537 2 DEBUG nova.network.neutron [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.554 2 INFO nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.571 2 DEBUG nova.compute.manager [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.744 2 DEBUG nova.compute.manager [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.746 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.747 2 INFO nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Creating image(s)#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.748 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "/var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.749 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "/var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.750 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "/var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.768 2 DEBUG oslo_concurrency.processutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.838 2 DEBUG oslo_concurrency.processutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.840 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.840 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.852 2 DEBUG oslo_concurrency.processutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.899 2 DEBUG nova.policy [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a1898fdf056c4a249c33590f26d4d845', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.910 2 DEBUG oslo_concurrency.processutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.910 2 DEBUG oslo_concurrency.processutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.946 2 DEBUG oslo_concurrency.processutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.947 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:32 np0005466012 nova_compute[192063]: 2025-10-02 12:41:32.948 2 DEBUG oslo_concurrency.processutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:33 np0005466012 nova_compute[192063]: 2025-10-02 12:41:33.003 2 DEBUG oslo_concurrency.processutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:33 np0005466012 nova_compute[192063]: 2025-10-02 12:41:33.005 2 DEBUG nova.virt.disk.api [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Checking if we can resize image /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:41:33 np0005466012 nova_compute[192063]: 2025-10-02 12:41:33.006 2 DEBUG oslo_concurrency.processutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:33 np0005466012 nova_compute[192063]: 2025-10-02 12:41:33.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:33 np0005466012 nova_compute[192063]: 2025-10-02 12:41:33.106 2 DEBUG oslo_concurrency.processutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:33 np0005466012 nova_compute[192063]: 2025-10-02 12:41:33.107 2 DEBUG nova.virt.disk.api [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Cannot resize image /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:41:33 np0005466012 nova_compute[192063]: 2025-10-02 12:41:33.107 2 DEBUG nova.objects.instance [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'migration_context' on Instance uuid 01ce1a65-2bfb-487a-9053-ddc724f94f57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:41:33 np0005466012 nova_compute[192063]: 2025-10-02 12:41:33.126 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:41:33 np0005466012 nova_compute[192063]: 2025-10-02 12:41:33.126 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Ensure instance console log exists: /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:41:33 np0005466012 nova_compute[192063]: 2025-10-02 12:41:33.127 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:33 np0005466012 nova_compute[192063]: 2025-10-02 12:41:33.127 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:33 np0005466012 nova_compute[192063]: 2025-10-02 12:41:33.127 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:33 np0005466012 nova_compute[192063]: 2025-10-02 12:41:33.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:34 np0005466012 nova_compute[192063]: 2025-10-02 12:41:34.203 2 DEBUG nova.network.neutron [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Successfully created port: 7e283be0-771a-4cf4-858b-48a2944f1217 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:41:34 np0005466012 nova_compute[192063]: 2025-10-02 12:41:34.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:35 np0005466012 nova_compute[192063]: 2025-10-02 12:41:35.276 2 DEBUG nova.network.neutron [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Successfully updated port: 7e283be0-771a-4cf4-858b-48a2944f1217 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:41:35 np0005466012 nova_compute[192063]: 2025-10-02 12:41:35.381 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:35 np0005466012 nova_compute[192063]: 2025-10-02 12:41:35.382 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquired lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:35 np0005466012 nova_compute[192063]: 2025-10-02 12:41:35.382 2 DEBUG nova.network.neutron [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:41:35 np0005466012 nova_compute[192063]: 2025-10-02 12:41:35.522 2 DEBUG nova.compute.manager [req-672d7adb-78dc-4431-aa0e-a98e2c241138 req-68378569-dc47-49ea-aac4-420602e8187a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Received event network-changed-7e283be0-771a-4cf4-858b-48a2944f1217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:35 np0005466012 nova_compute[192063]: 2025-10-02 12:41:35.523 2 DEBUG nova.compute.manager [req-672d7adb-78dc-4431-aa0e-a98e2c241138 req-68378569-dc47-49ea-aac4-420602e8187a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Refreshing instance network info cache due to event network-changed-7e283be0-771a-4cf4-858b-48a2944f1217. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:41:35 np0005466012 nova_compute[192063]: 2025-10-02 12:41:35.524 2 DEBUG oslo_concurrency.lockutils [req-672d7adb-78dc-4431-aa0e-a98e2c241138 req-68378569-dc47-49ea-aac4-420602e8187a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:36 np0005466012 nova_compute[192063]: 2025-10-02 12:41:36.392 2 DEBUG nova.network.neutron [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.854 2 DEBUG nova.network.neutron [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Updating instance_info_cache with network_info: [{"id": "7e283be0-771a-4cf4-858b-48a2944f1217", "address": "fa:16:3e:eb:a6:b3", "network": {"id": "3a127238-c3fd-4117-ae39-3087c30f09a1", "bridge": "br-int", "label": "tempest-network-smoke--12525199", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e283be0-77", "ovs_interfaceid": "7e283be0-771a-4cf4-858b-48a2944f1217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.892 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Releasing lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.893 2 DEBUG nova.compute.manager [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Instance network_info: |[{"id": "7e283be0-771a-4cf4-858b-48a2944f1217", "address": "fa:16:3e:eb:a6:b3", "network": {"id": "3a127238-c3fd-4117-ae39-3087c30f09a1", "bridge": "br-int", "label": "tempest-network-smoke--12525199", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e283be0-77", "ovs_interfaceid": "7e283be0-771a-4cf4-858b-48a2944f1217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.894 2 DEBUG oslo_concurrency.lockutils [req-672d7adb-78dc-4431-aa0e-a98e2c241138 req-68378569-dc47-49ea-aac4-420602e8187a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.894 2 DEBUG nova.network.neutron [req-672d7adb-78dc-4431-aa0e-a98e2c241138 req-68378569-dc47-49ea-aac4-420602e8187a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Refreshing network info cache for port 7e283be0-771a-4cf4-858b-48a2944f1217 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.900 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Start _get_guest_xml network_info=[{"id": "7e283be0-771a-4cf4-858b-48a2944f1217", "address": "fa:16:3e:eb:a6:b3", "network": {"id": "3a127238-c3fd-4117-ae39-3087c30f09a1", "bridge": "br-int", "label": "tempest-network-smoke--12525199", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e283be0-77", "ovs_interfaceid": "7e283be0-771a-4cf4-858b-48a2944f1217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.906 2 WARNING nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.913 2 DEBUG nova.virt.libvirt.host [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.914 2 DEBUG nova.virt.libvirt.host [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.917 2 DEBUG nova.virt.libvirt.host [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.918 2 DEBUG nova.virt.libvirt.host [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.920 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.920 2 DEBUG nova.virt.hardware [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.920 2 DEBUG nova.virt.hardware [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.921 2 DEBUG nova.virt.hardware [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.921 2 DEBUG nova.virt.hardware [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.921 2 DEBUG nova.virt.hardware [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.922 2 DEBUG nova.virt.hardware [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.922 2 DEBUG nova.virt.hardware [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.922 2 DEBUG nova.virt.hardware [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.923 2 DEBUG nova.virt.hardware [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.923 2 DEBUG nova.virt.hardware [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.923 2 DEBUG nova.virt.hardware [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.927 2 DEBUG nova.virt.libvirt.vif [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:41:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1461123043',display_name='tempest-TestNetworkBasicOps-server-1461123043',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1461123043',id=173,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAvO4RGw4DIgFtHIyqmZkLqxdAQOPTfVmgvaWyqnxxvbrPF7PGkyIl9petezS+tMNVU1Z3ooiP9mFUEKy75+9J5O0xzF1/iTQ9vH6LXxGSWBne+Vu3cVEJEQc17VIoAChA==',key_name='tempest-TestNetworkBasicOps-1854914014',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-5z00msmc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:41:32Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=01ce1a65-2bfb-487a-9053-ddc724f94f57,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7e283be0-771a-4cf4-858b-48a2944f1217", "address": "fa:16:3e:eb:a6:b3", "network": {"id": "3a127238-c3fd-4117-ae39-3087c30f09a1", "bridge": "br-int", "label": "tempest-network-smoke--12525199", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e283be0-77", "ovs_interfaceid": "7e283be0-771a-4cf4-858b-48a2944f1217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.928 2 DEBUG nova.network.os_vif_util [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "7e283be0-771a-4cf4-858b-48a2944f1217", "address": "fa:16:3e:eb:a6:b3", "network": {"id": "3a127238-c3fd-4117-ae39-3087c30f09a1", "bridge": "br-int", "label": "tempest-network-smoke--12525199", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e283be0-77", "ovs_interfaceid": "7e283be0-771a-4cf4-858b-48a2944f1217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.929 2 DEBUG nova.network.os_vif_util [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:a6:b3,bridge_name='br-int',has_traffic_filtering=True,id=7e283be0-771a-4cf4-858b-48a2944f1217,network=Network(3a127238-c3fd-4117-ae39-3087c30f09a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e283be0-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.930 2 DEBUG nova.objects.instance [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 01ce1a65-2bfb-487a-9053-ddc724f94f57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.946 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  <uuid>01ce1a65-2bfb-487a-9053-ddc724f94f57</uuid>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  <name>instance-000000ad</name>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestNetworkBasicOps-server-1461123043</nova:name>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:41:37</nova:creationTime>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:41:37 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:        <nova:user uuid="a1898fdf056c4a249c33590f26d4d845">tempest-TestNetworkBasicOps-1323893370-project-member</nova:user>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:        <nova:project uuid="6e2a4899168a47618e377cb3ac85ddd2">tempest-TestNetworkBasicOps-1323893370</nova:project>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:        <nova:port uuid="7e283be0-771a-4cf4-858b-48a2944f1217">
Oct  2 08:41:37 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <entry name="serial">01ce1a65-2bfb-487a-9053-ddc724f94f57</entry>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <entry name="uuid">01ce1a65-2bfb-487a-9053-ddc724f94f57</entry>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk.config"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:eb:a6:b3"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <target dev="tap7e283be0-77"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/console.log" append="off"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:41:37 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:41:37 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:41:37 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:41:37 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.948 2 DEBUG nova.compute.manager [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Preparing to wait for external event network-vif-plugged-7e283be0-771a-4cf4-858b-48a2944f1217 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.948 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.949 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.949 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.950 2 DEBUG nova.virt.libvirt.vif [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:41:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1461123043',display_name='tempest-TestNetworkBasicOps-server-1461123043',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1461123043',id=173,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAvO4RGw4DIgFtHIyqmZkLqxdAQOPTfVmgvaWyqnxxvbrPF7PGkyIl9petezS+tMNVU1Z3ooiP9mFUEKy75+9J5O0xzF1/iTQ9vH6LXxGSWBne+Vu3cVEJEQc17VIoAChA==',key_name='tempest-TestNetworkBasicOps-1854914014',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-5z00msmc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:41:32Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=01ce1a65-2bfb-487a-9053-ddc724f94f57,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7e283be0-771a-4cf4-858b-48a2944f1217", "address": "fa:16:3e:eb:a6:b3", "network": {"id": "3a127238-c3fd-4117-ae39-3087c30f09a1", "bridge": "br-int", "label": "tempest-network-smoke--12525199", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e283be0-77", "ovs_interfaceid": "7e283be0-771a-4cf4-858b-48a2944f1217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.951 2 DEBUG nova.network.os_vif_util [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "7e283be0-771a-4cf4-858b-48a2944f1217", "address": "fa:16:3e:eb:a6:b3", "network": {"id": "3a127238-c3fd-4117-ae39-3087c30f09a1", "bridge": "br-int", "label": "tempest-network-smoke--12525199", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e283be0-77", "ovs_interfaceid": "7e283be0-771a-4cf4-858b-48a2944f1217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.952 2 DEBUG nova.network.os_vif_util [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:a6:b3,bridge_name='br-int',has_traffic_filtering=True,id=7e283be0-771a-4cf4-858b-48a2944f1217,network=Network(3a127238-c3fd-4117-ae39-3087c30f09a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e283be0-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.952 2 DEBUG os_vif [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:a6:b3,bridge_name='br-int',has_traffic_filtering=True,id=7e283be0-771a-4cf4-858b-48a2944f1217,network=Network(3a127238-c3fd-4117-ae39-3087c30f09a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e283be0-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.954 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.954 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.960 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e283be0-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.961 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7e283be0-77, col_values=(('external_ids', {'iface-id': '7e283be0-771a-4cf4-858b-48a2944f1217', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:a6:b3', 'vm-uuid': '01ce1a65-2bfb-487a-9053-ddc724f94f57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:37 np0005466012 NetworkManager[51207]: <info>  [1759408897.9657] manager: (tap7e283be0-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:37 np0005466012 nova_compute[192063]: 2025-10-02 12:41:37.971 2 INFO os_vif [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:a6:b3,bridge_name='br-int',has_traffic_filtering=True,id=7e283be0-771a-4cf4-858b-48a2944f1217,network=Network(3a127238-c3fd-4117-ae39-3087c30f09a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e283be0-77')#033[00m
Oct  2 08:41:38 np0005466012 nova_compute[192063]: 2025-10-02 12:41:38.024 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:38 np0005466012 nova_compute[192063]: 2025-10-02 12:41:38.024 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:41:38 np0005466012 nova_compute[192063]: 2025-10-02 12:41:38.025 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] No VIF found with MAC fa:16:3e:eb:a6:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:41:38 np0005466012 nova_compute[192063]: 2025-10-02 12:41:38.025 2 INFO nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Using config drive#033[00m
Oct  2 08:41:38 np0005466012 nova_compute[192063]: 2025-10-02 12:41:38.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:38 np0005466012 nova_compute[192063]: 2025-10-02 12:41:38.613 2 INFO nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Creating config drive at /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk.config#033[00m
Oct  2 08:41:38 np0005466012 nova_compute[192063]: 2025-10-02 12:41:38.624 2 DEBUG oslo_concurrency.processutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp51s60lio execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:38 np0005466012 nova_compute[192063]: 2025-10-02 12:41:38.760 2 DEBUG oslo_concurrency.processutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp51s60lio" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:38 np0005466012 nova_compute[192063]: 2025-10-02 12:41:38.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:38 np0005466012 kernel: tap7e283be0-77: entered promiscuous mode
Oct  2 08:41:38 np0005466012 NetworkManager[51207]: <info>  [1759408898.8563] manager: (tap7e283be0-77): new Tun device (/org/freedesktop/NetworkManager/Devices/326)
Oct  2 08:41:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:41:38Z|00696|binding|INFO|Claiming lport 7e283be0-771a-4cf4-858b-48a2944f1217 for this chassis.
Oct  2 08:41:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:41:38Z|00697|binding|INFO|7e283be0-771a-4cf4-858b-48a2944f1217: Claiming fa:16:3e:eb:a6:b3 10.100.0.10
Oct  2 08:41:38 np0005466012 nova_compute[192063]: 2025-10-02 12:41:38.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:38 np0005466012 nova_compute[192063]: 2025-10-02 12:41:38.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:38 np0005466012 NetworkManager[51207]: <info>  [1759408898.8738] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/327)
Oct  2 08:41:38 np0005466012 NetworkManager[51207]: <info>  [1759408898.8748] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Oct  2 08:41:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:38.880 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:a6:b3 10.100.0.10'], port_security=['fa:16:3e:eb:a6:b3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '01ce1a65-2bfb-487a-9053-ddc724f94f57', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a127238-c3fd-4117-ae39-3087c30f09a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5b11fdb8-96a0-4165-805b-d08939facca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dbca848-bd3c-415e-9cb9-ed4c61904df1, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=7e283be0-771a-4cf4-858b-48a2944f1217) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:41:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:38.902 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 7e283be0-771a-4cf4-858b-48a2944f1217 in datapath 3a127238-c3fd-4117-ae39-3087c30f09a1 bound to our chassis#033[00m
Oct  2 08:41:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:38.903 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3a127238-c3fd-4117-ae39-3087c30f09a1#033[00m
Oct  2 08:41:38 np0005466012 systemd-udevd[249865]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:41:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:38.920 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab15965-865b-4ddb-8043-73d4b36e2f07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:38.921 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3a127238-c1 in ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:41:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:38.925 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3a127238-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:41:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:38.925 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[89057a7b-5957-4fbd-b0cc-f21337d2c39e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:38.926 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bb0eb0c8-ef25-4c0e-81ac-3caad975465f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:38 np0005466012 systemd-machined[152114]: New machine qemu-78-instance-000000ad.
Oct  2 08:41:38 np0005466012 NetworkManager[51207]: <info>  [1759408898.9373] device (tap7e283be0-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:41:38 np0005466012 NetworkManager[51207]: <info>  [1759408898.9380] device (tap7e283be0-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:41:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:38.943 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[ca364744-31bf-4096-aa91-3cab17d58c4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:38.983 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e7766826-b495-48fb-91ff-464a81ffa488]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:38.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:39 np0005466012 systemd[1]: Started Virtual Machine qemu-78-instance-000000ad.
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.019 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[663823bf-fec7-4130-b8a9-2875cb7c38e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:39 np0005466012 ovn_controller[94284]: 2025-10-02T12:41:39Z|00698|binding|INFO|Setting lport 7e283be0-771a-4cf4-858b-48a2944f1217 ovn-installed in OVS
Oct  2 08:41:39 np0005466012 NetworkManager[51207]: <info>  [1759408899.0411] manager: (tap3a127238-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/329)
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.040 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f86b0592-9280-4664-9f5c-438e07a3fdc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:39 np0005466012 ovn_controller[94284]: 2025-10-02T12:41:39Z|00699|binding|INFO|Setting lport 7e283be0-771a-4cf4-858b-48a2944f1217 up in Southbound
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.077 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[35b291b8-1a04-4814-a630-5bb9f6d0e920]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.081 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[cacd26eb-bf0e-4a21-9ec1-b96b834faf1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:39 np0005466012 NetworkManager[51207]: <info>  [1759408899.1181] device (tap3a127238-c0): carrier: link connected
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.123 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[6e271172-5029-4025-8de8-45d95807fc57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.139 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[94cff427-8851-4eec-b231-a42bfd99bfe8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a127238-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:95:98:dc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 215], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689272, 'reachable_time': 43940, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249899, 'error': None, 'target': 'ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.154 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e80cd4eb-200d-446b-909f-13c855243866]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe95:98dc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 689272, 'tstamp': 689272}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249900, 'error': None, 'target': 'ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.171 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[b50667e9-2265-43c3-abcc-f37da33354d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a127238-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:95:98:dc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 215], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689272, 'reachable_time': 43940, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249901, 'error': None, 'target': 'ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.201 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb5954f-b24f-4fd5-8bf1-ccb35b3e08ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.255 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[78796aff-4c65-4c26-9a61-ad92d90617ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.256 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a127238-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.256 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:41:39 np0005466012 kernel: tap3a127238-c0: entered promiscuous mode
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.257 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a127238-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:39 np0005466012 NetworkManager[51207]: <info>  [1759408899.2591] manager: (tap3a127238-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/330)
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.264 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3a127238-c0, col_values=(('external_ids', {'iface-id': '5f0a8cdb-b85d-4bfc-8a2d-3f1f2d39612e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:41:39 np0005466012 ovn_controller[94284]: 2025-10-02T12:41:39Z|00700|binding|INFO|Releasing lport 5f0a8cdb-b85d-4bfc-8a2d-3f1f2d39612e from this chassis (sb_readonly=0)
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.280 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a127238-c3fd-4117-ae39-3087c30f09a1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a127238-c3fd-4117-ae39-3087c30f09a1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.281 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d1067a17-8f48-4d8e-9d91-af1f69b9dec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.283 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-3a127238-c3fd-4117-ae39-3087c30f09a1
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/3a127238-c3fd-4117-ae39-3087c30f09a1.pid.haproxy
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 3a127238-c3fd-4117-ae39-3087c30f09a1
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:41:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:41:39.284 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1', 'env', 'PROCESS_TAG=haproxy-3a127238-c3fd-4117-ae39-3087c30f09a1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3a127238-c3fd-4117-ae39-3087c30f09a1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.637 2 DEBUG nova.compute.manager [req-e38f7fde-2f0e-491b-879d-3259947eb0ee req-a38de55b-9adc-47de-b980-6bfb95bbda31 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Received event network-vif-plugged-7e283be0-771a-4cf4-858b-48a2944f1217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.638 2 DEBUG oslo_concurrency.lockutils [req-e38f7fde-2f0e-491b-879d-3259947eb0ee req-a38de55b-9adc-47de-b980-6bfb95bbda31 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.639 2 DEBUG oslo_concurrency.lockutils [req-e38f7fde-2f0e-491b-879d-3259947eb0ee req-a38de55b-9adc-47de-b980-6bfb95bbda31 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.639 2 DEBUG oslo_concurrency.lockutils [req-e38f7fde-2f0e-491b-879d-3259947eb0ee req-a38de55b-9adc-47de-b980-6bfb95bbda31 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.640 2 DEBUG nova.compute.manager [req-e38f7fde-2f0e-491b-879d-3259947eb0ee req-a38de55b-9adc-47de-b980-6bfb95bbda31 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Processing event network-vif-plugged-7e283be0-771a-4cf4-858b-48a2944f1217 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:41:39 np0005466012 podman[249940]: 2025-10-02 12:41:39.738947004 +0000 UTC m=+0.084542166 container create 09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:41:39 np0005466012 systemd[1]: Started libpod-conmon-09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e.scope.
Oct  2 08:41:39 np0005466012 podman[249940]: 2025-10-02 12:41:39.695670908 +0000 UTC m=+0.041266080 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:41:39 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:41:39 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22f18c358c4b8ba89c0dd60ba9f10f20d5aac6880196b5c96d52bcf97c4da3a7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:41:39 np0005466012 podman[249940]: 2025-10-02 12:41:39.816204354 +0000 UTC m=+0.161799526 container init 09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:39 np0005466012 podman[249940]: 2025-10-02 12:41:39.823778947 +0000 UTC m=+0.169374099 container start 09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:41:39 np0005466012 neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1[249959]: [NOTICE]   (249990) : New worker (250000) forked
Oct  2 08:41:39 np0005466012 neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1[249959]: [NOTICE]   (249990) : Loading success.
Oct  2 08:41:39 np0005466012 podman[249953]: 2025-10-02 12:41:39.850727494 +0000 UTC m=+0.070069750 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, managed_by=edpm_ansible)
Oct  2 08:41:39 np0005466012 podman[249956]: 2025-10-02 12:41:39.850833646 +0000 UTC m=+0.066034205 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.895 2 DEBUG nova.network.neutron [req-672d7adb-78dc-4431-aa0e-a98e2c241138 req-68378569-dc47-49ea-aac4-420602e8187a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Updated VIF entry in instance network info cache for port 7e283be0-771a-4cf4-858b-48a2944f1217. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.896 2 DEBUG nova.network.neutron [req-672d7adb-78dc-4431-aa0e-a98e2c241138 req-68378569-dc47-49ea-aac4-420602e8187a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Updating instance_info_cache with network_info: [{"id": "7e283be0-771a-4cf4-858b-48a2944f1217", "address": "fa:16:3e:eb:a6:b3", "network": {"id": "3a127238-c3fd-4117-ae39-3087c30f09a1", "bridge": "br-int", "label": "tempest-network-smoke--12525199", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e283be0-77", "ovs_interfaceid": "7e283be0-771a-4cf4-858b-48a2944f1217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:41:39 np0005466012 nova_compute[192063]: 2025-10-02 12:41:39.912 2 DEBUG oslo_concurrency.lockutils [req-672d7adb-78dc-4431-aa0e-a98e2c241138 req-68378569-dc47-49ea-aac4-420602e8187a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.074 2 DEBUG nova.compute.manager [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.076 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408900.0745144, 01ce1a65-2bfb-487a-9053-ddc724f94f57 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.076 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] VM Started (Lifecycle Event)#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.082 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.086 2 INFO nova.virt.libvirt.driver [-] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Instance spawned successfully.#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.086 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.108 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.108 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.109 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.109 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.109 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.110 2 DEBUG nova.virt.libvirt.driver [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.122 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.124 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.167 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.168 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408900.075397, 01ce1a65-2bfb-487a-9053-ddc724f94f57 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.168 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.187 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.192 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759408900.0815992, 01ce1a65-2bfb-487a-9053-ddc724f94f57 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.192 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.219 2 INFO nova.compute.manager [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Took 7.47 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.220 2 DEBUG nova.compute.manager [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.228 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.231 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.264 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.308 2 INFO nova.compute.manager [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Took 8.33 seconds to build instance.#033[00m
Oct  2 08:41:40 np0005466012 nova_compute[192063]: 2025-10-02 12:41:40.338 2 DEBUG oslo_concurrency.lockutils [None req-1f5bc3dc-bd0c-4668-9c45-fef833f02e3f a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.478s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:41 np0005466012 nova_compute[192063]: 2025-10-02 12:41:41.789 2 DEBUG nova.compute.manager [req-e1cd07a7-4d7a-470c-8f6a-e2499c82bf2c req-aedc8c1f-785f-4ba5-8576-bdc05f95dcf7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Received event network-vif-plugged-7e283be0-771a-4cf4-858b-48a2944f1217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:41 np0005466012 nova_compute[192063]: 2025-10-02 12:41:41.790 2 DEBUG oslo_concurrency.lockutils [req-e1cd07a7-4d7a-470c-8f6a-e2499c82bf2c req-aedc8c1f-785f-4ba5-8576-bdc05f95dcf7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:41 np0005466012 nova_compute[192063]: 2025-10-02 12:41:41.790 2 DEBUG oslo_concurrency.lockutils [req-e1cd07a7-4d7a-470c-8f6a-e2499c82bf2c req-aedc8c1f-785f-4ba5-8576-bdc05f95dcf7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:41 np0005466012 nova_compute[192063]: 2025-10-02 12:41:41.791 2 DEBUG oslo_concurrency.lockutils [req-e1cd07a7-4d7a-470c-8f6a-e2499c82bf2c req-aedc8c1f-785f-4ba5-8576-bdc05f95dcf7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:41 np0005466012 nova_compute[192063]: 2025-10-02 12:41:41.791 2 DEBUG nova.compute.manager [req-e1cd07a7-4d7a-470c-8f6a-e2499c82bf2c req-aedc8c1f-785f-4ba5-8576-bdc05f95dcf7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] No waiting events found dispatching network-vif-plugged-7e283be0-771a-4cf4-858b-48a2944f1217 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:41:41 np0005466012 nova_compute[192063]: 2025-10-02 12:41:41.792 2 WARNING nova.compute.manager [req-e1cd07a7-4d7a-470c-8f6a-e2499c82bf2c req-aedc8c1f-785f-4ba5-8576-bdc05f95dcf7 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Received unexpected event network-vif-plugged-7e283be0-771a-4cf4-858b-48a2944f1217 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:41:42 np0005466012 nova_compute[192063]: 2025-10-02 12:41:42.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:43 np0005466012 nova_compute[192063]: 2025-10-02 12:41:43.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:44 np0005466012 podman[250010]: 2025-10-02 12:41:44.180170525 +0000 UTC m=+0.076015326 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:41:44 np0005466012 podman[250009]: 2025-10-02 12:41:44.210725463 +0000 UTC m=+0.107499171 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:41:45 np0005466012 nova_compute[192063]: 2025-10-02 12:41:45.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:46 np0005466012 nova_compute[192063]: 2025-10-02 12:41:46.195 2 DEBUG nova.compute.manager [req-bd8d50b9-35ca-46ce-8b57-32125eaf1886 req-1649eab3-8cbf-49c1-a705-b0ab7e31cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Received event network-changed-7e283be0-771a-4cf4-858b-48a2944f1217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:41:46 np0005466012 nova_compute[192063]: 2025-10-02 12:41:46.196 2 DEBUG nova.compute.manager [req-bd8d50b9-35ca-46ce-8b57-32125eaf1886 req-1649eab3-8cbf-49c1-a705-b0ab7e31cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Refreshing instance network info cache due to event network-changed-7e283be0-771a-4cf4-858b-48a2944f1217. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:41:46 np0005466012 nova_compute[192063]: 2025-10-02 12:41:46.197 2 DEBUG oslo_concurrency.lockutils [req-bd8d50b9-35ca-46ce-8b57-32125eaf1886 req-1649eab3-8cbf-49c1-a705-b0ab7e31cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:46 np0005466012 nova_compute[192063]: 2025-10-02 12:41:46.197 2 DEBUG oslo_concurrency.lockutils [req-bd8d50b9-35ca-46ce-8b57-32125eaf1886 req-1649eab3-8cbf-49c1-a705-b0ab7e31cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:46 np0005466012 nova_compute[192063]: 2025-10-02 12:41:46.197 2 DEBUG nova.network.neutron [req-bd8d50b9-35ca-46ce-8b57-32125eaf1886 req-1649eab3-8cbf-49c1-a705-b0ab7e31cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Refreshing network info cache for port 7e283be0-771a-4cf4-858b-48a2944f1217 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:41:47 np0005466012 nova_compute[192063]: 2025-10-02 12:41:47.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:47 np0005466012 nova_compute[192063]: 2025-10-02 12:41:47.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:48 np0005466012 nova_compute[192063]: 2025-10-02 12:41:48.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:50 np0005466012 nova_compute[192063]: 2025-10-02 12:41:50.508 2 DEBUG nova.network.neutron [req-bd8d50b9-35ca-46ce-8b57-32125eaf1886 req-1649eab3-8cbf-49c1-a705-b0ab7e31cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Updated VIF entry in instance network info cache for port 7e283be0-771a-4cf4-858b-48a2944f1217. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:41:50 np0005466012 nova_compute[192063]: 2025-10-02 12:41:50.509 2 DEBUG nova.network.neutron [req-bd8d50b9-35ca-46ce-8b57-32125eaf1886 req-1649eab3-8cbf-49c1-a705-b0ab7e31cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Updating instance_info_cache with network_info: [{"id": "7e283be0-771a-4cf4-858b-48a2944f1217", "address": "fa:16:3e:eb:a6:b3", "network": {"id": "3a127238-c3fd-4117-ae39-3087c30f09a1", "bridge": "br-int", "label": "tempest-network-smoke--12525199", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e283be0-77", "ovs_interfaceid": "7e283be0-771a-4cf4-858b-48a2944f1217", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:41:50 np0005466012 nova_compute[192063]: 2025-10-02 12:41:50.535 2 DEBUG oslo_concurrency.lockutils [req-bd8d50b9-35ca-46ce-8b57-32125eaf1886 req-1649eab3-8cbf-49c1-a705-b0ab7e31cffe 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:41:50 np0005466012 nova_compute[192063]: 2025-10-02 12:41:50.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:50 np0005466012 nova_compute[192063]: 2025-10-02 12:41:50.849 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:50 np0005466012 nova_compute[192063]: 2025-10-02 12:41:50.850 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:50 np0005466012 nova_compute[192063]: 2025-10-02 12:41:50.851 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:50 np0005466012 nova_compute[192063]: 2025-10-02 12:41:50.851 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:41:50 np0005466012 nova_compute[192063]: 2025-10-02 12:41:50.935 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.026 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.028 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.095 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.266 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.268 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5530MB free_disk=73.24174499511719GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.268 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.268 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.364 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 01ce1a65-2bfb-487a-9053-ddc724f94f57 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.365 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.365 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.395 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.421 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.422 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.438 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.476 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.528 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.557 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.598 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:41:51 np0005466012 nova_compute[192063]: 2025-10-02 12:41:51.599 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:41:52 np0005466012 ovn_controller[94284]: 2025-10-02T12:41:52Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:eb:a6:b3 10.100.0.10
Oct  2 08:41:52 np0005466012 ovn_controller[94284]: 2025-10-02T12:41:52Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:eb:a6:b3 10.100.0.10
Oct  2 08:41:52 np0005466012 nova_compute[192063]: 2025-10-02 12:41:52.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:53 np0005466012 nova_compute[192063]: 2025-10-02 12:41:53.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:53 np0005466012 nova_compute[192063]: 2025-10-02 12:41:53.600 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:53 np0005466012 nova_compute[192063]: 2025-10-02 12:41:53.601 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:41:53 np0005466012 nova_compute[192063]: 2025-10-02 12:41:53.601 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:41:54 np0005466012 nova_compute[192063]: 2025-10-02 12:41:54.627 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:41:54 np0005466012 nova_compute[192063]: 2025-10-02 12:41:54.627 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:41:54 np0005466012 nova_compute[192063]: 2025-10-02 12:41:54.628 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:41:54 np0005466012 nova_compute[192063]: 2025-10-02 12:41:54.628 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 01ce1a65-2bfb-487a-9053-ddc724f94f57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:41:57 np0005466012 podman[250073]: 2025-10-02 12:41:57.156597503 +0000 UTC m=+0.062143586 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:41:57 np0005466012 podman[250072]: 2025-10-02 12:41:57.16181952 +0000 UTC m=+0.074998858 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  2 08:41:57 np0005466012 podman[250074]: 2025-10-02 12:41:57.180238227 +0000 UTC m=+0.072689662 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:41:57 np0005466012 podman[250075]: 2025-10-02 12:41:57.219883641 +0000 UTC m=+0.119314153 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:41:57 np0005466012 nova_compute[192063]: 2025-10-02 12:41:57.812 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Updating instance_info_cache with network_info: [{"id": "7e283be0-771a-4cf4-858b-48a2944f1217", "address": "fa:16:3e:eb:a6:b3", "network": {"id": "3a127238-c3fd-4117-ae39-3087c30f09a1", "bridge": "br-int", "label": "tempest-network-smoke--12525199", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e283be0-77", "ovs_interfaceid": "7e283be0-771a-4cf4-858b-48a2944f1217", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:41:57 np0005466012 nova_compute[192063]: 2025-10-02 12:41:57.831 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:41:57 np0005466012 nova_compute[192063]: 2025-10-02 12:41:57.832 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:41:57 np0005466012 nova_compute[192063]: 2025-10-02 12:41:57.832 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:57 np0005466012 nova_compute[192063]: 2025-10-02 12:41:57.833 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:41:57 np0005466012 nova_compute[192063]: 2025-10-02 12:41:57.833 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:41:57 np0005466012 nova_compute[192063]: 2025-10-02 12:41:57.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:41:58 np0005466012 nova_compute[192063]: 2025-10-02 12:41:58.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:01.731 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:42:01 np0005466012 nova_compute[192063]: 2025-10-02 12:42:01.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:01.732 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:42:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:02.158 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:02.159 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:02.160 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:02 np0005466012 nova_compute[192063]: 2025-10-02 12:42:02.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:03 np0005466012 nova_compute[192063]: 2025-10-02 12:42:03.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:06 np0005466012 nova_compute[192063]: 2025-10-02 12:42:06.939 2 DEBUG oslo_concurrency.lockutils [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "01ce1a65-2bfb-487a-9053-ddc724f94f57" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:06 np0005466012 nova_compute[192063]: 2025-10-02 12:42:06.940 2 DEBUG oslo_concurrency.lockutils [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:06 np0005466012 nova_compute[192063]: 2025-10-02 12:42:06.941 2 DEBUG oslo_concurrency.lockutils [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:06 np0005466012 nova_compute[192063]: 2025-10-02 12:42:06.941 2 DEBUG oslo_concurrency.lockutils [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:06 np0005466012 nova_compute[192063]: 2025-10-02 12:42:06.941 2 DEBUG oslo_concurrency.lockutils [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:06 np0005466012 nova_compute[192063]: 2025-10-02 12:42:06.958 2 INFO nova.compute.manager [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Terminating instance#033[00m
Oct  2 08:42:06 np0005466012 nova_compute[192063]: 2025-10-02 12:42:06.971 2 DEBUG nova.compute.manager [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:42:06 np0005466012 kernel: tap7e283be0-77 (unregistering): left promiscuous mode
Oct  2 08:42:06 np0005466012 NetworkManager[51207]: <info>  [1759408926.9956] device (tap7e283be0-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:42:07Z|00701|binding|INFO|Releasing lport 7e283be0-771a-4cf4-858b-48a2944f1217 from this chassis (sb_readonly=0)
Oct  2 08:42:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:42:07Z|00702|binding|INFO|Setting lport 7e283be0-771a-4cf4-858b-48a2944f1217 down in Southbound
Oct  2 08:42:07 np0005466012 ovn_controller[94284]: 2025-10-02T12:42:07Z|00703|binding|INFO|Removing iface tap7e283be0-77 ovn-installed in OVS
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.024 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:a6:b3 10.100.0.10'], port_security=['fa:16:3e:eb:a6:b3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '01ce1a65-2bfb-487a-9053-ddc724f94f57', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a127238-c3fd-4117-ae39-3087c30f09a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e2a4899168a47618e377cb3ac85ddd2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5b11fdb8-96a0-4165-805b-d08939facca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5dbca848-bd3c-415e-9cb9-ed4c61904df1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=7e283be0-771a-4cf4-858b-48a2944f1217) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.026 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 7e283be0-771a-4cf4-858b-48a2944f1217 in datapath 3a127238-c3fd-4117-ae39-3087c30f09a1 unbound from our chassis#033[00m
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.028 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a127238-c3fd-4117-ae39-3087c30f09a1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.030 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6b772bdc-2586-4cd3-bc4b-5e0c2212e8d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.031 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1 namespace which is not needed anymore#033[00m
Oct  2 08:42:07 np0005466012 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000ad.scope: Deactivated successfully.
Oct  2 08:42:07 np0005466012 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000ad.scope: Consumed 14.393s CPU time.
Oct  2 08:42:07 np0005466012 systemd-machined[152114]: Machine qemu-78-instance-000000ad terminated.
Oct  2 08:42:07 np0005466012 neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1[249959]: [NOTICE]   (249990) : haproxy version is 2.8.14-c23fe91
Oct  2 08:42:07 np0005466012 neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1[249959]: [NOTICE]   (249990) : path to executable is /usr/sbin/haproxy
Oct  2 08:42:07 np0005466012 neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1[249959]: [WARNING]  (249990) : Exiting Master process...
Oct  2 08:42:07 np0005466012 neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1[249959]: [ALERT]    (249990) : Current worker (250000) exited with code 143 (Terminated)
Oct  2 08:42:07 np0005466012 neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1[249959]: [WARNING]  (249990) : All workers exited. Exiting... (0)
Oct  2 08:42:07 np0005466012 systemd[1]: libpod-09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e.scope: Deactivated successfully.
Oct  2 08:42:07 np0005466012 conmon[249959]: conmon 09a9d02e3344b63fcc12 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e.scope/container/memory.events
Oct  2 08:42:07 np0005466012 podman[250181]: 2025-10-02 12:42:07.176095089 +0000 UTC m=+0.051890259 container died 09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:42:07 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e-userdata-shm.mount: Deactivated successfully.
Oct  2 08:42:07 np0005466012 systemd[1]: var-lib-containers-storage-overlay-22f18c358c4b8ba89c0dd60ba9f10f20d5aac6880196b5c96d52bcf97c4da3a7-merged.mount: Deactivated successfully.
Oct  2 08:42:07 np0005466012 podman[250181]: 2025-10-02 12:42:07.226415533 +0000 UTC m=+0.102210693 container cleanup 09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.237 2 INFO nova.virt.libvirt.driver [-] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Instance destroyed successfully.#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.238 2 DEBUG nova.objects.instance [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lazy-loading 'resources' on Instance uuid 01ce1a65-2bfb-487a-9053-ddc724f94f57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:42:07 np0005466012 systemd[1]: libpod-conmon-09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e.scope: Deactivated successfully.
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.250 2 DEBUG nova.virt.libvirt.vif [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:41:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1461123043',display_name='tempest-TestNetworkBasicOps-server-1461123043',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1461123043',id=173,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAvO4RGw4DIgFtHIyqmZkLqxdAQOPTfVmgvaWyqnxxvbrPF7PGkyIl9petezS+tMNVU1Z3ooiP9mFUEKy75+9J5O0xzF1/iTQ9vH6LXxGSWBne+Vu3cVEJEQc17VIoAChA==',key_name='tempest-TestNetworkBasicOps-1854914014',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:41:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e2a4899168a47618e377cb3ac85ddd2',ramdisk_id='',reservation_id='r-5z00msmc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1323893370',owner_user_name='tempest-TestNetworkBasicOps-1323893370-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:41:40Z,user_data=None,user_id='a1898fdf056c4a249c33590f26d4d845',uuid=01ce1a65-2bfb-487a-9053-ddc724f94f57,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7e283be0-771a-4cf4-858b-48a2944f1217", "address": "fa:16:3e:eb:a6:b3", "network": {"id": "3a127238-c3fd-4117-ae39-3087c30f09a1", "bridge": "br-int", "label": "tempest-network-smoke--12525199", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e283be0-77", "ovs_interfaceid": "7e283be0-771a-4cf4-858b-48a2944f1217", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.250 2 DEBUG nova.network.os_vif_util [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converting VIF {"id": "7e283be0-771a-4cf4-858b-48a2944f1217", "address": "fa:16:3e:eb:a6:b3", "network": {"id": "3a127238-c3fd-4117-ae39-3087c30f09a1", "bridge": "br-int", "label": "tempest-network-smoke--12525199", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e2a4899168a47618e377cb3ac85ddd2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e283be0-77", "ovs_interfaceid": "7e283be0-771a-4cf4-858b-48a2944f1217", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.251 2 DEBUG nova.network.os_vif_util [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:eb:a6:b3,bridge_name='br-int',has_traffic_filtering=True,id=7e283be0-771a-4cf4-858b-48a2944f1217,network=Network(3a127238-c3fd-4117-ae39-3087c30f09a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e283be0-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.252 2 DEBUG os_vif [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:a6:b3,bridge_name='br-int',has_traffic_filtering=True,id=7e283be0-771a-4cf4-858b-48a2944f1217,network=Network(3a127238-c3fd-4117-ae39-3087c30f09a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e283be0-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.254 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e283be0-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.259 2 INFO os_vif [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:a6:b3,bridge_name='br-int',has_traffic_filtering=True,id=7e283be0-771a-4cf4-858b-48a2944f1217,network=Network(3a127238-c3fd-4117-ae39-3087c30f09a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e283be0-77')#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.259 2 INFO nova.virt.libvirt.driver [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Deleting instance files /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57_del#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.260 2 INFO nova.virt.libvirt.driver [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Deletion of /var/lib/nova/instances/01ce1a65-2bfb-487a-9053-ddc724f94f57_del complete#033[00m
Oct  2 08:42:07 np0005466012 podman[250227]: 2025-10-02 12:42:07.299494396 +0000 UTC m=+0.046127327 container remove 09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.305 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[49f1a781-db65-49ec-b42f-fa83cc5c89da]: (4, ('Thu Oct  2 12:42:07 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1 (09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e)\n09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e\nThu Oct  2 12:42:07 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1 (09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e)\n09a9d02e3344b63fcc126b1037d820f5bfb936aa93525589cdc73e71400e251e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.307 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fddec38b-ad4d-42b2-a587-c86b71e4ede4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.308 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a127238-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:07 np0005466012 kernel: tap3a127238-c0: left promiscuous mode
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.325 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3abbb48a-b6e1-46dc-9ded-3fcb6279f999]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.335 2 INFO nova.compute.manager [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.336 2 DEBUG oslo.service.loopingcall [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.337 2 DEBUG nova.compute.manager [-] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.337 2 DEBUG nova.network.neutron [-] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.351 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3d696b25-f377-4f58-9cda-ef21dfcc97b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.352 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cb1b5a20-b42c-495f-9adf-623d1912cd70]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.367 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6c411b67-c5a0-492b-98bc-873b6e260352]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689262, 'reachable_time': 34139, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250240, 'error': None, 'target': 'ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:07 np0005466012 systemd[1]: run-netns-ovnmeta\x2d3a127238\x2dc3fd\x2d4117\x2dae39\x2d3087c30f09a1.mount: Deactivated successfully.
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.369 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3a127238-c3fd-4117-ae39-3087c30f09a1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:42:07 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:07.369 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[ba492ba7-dce2-4158-821d-d92976fae791]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.547 2 DEBUG nova.compute.manager [req-cee22d28-14d6-4b6f-bac6-cb02dcf3eabb req-7d8ecae5-42fd-4490-a88f-ec2a0ece06f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Received event network-vif-unplugged-7e283be0-771a-4cf4-858b-48a2944f1217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.547 2 DEBUG oslo_concurrency.lockutils [req-cee22d28-14d6-4b6f-bac6-cb02dcf3eabb req-7d8ecae5-42fd-4490-a88f-ec2a0ece06f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.548 2 DEBUG oslo_concurrency.lockutils [req-cee22d28-14d6-4b6f-bac6-cb02dcf3eabb req-7d8ecae5-42fd-4490-a88f-ec2a0ece06f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.549 2 DEBUG oslo_concurrency.lockutils [req-cee22d28-14d6-4b6f-bac6-cb02dcf3eabb req-7d8ecae5-42fd-4490-a88f-ec2a0ece06f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.549 2 DEBUG nova.compute.manager [req-cee22d28-14d6-4b6f-bac6-cb02dcf3eabb req-7d8ecae5-42fd-4490-a88f-ec2a0ece06f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] No waiting events found dispatching network-vif-unplugged-7e283be0-771a-4cf4-858b-48a2944f1217 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.550 2 DEBUG nova.compute.manager [req-cee22d28-14d6-4b6f-bac6-cb02dcf3eabb req-7d8ecae5-42fd-4490-a88f-ec2a0ece06f5 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Received event network-vif-unplugged-7e283be0-771a-4cf4-858b-48a2944f1217 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.876 2 DEBUG nova.compute.manager [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Received event network-changed-7e283be0-771a-4cf4-858b-48a2944f1217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.877 2 DEBUG nova.compute.manager [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Refreshing instance network info cache due to event network-changed-7e283be0-771a-4cf4-858b-48a2944f1217. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.877 2 DEBUG oslo_concurrency.lockutils [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.877 2 DEBUG oslo_concurrency.lockutils [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.877 2 DEBUG nova.network.neutron [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Refreshing network info cache for port 7e283be0-771a-4cf4-858b-48a2944f1217 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.916 2 DEBUG nova.network.neutron [-] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:07 np0005466012 nova_compute[192063]: 2025-10-02 12:42:07.932 2 INFO nova.compute.manager [-] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Took 0.60 seconds to deallocate network for instance.#033[00m
Oct  2 08:42:08 np0005466012 nova_compute[192063]: 2025-10-02 12:42:08.002 2 DEBUG oslo_concurrency.lockutils [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:08 np0005466012 nova_compute[192063]: 2025-10-02 12:42:08.002 2 DEBUG oslo_concurrency.lockutils [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:08 np0005466012 nova_compute[192063]: 2025-10-02 12:42:08.023 2 INFO nova.network.neutron [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Port 7e283be0-771a-4cf4-858b-48a2944f1217 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  2 08:42:08 np0005466012 nova_compute[192063]: 2025-10-02 12:42:08.024 2 DEBUG nova.network.neutron [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:42:08 np0005466012 nova_compute[192063]: 2025-10-02 12:42:08.039 2 DEBUG oslo_concurrency.lockutils [req-901d2ad6-21f8-4533-bb39-33acafe8ca15 req-80ba738d-bdf0-496b-a246-ccf7c23060d2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-01ce1a65-2bfb-487a-9053-ddc724f94f57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:42:08 np0005466012 nova_compute[192063]: 2025-10-02 12:42:08.051 2 DEBUG nova.compute.provider_tree [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:42:08 np0005466012 nova_compute[192063]: 2025-10-02 12:42:08.063 2 DEBUG nova.scheduler.client.report [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:42:08 np0005466012 nova_compute[192063]: 2025-10-02 12:42:08.083 2 DEBUG oslo_concurrency.lockutils [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:08 np0005466012 nova_compute[192063]: 2025-10-02 12:42:08.103 2 INFO nova.scheduler.client.report [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Deleted allocations for instance 01ce1a65-2bfb-487a-9053-ddc724f94f57#033[00m
Oct  2 08:42:08 np0005466012 nova_compute[192063]: 2025-10-02 12:42:08.171 2 DEBUG oslo_concurrency.lockutils [None req-18bf1d4c-57d2-4577-8bce-efab1c8d87cb a1898fdf056c4a249c33590f26d4d845 6e2a4899168a47618e377cb3ac85ddd2 - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:08 np0005466012 nova_compute[192063]: 2025-10-02 12:42:08.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:09 np0005466012 nova_compute[192063]: 2025-10-02 12:42:09.631 2 DEBUG nova.compute.manager [req-34e2f8d2-f318-4ea5-8c52-23d70e558bad req-eb550559-f2c5-47df-b738-80cc2f369d2b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Received event network-vif-plugged-7e283be0-771a-4cf4-858b-48a2944f1217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:09 np0005466012 nova_compute[192063]: 2025-10-02 12:42:09.632 2 DEBUG oslo_concurrency.lockutils [req-34e2f8d2-f318-4ea5-8c52-23d70e558bad req-eb550559-f2c5-47df-b738-80cc2f369d2b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:09 np0005466012 nova_compute[192063]: 2025-10-02 12:42:09.632 2 DEBUG oslo_concurrency.lockutils [req-34e2f8d2-f318-4ea5-8c52-23d70e558bad req-eb550559-f2c5-47df-b738-80cc2f369d2b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:09 np0005466012 nova_compute[192063]: 2025-10-02 12:42:09.632 2 DEBUG oslo_concurrency.lockutils [req-34e2f8d2-f318-4ea5-8c52-23d70e558bad req-eb550559-f2c5-47df-b738-80cc2f369d2b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "01ce1a65-2bfb-487a-9053-ddc724f94f57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:09 np0005466012 nova_compute[192063]: 2025-10-02 12:42:09.633 2 DEBUG nova.compute.manager [req-34e2f8d2-f318-4ea5-8c52-23d70e558bad req-eb550559-f2c5-47df-b738-80cc2f369d2b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] No waiting events found dispatching network-vif-plugged-7e283be0-771a-4cf4-858b-48a2944f1217 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:42:09 np0005466012 nova_compute[192063]: 2025-10-02 12:42:09.633 2 WARNING nova.compute.manager [req-34e2f8d2-f318-4ea5-8c52-23d70e558bad req-eb550559-f2c5-47df-b738-80cc2f369d2b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Received unexpected event network-vif-plugged-7e283be0-771a-4cf4-858b-48a2944f1217 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:42:10 np0005466012 nova_compute[192063]: 2025-10-02 12:42:10.004 2 DEBUG nova.compute.manager [req-cf67ec7c-557c-47de-a154-28c91da5ac38 req-60b124db-e816-4666-9de5-1ec0a2a97025 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Received event network-vif-deleted-7e283be0-771a-4cf4-858b-48a2944f1217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:42:10 np0005466012 podman[250241]: 2025-10-02 12:42:10.151490004 +0000 UTC m=+0.065536613 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:42:10 np0005466012 podman[250242]: 2025-10-02 12:42:10.152521332 +0000 UTC m=+0.067058564 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Oct  2 08:42:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:10.733 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:12 np0005466012 nova_compute[192063]: 2025-10-02 12:42:12.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:13 np0005466012 nova_compute[192063]: 2025-10-02 12:42:13.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:14 np0005466012 nova_compute[192063]: 2025-10-02 12:42:14.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:14 np0005466012 nova_compute[192063]: 2025-10-02 12:42:14.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:15 np0005466012 podman[250286]: 2025-10-02 12:42:15.142476528 +0000 UTC m=+0.051233082 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:42:15 np0005466012 podman[250285]: 2025-10-02 12:42:15.17563655 +0000 UTC m=+0.087429389 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3)
Oct  2 08:42:17 np0005466012 nova_compute[192063]: 2025-10-02 12:42:17.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:18 np0005466012 nova_compute[192063]: 2025-10-02 12:42:18.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:22 np0005466012 nova_compute[192063]: 2025-10-02 12:42:22.235 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759408927.2345626, 01ce1a65-2bfb-487a-9053-ddc724f94f57 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:42:22 np0005466012 nova_compute[192063]: 2025-10-02 12:42:22.235 2 INFO nova.compute.manager [-] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:42:22 np0005466012 nova_compute[192063]: 2025-10-02 12:42:22.255 2 DEBUG nova.compute.manager [None req-150ab928-68ec-45c7-97b4-58851b30b202 - - - - - -] [instance: 01ce1a65-2bfb-487a-9053-ddc724f94f57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:42:22 np0005466012 nova_compute[192063]: 2025-10-02 12:42:22.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:23 np0005466012 nova_compute[192063]: 2025-10-02 12:42:23.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:27 np0005466012 nova_compute[192063]: 2025-10-02 12:42:27.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:28 np0005466012 podman[250327]: 2025-10-02 12:42:28.171560216 +0000 UTC m=+0.082749365 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  2 08:42:28 np0005466012 podman[250328]: 2025-10-02 12:42:28.207334051 +0000 UTC m=+0.103201680 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:42:28 np0005466012 podman[250329]: 2025-10-02 12:42:28.20764354 +0000 UTC m=+0.105131974 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:42:28 np0005466012 podman[250326]: 2025-10-02 12:42:28.210305925 +0000 UTC m=+0.116726310 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:42:28 np0005466012 nova_compute[192063]: 2025-10-02 12:42:28.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:32 np0005466012 nova_compute[192063]: 2025-10-02 12:42:32.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:33 np0005466012 nova_compute[192063]: 2025-10-02 12:42:33.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:34 np0005466012 nova_compute[192063]: 2025-10-02 12:42:34.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:37 np0005466012 nova_compute[192063]: 2025-10-02 12:42:37.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:38 np0005466012 nova_compute[192063]: 2025-10-02 12:42:38.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:39 np0005466012 nova_compute[192063]: 2025-10-02 12:42:39.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:40.207 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:42:40 np0005466012 nova_compute[192063]: 2025-10-02 12:42:40.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:40 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:40.208 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:42:40 np0005466012 nova_compute[192063]: 2025-10-02 12:42:40.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:41 np0005466012 podman[250416]: 2025-10-02 12:42:41.152644165 +0000 UTC m=+0.062762744 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, maintainer=Red Hat, Inc.)
Oct  2 08:42:41 np0005466012 podman[250415]: 2025-10-02 12:42:41.166811373 +0000 UTC m=+0.070679016 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:42:42 np0005466012 nova_compute[192063]: 2025-10-02 12:42:42.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:43 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:42:43.210 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:42:43 np0005466012 nova_compute[192063]: 2025-10-02 12:42:43.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:45 np0005466012 nova_compute[192063]: 2025-10-02 12:42:45.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:46 np0005466012 podman[250461]: 2025-10-02 12:42:46.146797819 +0000 UTC m=+0.057690212 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:42:46 np0005466012 podman[250460]: 2025-10-02 12:42:46.184510888 +0000 UTC m=+0.101286726 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:42:47 np0005466012 nova_compute[192063]: 2025-10-02 12:42:47.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:48 np0005466012 nova_compute[192063]: 2025-10-02 12:42:48.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:48 np0005466012 nova_compute[192063]: 2025-10-02 12:42:48.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:52 np0005466012 nova_compute[192063]: 2025-10-02 12:42:52.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:52 np0005466012 nova_compute[192063]: 2025-10-02 12:42:52.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:52 np0005466012 nova_compute[192063]: 2025-10-02 12:42:52.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:52 np0005466012 nova_compute[192063]: 2025-10-02 12:42:52.848 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:52 np0005466012 nova_compute[192063]: 2025-10-02 12:42:52.849 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:52 np0005466012 nova_compute[192063]: 2025-10-02 12:42:52.849 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:52 np0005466012 nova_compute[192063]: 2025-10-02 12:42:52.850 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:42:53 np0005466012 nova_compute[192063]: 2025-10-02 12:42:53.009 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:42:53 np0005466012 nova_compute[192063]: 2025-10-02 12:42:53.010 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5728MB free_disk=73.24260330200195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:42:53 np0005466012 nova_compute[192063]: 2025-10-02 12:42:53.010 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:42:53 np0005466012 nova_compute[192063]: 2025-10-02 12:42:53.010 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:42:53 np0005466012 nova_compute[192063]: 2025-10-02 12:42:53.054 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:42:53 np0005466012 nova_compute[192063]: 2025-10-02 12:42:53.055 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:42:53 np0005466012 nova_compute[192063]: 2025-10-02 12:42:53.072 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:42:53 np0005466012 nova_compute[192063]: 2025-10-02 12:42:53.086 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:42:53 np0005466012 nova_compute[192063]: 2025-10-02 12:42:53.107 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:42:53 np0005466012 nova_compute[192063]: 2025-10-02 12:42:53.108 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:42:53 np0005466012 nova_compute[192063]: 2025-10-02 12:42:53.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:55 np0005466012 nova_compute[192063]: 2025-10-02 12:42:55.108 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:55 np0005466012 nova_compute[192063]: 2025-10-02 12:42:55.108 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:42:55 np0005466012 nova_compute[192063]: 2025-10-02 12:42:55.108 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:42:55 np0005466012 nova_compute[192063]: 2025-10-02 12:42:55.127 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:42:55 np0005466012 nova_compute[192063]: 2025-10-02 12:42:55.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:55 np0005466012 nova_compute[192063]: 2025-10-02 12:42:55.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:42:57 np0005466012 nova_compute[192063]: 2025-10-02 12:42:57.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:57 np0005466012 nova_compute[192063]: 2025-10-02 12:42:57.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:42:58 np0005466012 nova_compute[192063]: 2025-10-02 12:42:58.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:42:59 np0005466012 podman[250509]: 2025-10-02 12:42:59.176861845 +0000 UTC m=+0.080670117 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:42:59 np0005466012 podman[250507]: 2025-10-02 12:42:59.182273257 +0000 UTC m=+0.094261079 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3)
Oct  2 08:42:59 np0005466012 podman[250508]: 2025-10-02 12:42:59.18274011 +0000 UTC m=+0.092292994 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:42:59 np0005466012 podman[250510]: 2025-10-02 12:42:59.189430088 +0000 UTC m=+0.099175117 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, config_id=ovn_controller)
Oct  2 08:43:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:02.159 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:02.159 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:02.160 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:02 np0005466012 nova_compute[192063]: 2025-10-02 12:43:02.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:03 np0005466012 nova_compute[192063]: 2025-10-02 12:43:03.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:07 np0005466012 nova_compute[192063]: 2025-10-02 12:43:07.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:08 np0005466012 nova_compute[192063]: 2025-10-02 12:43:08.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:12 np0005466012 podman[250593]: 2025-10-02 12:43:12.151874255 +0000 UTC m=+0.060850710 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7)
Oct  2 08:43:12 np0005466012 podman[250592]: 2025-10-02 12:43:12.156621919 +0000 UTC m=+0.070372339 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:43:12 np0005466012 nova_compute[192063]: 2025-10-02 12:43:12.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:13 np0005466012 nova_compute[192063]: 2025-10-02 12:43:13.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:14 np0005466012 nova_compute[192063]: 2025-10-02 12:43:14.876 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "48f5c1dd-1059-42ea-94ab-15808efc147b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:14 np0005466012 nova_compute[192063]: 2025-10-02 12:43:14.876 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:14 np0005466012 nova_compute[192063]: 2025-10-02 12:43:14.892 2 DEBUG nova.compute.manager [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.044 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.044 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.054 2 DEBUG nova.virt.hardware [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.054 2 INFO nova.compute.claims [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.249 2 DEBUG nova.compute.provider_tree [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.274 2 DEBUG nova.scheduler.client.report [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.295 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.296 2 DEBUG nova.compute.manager [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.361 2 DEBUG nova.compute.manager [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.361 2 DEBUG nova.network.neutron [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.378 2 INFO nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.399 2 DEBUG nova.compute.manager [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.510 2 DEBUG nova.compute.manager [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.511 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.512 2 INFO nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Creating image(s)#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.512 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "/var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.513 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "/var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.513 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "/var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.525 2 DEBUG oslo_concurrency.processutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.586 2 DEBUG oslo_concurrency.processutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.587 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.588 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.601 2 DEBUG oslo_concurrency.processutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.661 2 DEBUG oslo_concurrency.processutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.662 2 DEBUG oslo_concurrency.processutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.710 2 DEBUG oslo_concurrency.processutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.711 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.711 2 DEBUG oslo_concurrency.processutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.784 2 DEBUG oslo_concurrency.processutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.785 2 DEBUG nova.virt.disk.api [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Checking if we can resize image /var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.786 2 DEBUG oslo_concurrency.processutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.844 2 DEBUG oslo_concurrency.processutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.845 2 DEBUG nova.virt.disk.api [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Cannot resize image /var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.846 2 DEBUG nova.objects.instance [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 48f5c1dd-1059-42ea-94ab-15808efc147b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.864 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.864 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Ensure instance console log exists: /var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.865 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.865 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:15 np0005466012 nova_compute[192063]: 2025-10-02 12:43:15.865 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:16 np0005466012 nova_compute[192063]: 2025-10-02 12:43:16.705 2 DEBUG nova.network.neutron [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Successfully created port: 252c8b47-de4a-47c5-a2e3-02b99df6331e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:43:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:43:17 np0005466012 podman[250645]: 2025-10-02 12:43:17.134944739 +0000 UTC m=+0.052012353 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:43:17 np0005466012 podman[250646]: 2025-10-02 12:43:17.164713594 +0000 UTC m=+0.075649186 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:43:17 np0005466012 nova_compute[192063]: 2025-10-02 12:43:17.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:18 np0005466012 nova_compute[192063]: 2025-10-02 12:43:18.168 2 DEBUG nova.network.neutron [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Successfully updated port: 252c8b47-de4a-47c5-a2e3-02b99df6331e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:43:18 np0005466012 nova_compute[192063]: 2025-10-02 12:43:18.181 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "refresh_cache-48f5c1dd-1059-42ea-94ab-15808efc147b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:18 np0005466012 nova_compute[192063]: 2025-10-02 12:43:18.181 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquired lock "refresh_cache-48f5c1dd-1059-42ea-94ab-15808efc147b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:18 np0005466012 nova_compute[192063]: 2025-10-02 12:43:18.181 2 DEBUG nova.network.neutron [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:43:18 np0005466012 nova_compute[192063]: 2025-10-02 12:43:18.336 2 DEBUG nova.compute.manager [req-3e04d664-b1c7-48b4-9395-c4f99e6e2774 req-71c791f1-f4ca-453b-9dec-7aded70fb32c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Received event network-changed-252c8b47-de4a-47c5-a2e3-02b99df6331e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:18 np0005466012 nova_compute[192063]: 2025-10-02 12:43:18.336 2 DEBUG nova.compute.manager [req-3e04d664-b1c7-48b4-9395-c4f99e6e2774 req-71c791f1-f4ca-453b-9dec-7aded70fb32c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Refreshing instance network info cache due to event network-changed-252c8b47-de4a-47c5-a2e3-02b99df6331e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:43:18 np0005466012 nova_compute[192063]: 2025-10-02 12:43:18.337 2 DEBUG oslo_concurrency.lockutils [req-3e04d664-b1c7-48b4-9395-c4f99e6e2774 req-71c791f1-f4ca-453b-9dec-7aded70fb32c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-48f5c1dd-1059-42ea-94ab-15808efc147b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:18 np0005466012 nova_compute[192063]: 2025-10-02 12:43:18.384 2 DEBUG nova.network.neutron [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:43:18 np0005466012 nova_compute[192063]: 2025-10-02 12:43:18.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.446 2 DEBUG nova.network.neutron [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Updating instance_info_cache with network_info: [{"id": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "address": "fa:16:3e:82:f5:fc", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252c8b47-de", "ovs_interfaceid": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.472 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Releasing lock "refresh_cache-48f5c1dd-1059-42ea-94ab-15808efc147b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.473 2 DEBUG nova.compute.manager [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Instance network_info: |[{"id": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "address": "fa:16:3e:82:f5:fc", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252c8b47-de", "ovs_interfaceid": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.474 2 DEBUG oslo_concurrency.lockutils [req-3e04d664-b1c7-48b4-9395-c4f99e6e2774 req-71c791f1-f4ca-453b-9dec-7aded70fb32c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-48f5c1dd-1059-42ea-94ab-15808efc147b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.474 2 DEBUG nova.network.neutron [req-3e04d664-b1c7-48b4-9395-c4f99e6e2774 req-71c791f1-f4ca-453b-9dec-7aded70fb32c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Refreshing network info cache for port 252c8b47-de4a-47c5-a2e3-02b99df6331e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.477 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Start _get_guest_xml network_info=[{"id": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "address": "fa:16:3e:82:f5:fc", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252c8b47-de", "ovs_interfaceid": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.482 2 WARNING nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.487 2 DEBUG nova.virt.libvirt.host [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.488 2 DEBUG nova.virt.libvirt.host [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.496 2 DEBUG nova.virt.libvirt.host [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.497 2 DEBUG nova.virt.libvirt.host [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.499 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.500 2 DEBUG nova.virt.hardware [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.500 2 DEBUG nova.virt.hardware [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.501 2 DEBUG nova.virt.hardware [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.501 2 DEBUG nova.virt.hardware [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.502 2 DEBUG nova.virt.hardware [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.503 2 DEBUG nova.virt.hardware [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.503 2 DEBUG nova.virt.hardware [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.504 2 DEBUG nova.virt.hardware [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.505 2 DEBUG nova.virt.hardware [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.505 2 DEBUG nova.virt.hardware [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.506 2 DEBUG nova.virt.hardware [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.513 2 DEBUG nova.virt.libvirt.vif [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:43:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1676761474',display_name='tempest-TestServerMultinode-server-1676761474',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1676761474',id=176,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0acd1c52a26d4654b24111e5ad4814f2',ramdisk_id='',reservation_id='r-y5kvxx3i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1539275040',owner_user_name='tempest-TestServerMultinode-15392
75040-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:43:15Z,user_data=None,user_id='7ed2a973cfed4867a095aecf0c6453fb',uuid=48f5c1dd-1059-42ea-94ab-15808efc147b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "address": "fa:16:3e:82:f5:fc", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252c8b47-de", "ovs_interfaceid": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.514 2 DEBUG nova.network.os_vif_util [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Converting VIF {"id": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "address": "fa:16:3e:82:f5:fc", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252c8b47-de", "ovs_interfaceid": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.515 2 DEBUG nova.network.os_vif_util [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:f5:fc,bridge_name='br-int',has_traffic_filtering=True,id=252c8b47-de4a-47c5-a2e3-02b99df6331e,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252c8b47-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.517 2 DEBUG nova.objects.instance [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 48f5c1dd-1059-42ea-94ab-15808efc147b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.533 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  <uuid>48f5c1dd-1059-42ea-94ab-15808efc147b</uuid>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  <name>instance-000000b0</name>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestServerMultinode-server-1676761474</nova:name>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:43:19</nova:creationTime>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:43:19 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:        <nova:user uuid="7ed2a973cfed4867a095aecf0c6453fb">tempest-TestServerMultinode-1539275040-project-admin</nova:user>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:        <nova:project uuid="0acd1c52a26d4654b24111e5ad4814f2">tempest-TestServerMultinode-1539275040</nova:project>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:        <nova:port uuid="252c8b47-de4a-47c5-a2e3-02b99df6331e">
Oct  2 08:43:19 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <entry name="serial">48f5c1dd-1059-42ea-94ab-15808efc147b</entry>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <entry name="uuid">48f5c1dd-1059-42ea-94ab-15808efc147b</entry>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk.config"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:82:f5:fc"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <target dev="tap252c8b47-de"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/console.log" append="off"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:43:19 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:43:19 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:43:19 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:43:19 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.535 2 DEBUG nova.compute.manager [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Preparing to wait for external event network-vif-plugged-252c8b47-de4a-47c5-a2e3-02b99df6331e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.535 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.535 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.536 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.537 2 DEBUG nova.virt.libvirt.vif [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:43:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1676761474',display_name='tempest-TestServerMultinode-server-1676761474',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1676761474',id=176,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0acd1c52a26d4654b24111e5ad4814f2',ramdisk_id='',reservation_id='r-y5kvxx3i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1539275040',owner_user_name='tempest-TestServerMulti
node-1539275040-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:43:15Z,user_data=None,user_id='7ed2a973cfed4867a095aecf0c6453fb',uuid=48f5c1dd-1059-42ea-94ab-15808efc147b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "address": "fa:16:3e:82:f5:fc", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252c8b47-de", "ovs_interfaceid": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.537 2 DEBUG nova.network.os_vif_util [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Converting VIF {"id": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "address": "fa:16:3e:82:f5:fc", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252c8b47-de", "ovs_interfaceid": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.538 2 DEBUG nova.network.os_vif_util [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:f5:fc,bridge_name='br-int',has_traffic_filtering=True,id=252c8b47-de4a-47c5-a2e3-02b99df6331e,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252c8b47-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.538 2 DEBUG os_vif [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:f5:fc,bridge_name='br-int',has_traffic_filtering=True,id=252c8b47-de4a-47c5-a2e3-02b99df6331e,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252c8b47-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.541 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.541 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.545 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap252c8b47-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.546 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap252c8b47-de, col_values=(('external_ids', {'iface-id': '252c8b47-de4a-47c5-a2e3-02b99df6331e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:f5:fc', 'vm-uuid': '48f5c1dd-1059-42ea-94ab-15808efc147b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:43:19 np0005466012 NetworkManager[51207]: <info>  [1759408999.5495] manager: (tap252c8b47-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.558 2 INFO os_vif [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:f5:fc,bridge_name='br-int',has_traffic_filtering=True,id=252c8b47-de4a-47c5-a2e3-02b99df6331e,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252c8b47-de')#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.811 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.812 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.812 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] No VIF found with MAC fa:16:3e:82:f5:fc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:43:19 np0005466012 nova_compute[192063]: 2025-10-02 12:43:19.813 2 INFO nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Using config drive#033[00m
Oct  2 08:43:20 np0005466012 nova_compute[192063]: 2025-10-02 12:43:20.160 2 INFO nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Creating config drive at /var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk.config#033[00m
Oct  2 08:43:20 np0005466012 nova_compute[192063]: 2025-10-02 12:43:20.164 2 DEBUG oslo_concurrency.processutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8eymf5jx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:20 np0005466012 nova_compute[192063]: 2025-10-02 12:43:20.308 2 DEBUG oslo_concurrency.processutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8eymf5jx" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:20 np0005466012 kernel: tap252c8b47-de: entered promiscuous mode
Oct  2 08:43:20 np0005466012 NetworkManager[51207]: <info>  [1759409000.3970] manager: (tap252c8b47-de): new Tun device (/org/freedesktop/NetworkManager/Devices/332)
Oct  2 08:43:20 np0005466012 nova_compute[192063]: 2025-10-02 12:43:20.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:20Z|00704|binding|INFO|Claiming lport 252c8b47-de4a-47c5-a2e3-02b99df6331e for this chassis.
Oct  2 08:43:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:20Z|00705|binding|INFO|252c8b47-de4a-47c5-a2e3-02b99df6331e: Claiming fa:16:3e:82:f5:fc 10.100.0.3
Oct  2 08:43:20 np0005466012 nova_compute[192063]: 2025-10-02 12:43:20.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:20 np0005466012 nova_compute[192063]: 2025-10-02 12:43:20.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.414 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:f5:fc 10.100.0.3'], port_security=['fa:16:3e:82:f5:fc 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '48f5c1dd-1059-42ea-94ab-15808efc147b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0acd1c52a26d4654b24111e5ad4814f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5f9d53e4-02f4-4598-9a8f-67bc82369860', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c1b0270-8f0a-4540-b305-4a4654e80399, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=252c8b47-de4a-47c5-a2e3-02b99df6331e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.417 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 252c8b47-de4a-47c5-a2e3-02b99df6331e in datapath 89c6a9c2-23c1-4b8b-81b9-3050a42a016f bound to our chassis#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.420 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89c6a9c2-23c1-4b8b-81b9-3050a42a016f#033[00m
Oct  2 08:43:20 np0005466012 systemd-machined[152114]: New machine qemu-79-instance-000000b0.
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.434 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8762bdc7-5eab-439c-b7c2-dae029a35504]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.435 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap89c6a9c2-21 in ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.437 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap89c6a9c2-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.437 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[cc13a40e-d680-46ca-b1df-2d31d57cbc68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.438 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3e088c3b-9d14-40fb-bf8c-1dd34cf2d938]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 systemd-udevd[250704]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.449 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[3172f3f4-6d5e-4834-92c8-423db56eb6c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 NetworkManager[51207]: <info>  [1759409000.4541] device (tap252c8b47-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:43:20 np0005466012 NetworkManager[51207]: <info>  [1759409000.4555] device (tap252c8b47-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:43:20 np0005466012 systemd[1]: Started Virtual Machine qemu-79-instance-000000b0.
Oct  2 08:43:20 np0005466012 nova_compute[192063]: 2025-10-02 12:43:20.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.467 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[26d4fcb3-2eaa-42d8-9ed5-4e1eb56ab9af]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:20Z|00706|binding|INFO|Setting lport 252c8b47-de4a-47c5-a2e3-02b99df6331e ovn-installed in OVS
Oct  2 08:43:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:20Z|00707|binding|INFO|Setting lport 252c8b47-de4a-47c5-a2e3-02b99df6331e up in Southbound
Oct  2 08:43:20 np0005466012 nova_compute[192063]: 2025-10-02 12:43:20.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.494 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[6b6a2d12-e884-4dc4-bccf-c47a603f9f0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.499 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4e6eb989-38bf-4ae7-8860-97306d96328d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 NetworkManager[51207]: <info>  [1759409000.5004] manager: (tap89c6a9c2-20): new Veth device (/org/freedesktop/NetworkManager/Devices/333)
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.532 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[cf312840-ef98-48b8-9bb9-25de74e8340e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.536 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[64336f33-23e4-489b-873e-a22a8a3de47c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 NetworkManager[51207]: <info>  [1759409000.5606] device (tap89c6a9c2-20): carrier: link connected
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.566 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[c9857a70-286d-47e7-ade7-7245003722a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.582 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bb43a481-d642-4122-b66f-87e32bc9185e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89c6a9c2-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:39:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 218], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 699417, 'reachable_time': 19636, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250736, 'error': None, 'target': 'ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.598 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb8192b-4b43-4b40-a8f7-38d0ce7e2420]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed4:3914'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 699417, 'tstamp': 699417}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250737, 'error': None, 'target': 'ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.613 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a06d3d10-9de0-4fda-b726-b5f4454c6987]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89c6a9c2-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:39:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 218], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 699417, 'reachable_time': 19636, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250738, 'error': None, 'target': 'ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.644 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[753269be-c896-4b02-be7d-f1748cf482e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.723 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[98e1b945-4d4c-4864-b13f-1a5bc68ec18c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.725 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89c6a9c2-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.725 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.725 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89c6a9c2-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:20 np0005466012 NetworkManager[51207]: <info>  [1759409000.7277] manager: (tap89c6a9c2-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/334)
Oct  2 08:43:20 np0005466012 kernel: tap89c6a9c2-20: entered promiscuous mode
Oct  2 08:43:20 np0005466012 nova_compute[192063]: 2025-10-02 12:43:20.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.729 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89c6a9c2-20, col_values=(('external_ids', {'iface-id': 'f668a745-fb31-4662-9099-e8e7982b3bbb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:20Z|00708|binding|INFO|Releasing lport f668a745-fb31-4662-9099-e8e7982b3bbb from this chassis (sb_readonly=0)
Oct  2 08:43:20 np0005466012 nova_compute[192063]: 2025-10-02 12:43:20.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:20 np0005466012 nova_compute[192063]: 2025-10-02 12:43:20.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.743 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/89c6a9c2-23c1-4b8b-81b9-3050a42a016f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/89c6a9c2-23c1-4b8b-81b9-3050a42a016f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.744 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[07102a56-4e9c-4e02-8874-5ae361df3b41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.745 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-89c6a9c2-23c1-4b8b-81b9-3050a42a016f
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/89c6a9c2-23c1-4b8b-81b9-3050a42a016f.pid.haproxy
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 89c6a9c2-23c1-4b8b-81b9-3050a42a016f
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:43:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:20.746 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'env', 'PROCESS_TAG=haproxy-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/89c6a9c2-23c1-4b8b-81b9-3050a42a016f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:43:21 np0005466012 podman[250770]: 2025-10-02 12:43:21.168471667 +0000 UTC m=+0.116269827 container create b8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:43:21 np0005466012 podman[250770]: 2025-10-02 12:43:21.075865395 +0000 UTC m=+0.023663575 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:43:21 np0005466012 systemd[1]: Started libpod-conmon-b8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f.scope.
Oct  2 08:43:21 np0005466012 nova_compute[192063]: 2025-10-02 12:43:21.280 2 DEBUG nova.network.neutron [req-3e04d664-b1c7-48b4-9395-c4f99e6e2774 req-71c791f1-f4ca-453b-9dec-7aded70fb32c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Updated VIF entry in instance network info cache for port 252c8b47-de4a-47c5-a2e3-02b99df6331e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:43:21 np0005466012 nova_compute[192063]: 2025-10-02 12:43:21.282 2 DEBUG nova.network.neutron [req-3e04d664-b1c7-48b4-9395-c4f99e6e2774 req-71c791f1-f4ca-453b-9dec-7aded70fb32c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Updating instance_info_cache with network_info: [{"id": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "address": "fa:16:3e:82:f5:fc", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252c8b47-de", "ovs_interfaceid": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:21 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:43:21 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af234bf810c7b5fe0c616d0b355a02f7ea2c2f6c9ca1ef018287e118294cb48a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:43:21 np0005466012 nova_compute[192063]: 2025-10-02 12:43:21.307 2 DEBUG oslo_concurrency.lockutils [req-3e04d664-b1c7-48b4-9395-c4f99e6e2774 req-71c791f1-f4ca-453b-9dec-7aded70fb32c 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-48f5c1dd-1059-42ea-94ab-15808efc147b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:21 np0005466012 podman[250770]: 2025-10-02 12:43:21.323884703 +0000 UTC m=+0.271682883 container init b8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  2 08:43:21 np0005466012 podman[250770]: 2025-10-02 12:43:21.329356637 +0000 UTC m=+0.277154837 container start b8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:43:21 np0005466012 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250792]: [NOTICE]   (250796) : New worker (250798) forked
Oct  2 08:43:21 np0005466012 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250792]: [NOTICE]   (250796) : Loading success.
Oct  2 08:43:21 np0005466012 nova_compute[192063]: 2025-10-02 12:43:21.599 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409001.598299, 48f5c1dd-1059-42ea-94ab-15808efc147b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:43:21 np0005466012 nova_compute[192063]: 2025-10-02 12:43:21.600 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] VM Started (Lifecycle Event)#033[00m
Oct  2 08:43:21 np0005466012 nova_compute[192063]: 2025-10-02 12:43:21.627 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:21 np0005466012 nova_compute[192063]: 2025-10-02 12:43:21.632 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409001.5999491, 48f5c1dd-1059-42ea-94ab-15808efc147b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:43:21 np0005466012 nova_compute[192063]: 2025-10-02 12:43:21.633 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:43:21 np0005466012 nova_compute[192063]: 2025-10-02 12:43:21.653 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:21 np0005466012 nova_compute[192063]: 2025-10-02 12:43:21.656 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:43:21 np0005466012 nova_compute[192063]: 2025-10-02 12:43:21.674 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.675 2 DEBUG nova.compute.manager [req-f5c5ce3e-04d1-4d1e-bd75-ada875d0f965 req-fd163bfd-964c-48e6-a77e-3f7be3fd6225 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Received event network-vif-plugged-252c8b47-de4a-47c5-a2e3-02b99df6331e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.676 2 DEBUG oslo_concurrency.lockutils [req-f5c5ce3e-04d1-4d1e-bd75-ada875d0f965 req-fd163bfd-964c-48e6-a77e-3f7be3fd6225 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.676 2 DEBUG oslo_concurrency.lockutils [req-f5c5ce3e-04d1-4d1e-bd75-ada875d0f965 req-fd163bfd-964c-48e6-a77e-3f7be3fd6225 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.677 2 DEBUG oslo_concurrency.lockutils [req-f5c5ce3e-04d1-4d1e-bd75-ada875d0f965 req-fd163bfd-964c-48e6-a77e-3f7be3fd6225 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.677 2 DEBUG nova.compute.manager [req-f5c5ce3e-04d1-4d1e-bd75-ada875d0f965 req-fd163bfd-964c-48e6-a77e-3f7be3fd6225 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Processing event network-vif-plugged-252c8b47-de4a-47c5-a2e3-02b99df6331e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.678 2 DEBUG nova.compute.manager [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.685 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.686 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409003.685314, 48f5c1dd-1059-42ea-94ab-15808efc147b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.687 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.694 2 INFO nova.virt.libvirt.driver [-] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Instance spawned successfully.#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.695 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.720 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.729 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.735 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.735 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.736 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.736 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.736 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.737 2 DEBUG nova.virt.libvirt.driver [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.767 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.831 2 INFO nova.compute.manager [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Took 8.32 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.831 2 DEBUG nova.compute.manager [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.930 2 INFO nova.compute.manager [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Took 8.94 seconds to build instance.#033[00m
Oct  2 08:43:23 np0005466012 nova_compute[192063]: 2025-10-02 12:43:23.945 2 DEBUG oslo_concurrency.lockutils [None req-3a03c005-8234-45a3-821e-5e571cd72572 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:24 np0005466012 nova_compute[192063]: 2025-10-02 12:43:24.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:25 np0005466012 nova_compute[192063]: 2025-10-02 12:43:25.805 2 DEBUG nova.compute.manager [req-b2ee8171-fd01-4aa8-b030-edcf4acd287e req-341cba1c-ddfc-4b2f-8398-7e667c65021a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Received event network-vif-plugged-252c8b47-de4a-47c5-a2e3-02b99df6331e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:25 np0005466012 nova_compute[192063]: 2025-10-02 12:43:25.805 2 DEBUG oslo_concurrency.lockutils [req-b2ee8171-fd01-4aa8-b030-edcf4acd287e req-341cba1c-ddfc-4b2f-8398-7e667c65021a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:25 np0005466012 nova_compute[192063]: 2025-10-02 12:43:25.805 2 DEBUG oslo_concurrency.lockutils [req-b2ee8171-fd01-4aa8-b030-edcf4acd287e req-341cba1c-ddfc-4b2f-8398-7e667c65021a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:25 np0005466012 nova_compute[192063]: 2025-10-02 12:43:25.806 2 DEBUG oslo_concurrency.lockutils [req-b2ee8171-fd01-4aa8-b030-edcf4acd287e req-341cba1c-ddfc-4b2f-8398-7e667c65021a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:25 np0005466012 nova_compute[192063]: 2025-10-02 12:43:25.806 2 DEBUG nova.compute.manager [req-b2ee8171-fd01-4aa8-b030-edcf4acd287e req-341cba1c-ddfc-4b2f-8398-7e667c65021a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] No waiting events found dispatching network-vif-plugged-252c8b47-de4a-47c5-a2e3-02b99df6331e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:43:25 np0005466012 nova_compute[192063]: 2025-10-02 12:43:25.807 2 WARNING nova.compute.manager [req-b2ee8171-fd01-4aa8-b030-edcf4acd287e req-341cba1c-ddfc-4b2f-8398-7e667c65021a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Received unexpected event network-vif-plugged-252c8b47-de4a-47c5-a2e3-02b99df6331e for instance with vm_state active and task_state None.#033[00m
Oct  2 08:43:28 np0005466012 nova_compute[192063]: 2025-10-02 12:43:28.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:29 np0005466012 nova_compute[192063]: 2025-10-02 12:43:29.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:30 np0005466012 podman[250808]: 2025-10-02 12:43:30.1584455 +0000 UTC m=+0.065232484 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:43:30 np0005466012 podman[250809]: 2025-10-02 12:43:30.159602872 +0000 UTC m=+0.059930734 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:43:30 np0005466012 podman[250807]: 2025-10-02 12:43:30.166200197 +0000 UTC m=+0.075406289 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute)
Oct  2 08:43:30 np0005466012 podman[250815]: 2025-10-02 12:43:30.208934118 +0000 UTC m=+0.096806341 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:43:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:31.862 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:43:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:31.864 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:43:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:31.865 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:31 np0005466012 nova_compute[192063]: 2025-10-02 12:43:31.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:32 np0005466012 nova_compute[192063]: 2025-10-02 12:43:32.072 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:32 np0005466012 nova_compute[192063]: 2025-10-02 12:43:32.072 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:32 np0005466012 nova_compute[192063]: 2025-10-02 12:43:32.177 2 DEBUG nova.compute.manager [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:43:32 np0005466012 nova_compute[192063]: 2025-10-02 12:43:32.598 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:32 np0005466012 nova_compute[192063]: 2025-10-02 12:43:32.599 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:32 np0005466012 nova_compute[192063]: 2025-10-02 12:43:32.609 2 DEBUG nova.virt.hardware [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:43:32 np0005466012 nova_compute[192063]: 2025-10-02 12:43:32.609 2 INFO nova.compute.claims [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:43:32 np0005466012 nova_compute[192063]: 2025-10-02 12:43:32.775 2 DEBUG nova.compute.provider_tree [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:43:32 np0005466012 nova_compute[192063]: 2025-10-02 12:43:32.803 2 DEBUG nova.scheduler.client.report [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:43:32 np0005466012 nova_compute[192063]: 2025-10-02 12:43:32.854 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:32 np0005466012 nova_compute[192063]: 2025-10-02 12:43:32.855 2 DEBUG nova.compute.manager [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.073 2 DEBUG nova.compute.manager [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.075 2 DEBUG nova.network.neutron [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.133 2 INFO nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.212 2 DEBUG nova.compute.manager [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.474 2 DEBUG nova.policy [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.478 2 DEBUG nova.compute.manager [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.479 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.479 2 INFO nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Creating image(s)#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.480 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "/var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.480 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "/var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.481 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "/var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.493 2 DEBUG oslo_concurrency.processutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.555 2 DEBUG oslo_concurrency.processutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.556 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.556 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.568 2 DEBUG oslo_concurrency.processutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.626 2 DEBUG oslo_concurrency.processutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.627 2 DEBUG oslo_concurrency.processutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.710 2 DEBUG oslo_concurrency.processutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk 1073741824" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.711 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.712 2 DEBUG oslo_concurrency.processutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.769 2 DEBUG oslo_concurrency.processutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.770 2 DEBUG nova.virt.disk.api [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Checking if we can resize image /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.771 2 DEBUG oslo_concurrency.processutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.831 2 DEBUG oslo_concurrency.processutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.832 2 DEBUG nova.virt.disk.api [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Cannot resize image /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
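The "Cannot resize image … to a smaller size" line is not an error: the flavor's 1 GiB root disk does not exceed the overlay's current virtual size, and qcow2 disks are only ever grown in place, never shrunk. A sketch of the guard, assuming (as `nova.virt.disk.api.can_resize_image` does) that the current size comes from the `virtual-size` field of `qemu-img info --output=json`:

```python
import json

def can_resize_image(qemu_img_info_json: str, new_size: int) -> bool:
    """Grow-only check mirroring nova.virt.disk.api.can_resize_image:
    resizing to the current virtual size or smaller is refused."""
    virtual_size = json.loads(qemu_img_info_json)["virtual-size"]
    return new_size > virtual_size

# 1073741824 bytes = 1 GiB, the flavor root disk size in this log.
info = json.dumps({"virtual-size": 1073741824, "format": "qcow2"})
print(can_resize_image(info, 1073741824))      # False: same size, nothing to grow
print(can_resize_image(info, 2 * 1073741824))  # True: growing is allowed
```

When the check returns False, nova simply skips the resize step and continues with the spawn, which is what the subsequent lines show.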
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.833 2 DEBUG nova.objects.instance [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'migration_context' on Instance uuid 9fb63acc-0415-4cd3-83e3-e90b080ebdce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.858 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.859 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Ensure instance console log exists: /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.859 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.860 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:33 np0005466012 nova_compute[192063]: 2025-10-02 12:43:33.860 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:34 np0005466012 nova_compute[192063]: 2025-10-02 12:43:34.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:34 np0005466012 nova_compute[192063]: 2025-10-02 12:43:34.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:35 np0005466012 nova_compute[192063]: 2025-10-02 12:43:35.095 2 DEBUG nova.network.neutron [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Successfully created port: 555081c9-856e-4461-954c-839e380351df _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:43:36 np0005466012 nova_compute[192063]: 2025-10-02 12:43:36.610 2 DEBUG nova.network.neutron [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Successfully updated port: 555081c9-856e-4461-954c-839e380351df _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:43:36 np0005466012 nova_compute[192063]: 2025-10-02 12:43:36.636 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:36 np0005466012 nova_compute[192063]: 2025-10-02 12:43:36.637 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquired lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:36 np0005466012 nova_compute[192063]: 2025-10-02 12:43:36.637 2 DEBUG nova.network.neutron [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:43:36 np0005466012 nova_compute[192063]: 2025-10-02 12:43:36.718 2 DEBUG nova.compute.manager [req-32ee8cda-90c2-4e15-a5c0-83b1cb48ac29 req-7def0afc-9f85-47a4-977f-ba8a1e52e00f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received event network-changed-555081c9-856e-4461-954c-839e380351df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:36 np0005466012 nova_compute[192063]: 2025-10-02 12:43:36.718 2 DEBUG nova.compute.manager [req-32ee8cda-90c2-4e15-a5c0-83b1cb48ac29 req-7def0afc-9f85-47a4-977f-ba8a1e52e00f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Refreshing instance network info cache due to event network-changed-555081c9-856e-4461-954c-839e380351df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:43:36 np0005466012 nova_compute[192063]: 2025-10-02 12:43:36.719 2 DEBUG oslo_concurrency.lockutils [req-32ee8cda-90c2-4e15-a5c0-83b1cb48ac29 req-7def0afc-9f85-47a4-977f-ba8a1e52e00f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:36 np0005466012 nova_compute[192063]: 2025-10-02 12:43:36.804 2 DEBUG nova.network.neutron [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.472 2 DEBUG nova.network.neutron [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Updating instance_info_cache with network_info: [{"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.491 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Releasing lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.492 2 DEBUG nova.compute.manager [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Instance network_info: |[{"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.492 2 DEBUG oslo_concurrency.lockutils [req-32ee8cda-90c2-4e15-a5c0-83b1cb48ac29 req-7def0afc-9f85-47a4-977f-ba8a1e52e00f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.493 2 DEBUG nova.network.neutron [req-32ee8cda-90c2-4e15-a5c0-83b1cb48ac29 req-7def0afc-9f85-47a4-977f-ba8a1e52e00f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Refreshing network info cache for port 555081c9-856e-4461-954c-839e380351df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.495 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Start _get_guest_xml network_info=[{"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.501 2 WARNING nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.505 2 DEBUG nova.virt.libvirt.host [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.506 2 DEBUG nova.virt.libvirt.host [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.512 2 DEBUG nova.virt.libvirt.host [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.513 2 DEBUG nova.virt.libvirt.host [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.514 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.514 2 DEBUG nova.virt.hardware [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.515 2 DEBUG nova.virt.hardware [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.515 2 DEBUG nova.virt.hardware [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.515 2 DEBUG nova.virt.hardware [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.516 2 DEBUG nova.virt.hardware [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.516 2 DEBUG nova.virt.hardware [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.516 2 DEBUG nova.virt.hardware [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.516 2 DEBUG nova.virt.hardware [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.517 2 DEBUG nova.virt.hardware [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.517 2 DEBUG nova.virt.hardware [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.517 2 DEBUG nova.virt.hardware [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
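The topology walk in the lines above reduces to: with no flavor or image constraints (preferred 0:0:0, per-dimension limits of 65536), enumerate every sockets×cores×threads factorization of the vCPU count that fits the limits; for 1 vCPU the only candidate is 1:1:1. A sketch of that enumeration under those assumptions (the real `nova.virt.hardware._get_possible_cpu_topologies` also applies ordering preferences this sketch skips):

```python
def possible_cpu_topologies(vcpus: int,
                            max_sockets: int = 65536,
                            max_cores: int = 65536,
                            max_threads: int = 65536) -> list:
    """Enumerate (sockets, cores, threads) triples whose product equals
    vcpus, within the per-dimension limits."""
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % s:
            continue
        rest = vcpus // s
        for c in range(1, min(rest, max_cores) + 1):
            if rest % c:
                continue
            t = rest // c
            if t <= max_threads:
                topos.append((s, c, t))
    return topos

print(possible_cpu_topologies(1))  # [(1, 1, 1)] -- the single topology in the log
print(possible_cpu_topologies(4))  # larger vCPU counts yield several candidates
```

With exactly one candidate, the "Sorted desired topologies" step is trivially `[VirtCPUTopology(cores=1,sockets=1,threads=1)]`, as logged.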
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.520 2 DEBUG nova.virt.libvirt.vif [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:43:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ac',id=179,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBsU7axu6K0ttkyt58Oe7H69HarowC1Z/KycD7YkRVEhtB0exOJVQoE9wmWsvBsBydwCp/7CFEB2lgZnOsTyjcv0xdeMoagqDLiiz0dNIOFUm8UmcCkDJyem1g0M/sbuQA==',key_name='tempest-TestSecurityGroupsBasicOps-1428281449',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-yz7t5j9w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:43:33Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=9fb63acc-0415-4cd3-83e3-e90b080ebdce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.521 2 DEBUG nova.network.os_vif_util [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.522 2 DEBUG nova.network.os_vif_util [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:8e:90,bridge_name='br-int',has_traffic_filtering=True,id=555081c9-856e-4461-954c-839e380351df,network=Network(22233733-d0b0-4cf4-92ea-672ceac870ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap555081c9-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.522 2 DEBUG nova.objects.instance [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'pci_devices' on Instance uuid 9fb63acc-0415-4cd3-83e3-e90b080ebdce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.537 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  <uuid>9fb63acc-0415-4cd3-83e3-e90b080ebdce</uuid>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  <name>instance-000000b3</name>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260</nova:name>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:43:37</nova:creationTime>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:43:37 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:        <nova:user uuid="2d2b4a2da57543ef88e44ae28ad61647">tempest-TestSecurityGroupsBasicOps-1020134341-project-member</nova:user>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:        <nova:project uuid="575f3d227ab24f2daa62e65e14a4cd9c">tempest-TestSecurityGroupsBasicOps-1020134341</nova:project>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:        <nova:port uuid="555081c9-856e-4461-954c-839e380351df">
Oct  2 08:43:37 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <entry name="serial">9fb63acc-0415-4cd3-83e3-e90b080ebdce</entry>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <entry name="uuid">9fb63acc-0415-4cd3-83e3-e90b080ebdce</entry>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.config"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:8e:8e:90"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <target dev="tap555081c9-85"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/console.log" append="off"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:43:37 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:43:37 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:43:37 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:43:37 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.539 2 DEBUG nova.compute.manager [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Preparing to wait for external event network-vif-plugged-555081c9-856e-4461-954c-839e380351df prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.539 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.539 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.539 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.540 2 DEBUG nova.virt.libvirt.vif [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:43:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ac',id=179,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBsU7axu6K0ttkyt58Oe7H69HarowC1Z/KycD7YkRVEhtB0exOJVQoE9wmWsvBsBydwCp/7CFEB2lgZnOsTyjcv0xdeMoagqDLiiz0dNIOFUm8UmcCkDJyem1g0M/sbuQA==',key_name='tempest-TestSecurityGroupsBasicOps-1428281449',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-yz7t5j9w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:43:33Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=9fb63acc-0415-4cd3-83e3-e90b080ebdce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.540 2 DEBUG nova.network.os_vif_util [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.541 2 DEBUG nova.network.os_vif_util [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:8e:90,bridge_name='br-int',has_traffic_filtering=True,id=555081c9-856e-4461-954c-839e380351df,network=Network(22233733-d0b0-4cf4-92ea-672ceac870ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap555081c9-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.541 2 DEBUG os_vif [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:8e:90,bridge_name='br-int',has_traffic_filtering=True,id=555081c9-856e-4461-954c-839e380351df,network=Network(22233733-d0b0-4cf4-92ea-672ceac870ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap555081c9-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.542 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.542 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.545 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap555081c9-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.545 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap555081c9-85, col_values=(('external_ids', {'iface-id': '555081c9-856e-4461-954c-839e380351df', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8e:8e:90', 'vm-uuid': '9fb63acc-0415-4cd3-83e3-e90b080ebdce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:37 np0005466012 NetworkManager[51207]: <info>  [1759409017.5483] manager: (tap555081c9-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.556 2 INFO os_vif [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:8e:90,bridge_name='br-int',has_traffic_filtering=True,id=555081c9-856e-4461-954c-839e380351df,network=Network(22233733-d0b0-4cf4-92ea-672ceac870ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap555081c9-85')#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.639 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.640 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.641 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No VIF found with MAC fa:16:3e:8e:8e:90, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:43:37 np0005466012 nova_compute[192063]: 2025-10-02 12:43:37.641 2 INFO nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Using config drive#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.136 2 DEBUG oslo_concurrency.lockutils [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "48f5c1dd-1059-42ea-94ab-15808efc147b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.136 2 DEBUG oslo_concurrency.lockutils [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.137 2 DEBUG oslo_concurrency.lockutils [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.137 2 DEBUG oslo_concurrency.lockutils [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.137 2 DEBUG oslo_concurrency.lockutils [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.148 2 INFO nova.compute.manager [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Terminating instance#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.159 2 DEBUG nova.compute.manager [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.275 2 INFO nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Creating config drive at /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.config#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.285 2 DEBUG oslo_concurrency.processutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2vgf1fya execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:38 np0005466012 kernel: tap252c8b47-de (unregistering): left promiscuous mode
Oct  2 08:43:38 np0005466012 NetworkManager[51207]: <info>  [1759409018.3818] device (tap252c8b47-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:38Z|00709|binding|INFO|Releasing lport 252c8b47-de4a-47c5-a2e3-02b99df6331e from this chassis (sb_readonly=0)
Oct  2 08:43:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:38Z|00710|binding|INFO|Setting lport 252c8b47-de4a-47c5-a2e3-02b99df6331e down in Southbound
Oct  2 08:43:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:38Z|00711|binding|INFO|Removing iface tap252c8b47-de ovn-installed in OVS
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.425 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:f5:fc 10.100.0.3'], port_security=['fa:16:3e:82:f5:fc 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '48f5c1dd-1059-42ea-94ab-15808efc147b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0acd1c52a26d4654b24111e5ad4814f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5f9d53e4-02f4-4598-9a8f-67bc82369860', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c1b0270-8f0a-4540-b305-4a4654e80399, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=252c8b47-de4a-47c5-a2e3-02b99df6331e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.426 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 252c8b47-de4a-47c5-a2e3-02b99df6331e in datapath 89c6a9c2-23c1-4b8b-81b9-3050a42a016f unbound from our chassis#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.428 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89c6a9c2-23c1-4b8b-81b9-3050a42a016f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.429 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7c70b4d0-286d-4779-a9e5-8032d03d40c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.430 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f namespace which is not needed anymore#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.432 2 DEBUG oslo_concurrency.processutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2vgf1fya" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:38 np0005466012 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000b0.scope: Deactivated successfully.
Oct  2 08:43:38 np0005466012 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000b0.scope: Consumed 14.152s CPU time.
Oct  2 08:43:38 np0005466012 systemd-machined[152114]: Machine qemu-79-instance-000000b0 terminated.
Oct  2 08:43:38 np0005466012 NetworkManager[51207]: <info>  [1759409018.4931] manager: (tap555081c9-85): new Tun device (/org/freedesktop/NetworkManager/Devices/336)
Oct  2 08:43:38 np0005466012 systemd-udevd[250930]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:43:38 np0005466012 kernel: tap555081c9-85: entered promiscuous mode
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:38Z|00712|binding|INFO|Claiming lport 555081c9-856e-4461-954c-839e380351df for this chassis.
Oct  2 08:43:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:38Z|00713|binding|INFO|555081c9-856e-4461-954c-839e380351df: Claiming fa:16:3e:8e:8e:90 10.100.0.10
Oct  2 08:43:38 np0005466012 NetworkManager[51207]: <info>  [1759409018.5081] device (tap555081c9-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:43:38 np0005466012 NetworkManager[51207]: <info>  [1759409018.5090] device (tap555081c9-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.511 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:8e:90 10.100.0.10'], port_security=['fa:16:3e:8e:8e:90 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22233733-d0b0-4cf4-92ea-672ceac870ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0a829918-051f-4369-a187-9d7b48de1d0d fcec2fcb-4394-44c1-a474-f605cfffa19d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e315cbae-ba84-4a8d-94d7-bde2402f66e9, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=555081c9-856e-4461-954c-839e380351df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:43:38 np0005466012 systemd-machined[152114]: New machine qemu-80-instance-000000b3.
Oct  2 08:43:38 np0005466012 systemd[1]: Started Virtual Machine qemu-80-instance-000000b3.
Oct  2 08:43:38 np0005466012 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250792]: [NOTICE]   (250796) : haproxy version is 2.8.14-c23fe91
Oct  2 08:43:38 np0005466012 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250792]: [NOTICE]   (250796) : path to executable is /usr/sbin/haproxy
Oct  2 08:43:38 np0005466012 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250792]: [WARNING]  (250796) : Exiting Master process...
Oct  2 08:43:38 np0005466012 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250792]: [ALERT]    (250796) : Current worker (250798) exited with code 143 (Terminated)
Oct  2 08:43:38 np0005466012 neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f[250792]: [WARNING]  (250796) : All workers exited. Exiting... (0)
Oct  2 08:43:38 np0005466012 systemd[1]: libpod-b8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f.scope: Deactivated successfully.
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005466012 podman[250962]: 2025-10-02 12:43:38.56561332 +0000 UTC m=+0.046311592 container died b8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:43:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:38Z|00714|binding|INFO|Setting lport 555081c9-856e-4461-954c-839e380351df ovn-installed in OVS
Oct  2 08:43:38 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:38Z|00715|binding|INFO|Setting lport 555081c9-856e-4461-954c-839e380351df up in Southbound
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005466012 NetworkManager[51207]: <info>  [1759409018.5846] manager: (tap252c8b47-de): new Tun device (/org/freedesktop/NetworkManager/Devices/337)
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f-userdata-shm.mount: Deactivated successfully.
Oct  2 08:43:38 np0005466012 systemd[1]: var-lib-containers-storage-overlay-af234bf810c7b5fe0c616d0b355a02f7ea2c2f6c9ca1ef018287e118294cb48a-merged.mount: Deactivated successfully.
Oct  2 08:43:38 np0005466012 podman[250962]: 2025-10-02 12:43:38.607856597 +0000 UTC m=+0.088554869 container cleanup b8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:43:38 np0005466012 systemd[1]: libpod-conmon-b8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f.scope: Deactivated successfully.
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.632 2 INFO nova.virt.libvirt.driver [-] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Instance destroyed successfully.#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.632 2 DEBUG nova.objects.instance [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lazy-loading 'resources' on Instance uuid 48f5c1dd-1059-42ea-94ab-15808efc147b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.649 2 DEBUG nova.virt.libvirt.vif [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:43:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1676761474',display_name='tempest-TestServerMultinode-server-1676761474',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1676761474',id=176,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:43:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0acd1c52a26d4654b24111e5ad4814f2',ramdisk_id='',reservation_id='r-y5kvxx3i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1539275040',owner_user_name='tempest-TestServerMultinode-1539275040-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:43:23Z,user_data=None,user_id='7ed2a973cfed4867a095aecf0c6453fb',uuid=48f5c1dd-1059-42ea-94ab-15808efc147b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "address": "fa:16:3e:82:f5:fc", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252c8b47-de", "ovs_interfaceid": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.649 2 DEBUG nova.network.os_vif_util [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Converting VIF {"id": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "address": "fa:16:3e:82:f5:fc", "network": {"id": "89c6a9c2-23c1-4b8b-81b9-3050a42a016f", "bridge": "br-int", "label": "tempest-TestServerMultinode-1758818255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a5d17af56da453cb0073e5e2be72803", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252c8b47-de", "ovs_interfaceid": "252c8b47-de4a-47c5-a2e3-02b99df6331e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.650 2 DEBUG nova.network.os_vif_util [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:f5:fc,bridge_name='br-int',has_traffic_filtering=True,id=252c8b47-de4a-47c5-a2e3-02b99df6331e,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252c8b47-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.650 2 DEBUG os_vif [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:f5:fc,bridge_name='br-int',has_traffic_filtering=True,id=252c8b47-de4a-47c5-a2e3-02b99df6331e,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252c8b47-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.653 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap252c8b47-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.658 2 INFO os_vif [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:f5:fc,bridge_name='br-int',has_traffic_filtering=True,id=252c8b47-de4a-47c5-a2e3-02b99df6331e,network=Network(89c6a9c2-23c1-4b8b-81b9-3050a42a016f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252c8b47-de')#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.659 2 INFO nova.virt.libvirt.driver [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Deleting instance files /var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b_del#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.660 2 INFO nova.virt.libvirt.driver [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Deletion of /var/lib/nova/instances/48f5c1dd-1059-42ea-94ab-15808efc147b_del complete#033[00m
Oct  2 08:43:38 np0005466012 podman[251017]: 2025-10-02 12:43:38.69022414 +0000 UTC m=+0.059646656 container remove b8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.702 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5220d537-2fca-4489-8d22-b20326e99e98]: (4, ('Thu Oct  2 12:43:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f (b8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f)\nb8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f\nThu Oct  2 12:43:38 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f (b8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f)\nb8479c6da602fb81aa4fd74a744bf3d2c90a1ab71e4221f7c710f8e72212148f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.704 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e8d213f8-f832-4aba-84d4-ddcaeac122e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.705 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89c6a9c2-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:38 np0005466012 kernel: tap89c6a9c2-20: left promiscuous mode
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.722 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6d62d9b6-3462-43a6-831b-63e801788b9a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.740 2 INFO nova.compute.manager [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Took 0.58 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.740 2 DEBUG oslo.service.loopingcall [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.741 2 DEBUG nova.compute.manager [-] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.741 2 DEBUG nova.network.neutron [-] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.762 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[262b3dd6-0482-4917-ab7b-25677b924c1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.763 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb0bcf8-7507-4f90-b80e-3db35e547bf8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.779 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[557cc19a-b917-4a0d-95cd-c4f33016570f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 699410, 'reachable_time': 29958, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251032, 'error': None, 'target': 'ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 systemd[1]: run-netns-ovnmeta\x2d89c6a9c2\x2d23c1\x2d4b8b\x2d81b9\x2d3050a42a016f.mount: Deactivated successfully.
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.781 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-89c6a9c2-23c1-4b8b-81b9-3050a42a016f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.781 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[51dfb16e-40e3-4782-8a9c-323ad3776fac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.785 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 555081c9-856e-4461-954c-839e380351df in datapath 22233733-d0b0-4cf4-92ea-672ceac870ca unbound from our chassis#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.787 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22233733-d0b0-4cf4-92ea-672ceac870ca#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.800 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[50caf655-1f41-47f4-ab57-88ed5b360d57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.802 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap22233733-d1 in ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.805 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap22233733-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.805 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3c17b29c-417c-4863-a169-cca784efc966]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.806 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6d9ddd69-3f2b-460f-b309-87e75a02a5c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.818 2 DEBUG nova.compute.manager [req-8e55ae26-3497-46fa-a1d2-62f6133accb6 req-3958fa6e-2338-4c01-a5dc-5876dd4aefbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received event network-vif-plugged-555081c9-856e-4461-954c-839e380351df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.818 2 DEBUG oslo_concurrency.lockutils [req-8e55ae26-3497-46fa-a1d2-62f6133accb6 req-3958fa6e-2338-4c01-a5dc-5876dd4aefbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.819 2 DEBUG oslo_concurrency.lockutils [req-8e55ae26-3497-46fa-a1d2-62f6133accb6 req-3958fa6e-2338-4c01-a5dc-5876dd4aefbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.819 2 DEBUG oslo_concurrency.lockutils [req-8e55ae26-3497-46fa-a1d2-62f6133accb6 req-3958fa6e-2338-4c01-a5dc-5876dd4aefbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.819 2 DEBUG nova.compute.manager [req-8e55ae26-3497-46fa-a1d2-62f6133accb6 req-3958fa6e-2338-4c01-a5dc-5876dd4aefbb 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Processing event network-vif-plugged-555081c9-856e-4461-954c-839e380351df _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.816 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[e5219817-bd9f-46e3-ab12-dcd620e1e7a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.848 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8cf03e50-8a24-4250-9aa1-7b0d5f557257]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.874 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[bd0232e8-7782-49c9-936f-4207c13b0057]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 NetworkManager[51207]: <info>  [1759409018.8809] manager: (tap22233733-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/338)
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.881 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4d9746ff-37fb-487a-83aa-08e8d3fdd2ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.912 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[695131b0-b730-4f04-8dcf-37946755c154]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.915 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[d574eef6-e016-481d-9274-d8a9044d0b5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 NetworkManager[51207]: <info>  [1759409018.9352] device (tap22233733-d0): carrier: link connected
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.939 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[4edb9432-077e-4ba4-bf2f-166a42289d26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.942 2 DEBUG nova.network.neutron [req-32ee8cda-90c2-4e15-a5c0-83b1cb48ac29 req-7def0afc-9f85-47a4-977f-ba8a1e52e00f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Updated VIF entry in instance network info cache for port 555081c9-856e-4461-954c-839e380351df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.943 2 DEBUG nova.network.neutron [req-32ee8cda-90c2-4e15-a5c0-83b1cb48ac29 req-7def0afc-9f85-47a4-977f-ba8a1e52e00f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Updating instance_info_cache with network_info: [{"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.955 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4bf41de4-1cd7-4d5e-b6fe-de34cdc0a016]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22233733-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:10:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 701254, 'reachable_time': 33907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251058, 'error': None, 'target': 'ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 nova_compute[192063]: 2025-10-02 12:43:38.972 2 DEBUG oslo_concurrency.lockutils [req-32ee8cda-90c2-4e15-a5c0-83b1cb48ac29 req-7def0afc-9f85-47a4-977f-ba8a1e52e00f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.971 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9fe12ea2-5d6d-40d5-98d8-a20633dc95fb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:10f0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 701254, 'tstamp': 701254}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251059, 'error': None, 'target': 'ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:38 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:38.994 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[41732ba0-3583-4ab2-a8b3-c07578875777]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22233733-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:10:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 701254, 'reachable_time': 33907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251060, 'error': None, 'target': 'ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:39.022 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[343ce9f5-a942-474b-8415-76009531ad6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:39.076 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d5fec615-cd63-4c4f-bcf7-61288ea2275f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:39.077 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22233733-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:39.077 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:39.079 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22233733-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:39 np0005466012 kernel: tap22233733-d0: entered promiscuous mode
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:39 np0005466012 NetworkManager[51207]: <info>  [1759409019.0811] manager: (tap22233733-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:39.084 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22233733-d0, col_values=(('external_ids', {'iface-id': '1603bf36-dd98-4360-86e1-f96da48d6efd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:39 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:39Z|00716|binding|INFO|Releasing lport 1603bf36-dd98-4360-86e1-f96da48d6efd from this chassis (sb_readonly=0)
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:39.088 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/22233733-d0b0-4cf4-92ea-672ceac870ca.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/22233733-d0b0-4cf4-92ea-672ceac870ca.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:39.089 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5748727b-8482-4441-83c8-6357084c09b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:39.090 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-22233733-d0b0-4cf4-92ea-672ceac870ca
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/22233733-d0b0-4cf4-92ea-672ceac870ca.pid.haproxy
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 22233733-d0b0-4cf4-92ea-672ceac870ca
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:43:39 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:43:39.090 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca', 'env', 'PROCESS_TAG=haproxy-22233733-d0b0-4cf4-92ea-672ceac870ca', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/22233733-d0b0-4cf4-92ea-672ceac870ca.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.217 2 DEBUG nova.network.neutron [-] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.234 2 INFO nova.compute.manager [-] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Took 0.49 seconds to deallocate network for instance.#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.285 2 DEBUG nova.compute.manager [req-21acd26a-c795-4f56-bf43-83979817cd41 req-c27b76a4-d1bd-4f62-9028-9af25370c589 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Received event network-vif-deleted-252c8b47-de4a-47c5-a2e3-02b99df6331e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.325 2 DEBUG oslo_concurrency.lockutils [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.325 2 DEBUG oslo_concurrency.lockutils [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.384 2 DEBUG nova.compute.provider_tree [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.398 2 DEBUG nova.scheduler.client.report [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.418 2 DEBUG oslo_concurrency.lockutils [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.446 2 INFO nova.scheduler.client.report [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Deleted allocations for instance 48f5c1dd-1059-42ea-94ab-15808efc147b#033[00m
Oct  2 08:43:39 np0005466012 podman[251099]: 2025-10-02 12:43:39.469144481 +0000 UTC m=+0.048739249 container create fdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.484 2 DEBUG nova.compute.manager [req-31e12c78-cafd-4e62-8218-4ef59350fbf8 req-84b2acdc-a45f-4d34-8f33-d77fac85f7dd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Received event network-vif-unplugged-252c8b47-de4a-47c5-a2e3-02b99df6331e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.485 2 DEBUG oslo_concurrency.lockutils [req-31e12c78-cafd-4e62-8218-4ef59350fbf8 req-84b2acdc-a45f-4d34-8f33-d77fac85f7dd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.485 2 DEBUG oslo_concurrency.lockutils [req-31e12c78-cafd-4e62-8218-4ef59350fbf8 req-84b2acdc-a45f-4d34-8f33-d77fac85f7dd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.485 2 DEBUG oslo_concurrency.lockutils [req-31e12c78-cafd-4e62-8218-4ef59350fbf8 req-84b2acdc-a45f-4d34-8f33-d77fac85f7dd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.486 2 DEBUG nova.compute.manager [req-31e12c78-cafd-4e62-8218-4ef59350fbf8 req-84b2acdc-a45f-4d34-8f33-d77fac85f7dd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] No waiting events found dispatching network-vif-unplugged-252c8b47-de4a-47c5-a2e3-02b99df6331e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.486 2 WARNING nova.compute.manager [req-31e12c78-cafd-4e62-8218-4ef59350fbf8 req-84b2acdc-a45f-4d34-8f33-d77fac85f7dd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Received unexpected event network-vif-unplugged-252c8b47-de4a-47c5-a2e3-02b99df6331e for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.486 2 DEBUG nova.compute.manager [req-31e12c78-cafd-4e62-8218-4ef59350fbf8 req-84b2acdc-a45f-4d34-8f33-d77fac85f7dd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Received event network-vif-plugged-252c8b47-de4a-47c5-a2e3-02b99df6331e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.486 2 DEBUG oslo_concurrency.lockutils [req-31e12c78-cafd-4e62-8218-4ef59350fbf8 req-84b2acdc-a45f-4d34-8f33-d77fac85f7dd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.487 2 DEBUG oslo_concurrency.lockutils [req-31e12c78-cafd-4e62-8218-4ef59350fbf8 req-84b2acdc-a45f-4d34-8f33-d77fac85f7dd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.487 2 DEBUG oslo_concurrency.lockutils [req-31e12c78-cafd-4e62-8218-4ef59350fbf8 req-84b2acdc-a45f-4d34-8f33-d77fac85f7dd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.487 2 DEBUG nova.compute.manager [req-31e12c78-cafd-4e62-8218-4ef59350fbf8 req-84b2acdc-a45f-4d34-8f33-d77fac85f7dd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] No waiting events found dispatching network-vif-plugged-252c8b47-de4a-47c5-a2e3-02b99df6331e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.487 2 WARNING nova.compute.manager [req-31e12c78-cafd-4e62-8218-4ef59350fbf8 req-84b2acdc-a45f-4d34-8f33-d77fac85f7dd 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Received unexpected event network-vif-plugged-252c8b47-de4a-47c5-a2e3-02b99df6331e for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:43:39 np0005466012 systemd[1]: Started libpod-conmon-fdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248.scope.
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.511 2 DEBUG oslo_concurrency.lockutils [None req-19e15bd4-fc57-4ac1-80be-f88d685d3f1f 7ed2a973cfed4867a095aecf0c6453fb 0acd1c52a26d4654b24111e5ad4814f2 - - default default] Lock "48f5c1dd-1059-42ea-94ab-15808efc147b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.374s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.514 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409019.513868, 9fb63acc-0415-4cd3-83e3-e90b080ebdce => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.514 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] VM Started (Lifecycle Event)#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.517 2 DEBUG nova.compute.manager [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:43:39 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.522 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:43:39 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a2adcff13a07fc879c47b577827924f1a7d83cb24c7f465ad6bf8bc9c9d5fbc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.529 2 INFO nova.virt.libvirt.driver [-] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Instance spawned successfully.#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.530 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.538 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:39 np0005466012 podman[251099]: 2025-10-02 12:43:39.53886457 +0000 UTC m=+0.118459358 container init fdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Oct  2 08:43:39 np0005466012 podman[251099]: 2025-10-02 12:43:39.443955664 +0000 UTC m=+0.023550462 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.541 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:43:39 np0005466012 podman[251099]: 2025-10-02 12:43:39.545040223 +0000 UTC m=+0.124634991 container start fdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.549 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.550 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.550 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.551 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.551 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.552 2 DEBUG nova.virt.libvirt.driver [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:43:39 np0005466012 neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca[251114]: [NOTICE]   (251118) : New worker (251120) forked
Oct  2 08:43:39 np0005466012 neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca[251114]: [NOTICE]   (251118) : Loading success.
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.575 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.576 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409019.515327, 9fb63acc-0415-4cd3-83e3-e90b080ebdce => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.576 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.599 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.603 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409019.5198963, 9fb63acc-0415-4cd3-83e3-e90b080ebdce => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.604 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.634 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.638 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.661 2 INFO nova.compute.manager [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Took 6.18 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.662 2 DEBUG nova.compute.manager [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.668 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.841 2 INFO nova.compute.manager [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Took 7.29 seconds to build instance.#033[00m
Oct  2 08:43:39 np0005466012 nova_compute[192063]: 2025-10-02 12:43:39.900 2 DEBUG oslo_concurrency.lockutils [None req-a09f9feb-54de-4d1e-bf7e-0bedce916a94 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.828s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:40 np0005466012 nova_compute[192063]: 2025-10-02 12:43:40.897 2 DEBUG nova.compute.manager [req-d4059426-a9f1-46d9-8412-b6c99ade47be req-75a117b2-7bf0-447a-bbb6-0d9f9b0eb171 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received event network-vif-plugged-555081c9-856e-4461-954c-839e380351df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:40 np0005466012 nova_compute[192063]: 2025-10-02 12:43:40.898 2 DEBUG oslo_concurrency.lockutils [req-d4059426-a9f1-46d9-8412-b6c99ade47be req-75a117b2-7bf0-447a-bbb6-0d9f9b0eb171 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:40 np0005466012 nova_compute[192063]: 2025-10-02 12:43:40.898 2 DEBUG oslo_concurrency.lockutils [req-d4059426-a9f1-46d9-8412-b6c99ade47be req-75a117b2-7bf0-447a-bbb6-0d9f9b0eb171 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:40 np0005466012 nova_compute[192063]: 2025-10-02 12:43:40.898 2 DEBUG oslo_concurrency.lockutils [req-d4059426-a9f1-46d9-8412-b6c99ade47be req-75a117b2-7bf0-447a-bbb6-0d9f9b0eb171 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:40 np0005466012 nova_compute[192063]: 2025-10-02 12:43:40.899 2 DEBUG nova.compute.manager [req-d4059426-a9f1-46d9-8412-b6c99ade47be req-75a117b2-7bf0-447a-bbb6-0d9f9b0eb171 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] No waiting events found dispatching network-vif-plugged-555081c9-856e-4461-954c-839e380351df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:43:40 np0005466012 nova_compute[192063]: 2025-10-02 12:43:40.899 2 WARNING nova.compute.manager [req-d4059426-a9f1-46d9-8412-b6c99ade47be req-75a117b2-7bf0-447a-bbb6-0d9f9b0eb171 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received unexpected event network-vif-plugged-555081c9-856e-4461-954c-839e380351df for instance with vm_state active and task_state None.#033[00m
Oct  2 08:43:41 np0005466012 nova_compute[192063]: 2025-10-02 12:43:41.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:43 np0005466012 podman[251129]: 2025-10-02 12:43:43.139682452 +0000 UTC m=+0.054772069 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:43:43 np0005466012 podman[251130]: 2025-10-02 12:43:43.157435351 +0000 UTC m=+0.066410387 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=edpm)
Oct  2 08:43:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:43Z|00717|binding|INFO|Releasing lport 1603bf36-dd98-4360-86e1-f96da48d6efd from this chassis (sb_readonly=0)
Oct  2 08:43:43 np0005466012 nova_compute[192063]: 2025-10-02 12:43:43.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:43 np0005466012 nova_compute[192063]: 2025-10-02 12:43:43.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:43 np0005466012 nova_compute[192063]: 2025-10-02 12:43:43.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:43 np0005466012 NetworkManager[51207]: <info>  [1759409023.9339] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Oct  2 08:43:43 np0005466012 NetworkManager[51207]: <info>  [1759409023.9353] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/341)
Oct  2 08:43:43 np0005466012 nova_compute[192063]: 2025-10-02 12:43:43.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:43Z|00718|binding|INFO|Releasing lport 1603bf36-dd98-4360-86e1-f96da48d6efd from this chassis (sb_readonly=0)
Oct  2 08:43:43 np0005466012 nova_compute[192063]: 2025-10-02 12:43:43.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:44 np0005466012 nova_compute[192063]: 2025-10-02 12:43:44.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:44 np0005466012 nova_compute[192063]: 2025-10-02 12:43:44.196 2 DEBUG nova.compute.manager [req-b8623d38-4f2e-402d-beee-221561f1b23d req-fb2e0111-af3f-49b0-a830-5f4fccd3142f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received event network-changed-555081c9-856e-4461-954c-839e380351df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:43:44 np0005466012 nova_compute[192063]: 2025-10-02 12:43:44.198 2 DEBUG nova.compute.manager [req-b8623d38-4f2e-402d-beee-221561f1b23d req-fb2e0111-af3f-49b0-a830-5f4fccd3142f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Refreshing instance network info cache due to event network-changed-555081c9-856e-4461-954c-839e380351df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:43:44 np0005466012 nova_compute[192063]: 2025-10-02 12:43:44.199 2 DEBUG oslo_concurrency.lockutils [req-b8623d38-4f2e-402d-beee-221561f1b23d req-fb2e0111-af3f-49b0-a830-5f4fccd3142f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:44 np0005466012 nova_compute[192063]: 2025-10-02 12:43:44.199 2 DEBUG oslo_concurrency.lockutils [req-b8623d38-4f2e-402d-beee-221561f1b23d req-fb2e0111-af3f-49b0-a830-5f4fccd3142f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:44 np0005466012 nova_compute[192063]: 2025-10-02 12:43:44.200 2 DEBUG nova.network.neutron [req-b8623d38-4f2e-402d-beee-221561f1b23d req-fb2e0111-af3f-49b0-a830-5f4fccd3142f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Refreshing network info cache for port 555081c9-856e-4461-954c-839e380351df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:43:45 np0005466012 nova_compute[192063]: 2025-10-02 12:43:45.489 2 DEBUG nova.network.neutron [req-b8623d38-4f2e-402d-beee-221561f1b23d req-fb2e0111-af3f-49b0-a830-5f4fccd3142f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Updated VIF entry in instance network info cache for port 555081c9-856e-4461-954c-839e380351df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:43:45 np0005466012 nova_compute[192063]: 2025-10-02 12:43:45.491 2 DEBUG nova.network.neutron [req-b8623d38-4f2e-402d-beee-221561f1b23d req-fb2e0111-af3f-49b0-a830-5f4fccd3142f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Updating instance_info_cache with network_info: [{"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:45 np0005466012 nova_compute[192063]: 2025-10-02 12:43:45.512 2 DEBUG oslo_concurrency.lockutils [req-b8623d38-4f2e-402d-beee-221561f1b23d req-fb2e0111-af3f-49b0-a830-5f4fccd3142f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:46 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:46Z|00719|binding|INFO|Releasing lport 1603bf36-dd98-4360-86e1-f96da48d6efd from this chassis (sb_readonly=0)
Oct  2 08:43:46 np0005466012 nova_compute[192063]: 2025-10-02 12:43:46.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:46 np0005466012 nova_compute[192063]: 2025-10-02 12:43:46.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:48 np0005466012 podman[251172]: 2025-10-02 12:43:48.145284907 +0000 UTC m=+0.056829677 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:43:48 np0005466012 podman[251171]: 2025-10-02 12:43:48.145732169 +0000 UTC m=+0.059780580 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct  2 08:43:48 np0005466012 nova_compute[192063]: 2025-10-02 12:43:48.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:48 np0005466012 nova_compute[192063]: 2025-10-02 12:43:48.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:48 np0005466012 nova_compute[192063]: 2025-10-02 12:43:48.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:52 np0005466012 nova_compute[192063]: 2025-10-02 12:43:52.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:52 np0005466012 nova_compute[192063]: 2025-10-02 12:43:52.824 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:52 np0005466012 nova_compute[192063]: 2025-10-02 12:43:52.857 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:52 np0005466012 nova_compute[192063]: 2025-10-02 12:43:52.857 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:52 np0005466012 nova_compute[192063]: 2025-10-02 12:43:52.858 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:52 np0005466012 nova_compute[192063]: 2025-10-02 12:43:52.858 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:43:52 np0005466012 nova_compute[192063]: 2025-10-02 12:43:52.936 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:52 np0005466012 nova_compute[192063]: 2025-10-02 12:43:52.994 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:52 np0005466012 nova_compute[192063]: 2025-10-02 12:43:52.995 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.076 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.205 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.206 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5538MB free_disk=73.21389389038086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.206 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.206 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.346 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 9fb63acc-0415-4cd3-83e3-e90b080ebdce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.347 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.347 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:43:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:53Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8e:8e:90 10.100.0.10
Oct  2 08:43:53 np0005466012 ovn_controller[94284]: 2025-10-02T12:43:53Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8e:8e:90 10.100.0.10
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.395 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.411 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.434 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.434 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.630 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409018.6293612, 48f5c1dd-1059-42ea-94ab-15808efc147b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.631 2 INFO nova.compute.manager [-] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.654 2 DEBUG nova.compute.manager [None req-e03f93e5-8e05-4be4-9a5c-5dd63ba16696 - - - - - -] [instance: 48f5c1dd-1059-42ea-94ab-15808efc147b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:43:53 np0005466012 nova_compute[192063]: 2025-10-02 12:43:53.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:56 np0005466012 nova_compute[192063]: 2025-10-02 12:43:56.433 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:56 np0005466012 nova_compute[192063]: 2025-10-02 12:43:56.433 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:43:56 np0005466012 nova_compute[192063]: 2025-10-02 12:43:56.433 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:43:57 np0005466012 nova_compute[192063]: 2025-10-02 12:43:57.432 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:43:57 np0005466012 nova_compute[192063]: 2025-10-02 12:43:57.432 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:43:57 np0005466012 nova_compute[192063]: 2025-10-02 12:43:57.433 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:43:57 np0005466012 nova_compute[192063]: 2025-10-02 12:43:57.433 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9fb63acc-0415-4cd3-83e3-e90b080ebdce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:43:58 np0005466012 nova_compute[192063]: 2025-10-02 12:43:58.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:58 np0005466012 nova_compute[192063]: 2025-10-02 12:43:58.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:43:59 np0005466012 nova_compute[192063]: 2025-10-02 12:43:59.661 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Updating instance_info_cache with network_info: [{"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:43:59 np0005466012 nova_compute[192063]: 2025-10-02 12:43:59.721 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:43:59 np0005466012 nova_compute[192063]: 2025-10-02 12:43:59.722 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:43:59 np0005466012 nova_compute[192063]: 2025-10-02 12:43:59.722 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:43:59 np0005466012 nova_compute[192063]: 2025-10-02 12:43:59.723 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:44:01 np0005466012 podman[251248]: 2025-10-02 12:44:01.171445639 +0000 UTC m=+0.062635505 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:44:01 np0005466012 podman[251247]: 2025-10-02 12:44:01.181542724 +0000 UTC m=+0.073333966 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  2 08:44:01 np0005466012 podman[251246]: 2025-10-02 12:44:01.182218252 +0000 UTC m=+0.077028567 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:44:01 np0005466012 podman[251249]: 2025-10-02 12:44:01.20161756 +0000 UTC m=+0.089367042 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:44:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:44:02.160 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:44:02.160 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:44:02.161 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:03 np0005466012 nova_compute[192063]: 2025-10-02 12:44:03.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:03 np0005466012 nova_compute[192063]: 2025-10-02 12:44:03.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:08 np0005466012 nova_compute[192063]: 2025-10-02 12:44:08.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:08 np0005466012 nova_compute[192063]: 2025-10-02 12:44:08.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:09 np0005466012 nova_compute[192063]: 2025-10-02 12:44:09.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:44:10.024 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:44:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:44:10.025 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:44:10 np0005466012 nova_compute[192063]: 2025-10-02 12:44:10.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:12 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:44:12.027 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:44:12 np0005466012 nova_compute[192063]: 2025-10-02 12:44:12.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:13 np0005466012 nova_compute[192063]: 2025-10-02 12:44:13.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:13 np0005466012 nova_compute[192063]: 2025-10-02 12:44:13.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:14 np0005466012 podman[251330]: 2025-10-02 12:44:14.152558969 +0000 UTC m=+0.062986404 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, config_id=edpm, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., release=1755695350)
Oct  2 08:44:14 np0005466012 podman[251329]: 2025-10-02 12:44:14.165840701 +0000 UTC m=+0.073732957 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:44:18 np0005466012 nova_compute[192063]: 2025-10-02 12:44:18.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:18 np0005466012 nova_compute[192063]: 2025-10-02 12:44:18.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:19 np0005466012 podman[251371]: 2025-10-02 12:44:19.147548072 +0000 UTC m=+0.057452144 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:44:19 np0005466012 podman[251372]: 2025-10-02 12:44:19.178570656 +0000 UTC m=+0.081908098 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:44:23 np0005466012 nova_compute[192063]: 2025-10-02 12:44:23.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:23 np0005466012 nova_compute[192063]: 2025-10-02 12:44:23.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:28 np0005466012 nova_compute[192063]: 2025-10-02 12:44:28.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:28 np0005466012 nova_compute[192063]: 2025-10-02 12:44:28.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:32 np0005466012 podman[251413]: 2025-10-02 12:44:32.155774211 +0000 UTC m=+0.072534893 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Oct  2 08:44:32 np0005466012 podman[251414]: 2025-10-02 12:44:32.15831892 +0000 UTC m=+0.074413425 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:44:32 np0005466012 podman[251415]: 2025-10-02 12:44:32.173535305 +0000 UTC m=+0.086506015 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:44:32 np0005466012 podman[251416]: 2025-10-02 12:44:32.184587925 +0000 UTC m=+0.096620499 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:44:33 np0005466012 nova_compute[192063]: 2025-10-02 12:44:33.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:33 np0005466012 nova_compute[192063]: 2025-10-02 12:44:33.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:36 np0005466012 nova_compute[192063]: 2025-10-02 12:44:36.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:38 np0005466012 nova_compute[192063]: 2025-10-02 12:44:38.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:38 np0005466012 nova_compute[192063]: 2025-10-02 12:44:38.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:39 np0005466012 nova_compute[192063]: 2025-10-02 12:44:39.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:41 np0005466012 nova_compute[192063]: 2025-10-02 12:44:41.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:43 np0005466012 nova_compute[192063]: 2025-10-02 12:44:43.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:43 np0005466012 nova_compute[192063]: 2025-10-02 12:44:43.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:45 np0005466012 podman[251496]: 2025-10-02 12:44:45.128751002 +0000 UTC m=+0.050612888 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:44:45 np0005466012 podman[251497]: 2025-10-02 12:44:45.138623341 +0000 UTC m=+0.056498898 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-type=git)
Oct  2 08:44:48 np0005466012 nova_compute[192063]: 2025-10-02 12:44:48.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:48 np0005466012 nova_compute[192063]: 2025-10-02 12:44:48.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:48 np0005466012 nova_compute[192063]: 2025-10-02 12:44:48.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:49 np0005466012 nova_compute[192063]: 2025-10-02 12:44:49.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:49 np0005466012 nova_compute[192063]: 2025-10-02 12:44:49.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:49 np0005466012 nova_compute[192063]: 2025-10-02 12:44:49.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:44:50 np0005466012 podman[251539]: 2025-10-02 12:44:50.133891671 +0000 UTC m=+0.050772502 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:44:50 np0005466012 podman[251538]: 2025-10-02 12:44:50.146827634 +0000 UTC m=+0.064008673 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:44:53 np0005466012 nova_compute[192063]: 2025-10-02 12:44:53.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:53 np0005466012 nova_compute[192063]: 2025-10-02 12:44:53.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:53 np0005466012 nova_compute[192063]: 2025-10-02 12:44:53.847 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:53 np0005466012 nova_compute[192063]: 2025-10-02 12:44:53.874 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:53 np0005466012 nova_compute[192063]: 2025-10-02 12:44:53.874 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:53 np0005466012 nova_compute[192063]: 2025-10-02 12:44:53.875 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:53 np0005466012 nova_compute[192063]: 2025-10-02 12:44:53.875 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:44:53 np0005466012 nova_compute[192063]: 2025-10-02 12:44:53.928 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:53 np0005466012 nova_compute[192063]: 2025-10-02 12:44:53.990 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:53 np0005466012 nova_compute[192063]: 2025-10-02 12:44:53.991 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:44:54 np0005466012 nova_compute[192063]: 2025-10-02 12:44:54.048 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:44:54 np0005466012 nova_compute[192063]: 2025-10-02 12:44:54.195 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:44:54 np0005466012 nova_compute[192063]: 2025-10-02 12:44:54.196 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5551MB free_disk=73.21389389038086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:44:54 np0005466012 nova_compute[192063]: 2025-10-02 12:44:54.197 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:44:54 np0005466012 nova_compute[192063]: 2025-10-02 12:44:54.197 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:44:54 np0005466012 nova_compute[192063]: 2025-10-02 12:44:54.254 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance 9fb63acc-0415-4cd3-83e3-e90b080ebdce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:44:54 np0005466012 nova_compute[192063]: 2025-10-02 12:44:54.254 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:44:54 np0005466012 nova_compute[192063]: 2025-10-02 12:44:54.254 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:44:54 np0005466012 nova_compute[192063]: 2025-10-02 12:44:54.301 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:44:54 np0005466012 nova_compute[192063]: 2025-10-02 12:44:54.325 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:44:54 np0005466012 nova_compute[192063]: 2025-10-02 12:44:54.326 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:44:54 np0005466012 nova_compute[192063]: 2025-10-02 12:44:54.327 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:44:55 np0005466012 nova_compute[192063]: 2025-10-02 12:44:55.302 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:56 np0005466012 nova_compute[192063]: 2025-10-02 12:44:56.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:56 np0005466012 nova_compute[192063]: 2025-10-02 12:44:56.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:44:56 np0005466012 nova_compute[192063]: 2025-10-02 12:44:56.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:44:57 np0005466012 nova_compute[192063]: 2025-10-02 12:44:57.450 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:44:57 np0005466012 nova_compute[192063]: 2025-10-02 12:44:57.450 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquired lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:44:57 np0005466012 nova_compute[192063]: 2025-10-02 12:44:57.451 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  2 08:44:57 np0005466012 nova_compute[192063]: 2025-10-02 12:44:57.451 2 DEBUG nova.objects.instance [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9fb63acc-0415-4cd3-83e3-e90b080ebdce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:44:58 np0005466012 nova_compute[192063]: 2025-10-02 12:44:58.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:58 np0005466012 nova_compute[192063]: 2025-10-02 12:44:58.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:44:58 np0005466012 nova_compute[192063]: 2025-10-02 12:44:58.806 2 DEBUG nova.network.neutron [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Updating instance_info_cache with network_info: [{"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:44:58 np0005466012 nova_compute[192063]: 2025-10-02 12:44:58.831 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Releasing lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:44:58 np0005466012 nova_compute[192063]: 2025-10-02 12:44:58.831 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  2 08:44:58 np0005466012 nova_compute[192063]: 2025-10-02 12:44:58.832 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:44:58 np0005466012 nova_compute[192063]: 2025-10-02 12:44:58.832 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:45:01 np0005466012 nova_compute[192063]: 2025-10-02 12:45:01.828 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:02.161 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:02.161 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:02.162 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:02 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:02Z|00720|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  2 08:45:03 np0005466012 podman[251591]: 2025-10-02 12:45:03.14062623 +0000 UTC m=+0.055373727 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:45:03 np0005466012 podman[251590]: 2025-10-02 12:45:03.161825647 +0000 UTC m=+0.081308362 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 08:45:03 np0005466012 podman[251592]: 2025-10-02 12:45:03.1678071 +0000 UTC m=+0.080864731 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:45:03 np0005466012 podman[251589]: 2025-10-02 12:45:03.170144714 +0000 UTC m=+0.090481523 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251001, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:45:03 np0005466012 nova_compute[192063]: 2025-10-02 12:45:03.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:03 np0005466012 nova_compute[192063]: 2025-10-02 12:45:03.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:08 np0005466012 nova_compute[192063]: 2025-10-02 12:45:08.368 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:08 np0005466012 nova_compute[192063]: 2025-10-02 12:45:08.404 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Triggering sync for uuid 9fb63acc-0415-4cd3-83e3-e90b080ebdce _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  2 08:45:08 np0005466012 nova_compute[192063]: 2025-10-02 12:45:08.404 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:08 np0005466012 nova_compute[192063]: 2025-10-02 12:45:08.405 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:08 np0005466012 nova_compute[192063]: 2025-10-02 12:45:08.432 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:08 np0005466012 nova_compute[192063]: 2025-10-02 12:45:08.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:08 np0005466012 nova_compute[192063]: 2025-10-02 12:45:08.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:08.782 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:45:08 np0005466012 nova_compute[192063]: 2025-10-02 12:45:08.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:08 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:08.783 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:45:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:10.784 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:12 np0005466012 nova_compute[192063]: 2025-10-02 12:45:12.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:12 np0005466012 nova_compute[192063]: 2025-10-02 12:45:12.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:45:12 np0005466012 nova_compute[192063]: 2025-10-02 12:45:12.899 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:45:13 np0005466012 nova_compute[192063]: 2025-10-02 12:45:13.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:13 np0005466012 nova_compute[192063]: 2025-10-02 12:45:13.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:16 np0005466012 podman[251677]: 2025-10-02 12:45:16.13388382 +0000 UTC m=+0.048772697 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vcs-type=git, name=ubi9-minimal)
Oct  2 08:45:16 np0005466012 podman[251676]: 2025-10-02 12:45:16.1449249 +0000 UTC m=+0.052817767 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.929 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b3', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'hostId': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.929 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.948 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.read.requests volume: 1109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.949 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '998aef14-fbea-4ddf-b367-36e3f9a8b15f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1109, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-vda', 'timestamp': '2025-10-02T12:45:16.929836', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5cee086-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.606190123, 'message_signature': 'bd829832cb9321c66e84bfb81d0d6ef97847add779028d179d5d8566cb546d1d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': 
'575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-sda', 'timestamp': '2025-10-02T12:45:16.929836', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5ceeacc-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.606190123, 'message_signature': '43ba10144f879f4075c4b6ee2dcc68ee61927eea87c4e560811e3dda96862cd2'}]}, 'timestamp': '2025-10-02 12:45:16.949273', '_unique_id': '82af9fde987342b688f9df48d5b27625'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.950 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.951 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.962 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.962 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3b7660a-5fc4-48e1-bc5d-868fbf30166e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-vda', 'timestamp': '2025-10-02T12:45:16.951092', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5d0fac4-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.627456851, 'message_signature': '1429ece861385765a2be1bc8148e042f160e3d5265708c80458bbd255c0ef569'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 
'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-sda', 'timestamp': '2025-10-02T12:45:16.951092', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5d10532-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.627456851, 'message_signature': '39e69870b67785cf75e23b3b08be7dd857d2a1b87c763800969e812d13c0ed3c'}]}, 'timestamp': '2025-10-02 12:45:16.963047', '_unique_id': 'd34a6c0cc9764c43b42936577a65c9eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.963 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.964 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.964 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.964 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4dae7d1-fe9f-48b4-a413-a2495e0d9b99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-vda', 'timestamp': '2025-10-02T12:45:16.964405', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5d142ae-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.627456851, 'message_signature': '1edd768d2abaa65c33b4523ad72edceda6f7b052ebe1eede2d780575856c5489'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 
'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-sda', 'timestamp': '2025-10-02T12:45:16.964405', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5d14b50-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.627456851, 'message_signature': '0cfaf37dd8f3cc448084b8a0febe7a41e8381c5879873d0c710790840c230dbd'}]}, 'timestamp': '2025-10-02 12:45:16.964840', '_unique_id': '4737d1769eca48b593cf94da59359a0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.965 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.968 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9fb63acc-0415-4cd3-83e3-e90b080ebdce / tap555081c9-85 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.968 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f33532f-e2ff-4bfb-bda9-0cec5ed7895f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b3-9fb63acc-0415-4cd3-83e3-e90b080ebdce-tap555081c9-85', 'timestamp': '2025-10-02T12:45:16.966011', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'tap555081c9-85', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:8e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap555081c9-85'}, 'message_id': 'a5d1ef88-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.642385667, 'message_signature': '0153630f98e3ce15537d8c6426bc19988c16fd79e4ebdf6d2ba14ae934226f83'}]}, 'timestamp': '2025-10-02 12:45:16.969149', '_unique_id': 'fd5a3a75127a4d0eb47987856fa05f40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.969 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.970 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.970 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.write.latency volume: 31123049574 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.970 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6e5718f-f8b8-4944-883d-b194490a7998', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31123049574, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-vda', 'timestamp': '2025-10-02T12:45:16.970588', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5d23510-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.606190123, 'message_signature': 'f966055369da0c8744fb62a988f53ddb64791e5e606252c02834ea934fc89e87'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-sda', 'timestamp': '2025-10-02T12:45:16.970588', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5d23d08-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.606190123, 'message_signature': '8d8048879e07802d20dc816a7f9a6dd7c24cf7fdcd9a4fd222429f42f6b119ee'}]}, 'timestamp': '2025-10-02 12:45:16.971024', '_unique_id': 'f546b3ea580d4242ab94cdcde7a29fa1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.971 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27283319-50e7-4801-ab0a-b48718288d9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b3-9fb63acc-0415-4cd3-83e3-e90b080ebdce-tap555081c9-85', 'timestamp': '2025-10-02T12:45:16.972164', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'tap555081c9-85', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:8e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap555081c9-85'}, 'message_id': 'a5d271d8-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.642385667, 'message_signature': 'bce504d08b6a1553c57fd87d12b4894ac8d20c51cf82daa53e7a5138ec7c7d3d'}]}, 'timestamp': '2025-10-02 12:45:16.972424', '_unique_id': 'a4023a9460ab4446b4a00de581603089'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.972 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.973 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.973 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.973 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260>]
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.973 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.992 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/cpu volume: 11270000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd69e7bec-c7c7-4dda-b198-7d3c38a7fadb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11270000000, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'timestamp': '2025-10-02T12:45:16.973983', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a5d58d96-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.668661193, 'message_signature': 'ab7e9560d8e0dd33648e4ff0b5c85a0a47fa081cd336580be3736b8ba177ea4b'}]}, 'timestamp': '2025-10-02 12:45:16.992866', '_unique_id': '5546c95ffb9a4e7da1ca4a15f00987e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.993 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.994 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.994 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '569b6da2-6c5c-4dd5-b8c5-07ad7f4c3ef9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b3-9fb63acc-0415-4cd3-83e3-e90b080ebdce-tap555081c9-85', 'timestamp': '2025-10-02T12:45:16.994408', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'tap555081c9-85', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:8e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap555081c9-85'}, 'message_id': 'a5d5d86e-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.642385667, 'message_signature': 'e2f8850a6fed056665b95917231f318c0ab41e8c31650e6f8b5bfa177a59edde'}]}, 'timestamp': '2025-10-02 12:45:16.994754', '_unique_id': '5b17844f12c3439891389e77fd0ec819'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.995 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.996 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.996 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260>]
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.996 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad35a6e8-c244-4bf9-9e0d-37db809acfba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b3-9fb63acc-0415-4cd3-83e3-e90b080ebdce-tap555081c9-85', 'timestamp': '2025-10-02T12:45:16.996639', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'tap555081c9-85', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:8e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap555081c9-85'}, 'message_id': 'a5d630a2-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.642385667, 'message_signature': '0f06f97b0135be2723ea9276379e0d2f12df76dabc9ba179707007c8acf929e2'}]}, 'timestamp': '2025-10-02 12:45:16.996982', '_unique_id': '81c8e9322c794d509a370f9d1d8a7232'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.997 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.998 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.998 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.write.requests volume: 351 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.998 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b53309ab-31fa-4f6d-a8cb-c36ce259ef12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 351, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-vda', 'timestamp': '2025-10-02T12:45:16.998584', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5d67c42-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.606190123, 'message_signature': 'f232ee16a683c8f41ad6bf2c9d40a33ab943727ece5b570f08ac0b8936855fee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': 
'575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-sda', 'timestamp': '2025-10-02T12:45:16.998584', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5d6878c-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.606190123, 'message_signature': '290842936dd964ea48c4e29b8e187351e9e38092aa3fd98009890bb2afc6ed81'}]}, 'timestamp': '2025-10-02 12:45:16.999193', '_unique_id': '98f6bbb43fca4a09ae26f440b3ad0997'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:16.999 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.000 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.000 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.read.latency volume: 582810734 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.read.latency volume: 37455057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af8a5ee0-2dd5-421e-8bdb-59a0d5e82664', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 582810734, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-vda', 'timestamp': '2025-10-02T12:45:17.000679', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5d6ce40-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.606190123, 'message_signature': '317c79ba15302d21be3262ca45865fde27de0f790d5892c29cf7b0aad86c3236'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 37455057, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-sda', 'timestamp': '2025-10-02T12:45:17.000679', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5d6d976-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.606190123, 'message_signature': '2dbcddab84080594c11f7c9273cfbc0c6267801a612c44995c2ff63f01cc4c7b'}]}, 'timestamp': '2025-10-02 12:45:17.001287', '_unique_id': 'c2a714e804f8497f809c4bf9020c4a08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.001 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.002 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/network.incoming.packets volume: 101 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '713c2788-f9be-491d-bf90-d257e39a36fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 101, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b3-9fb63acc-0415-4cd3-83e3-e90b080ebdce-tap555081c9-85', 'timestamp': '2025-10-02T12:45:17.002810', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'tap555081c9-85', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:8e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap555081c9-85'}, 'message_id': 'a5d7207a-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.642385667, 'message_signature': '06b4f5b693652716196a3db6e41ecf70175a8e2ea9f2fa69b40d347421695f93'}]}, 'timestamp': '2025-10-02 12:45:17.003121', '_unique_id': 'd5310fe2504e4468af11503dc35f1ae0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.003 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.004 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.004 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.004 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '155f4a15-dadb-4038-8a65-e78ccc83655d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-vda', 'timestamp': '2025-10-02T12:45:17.004563', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5d76594-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.627456851, 'message_signature': '1160516c40d13ca8f1e57440ce6d495efaf776e46ba28b2ad355e50a67ec9d9e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-sda', 'timestamp': '2025-10-02T12:45:17.004563', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5d770ca-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.627456851, 'message_signature': 'ad26b9a7da6107149b5eda16d1e78f01b72b2c4c37df514d54cbdd434fe3acbd'}]}, 'timestamp': '2025-10-02 12:45:17.005163', '_unique_id': '9c56f3ccf892494195648734f9b2673f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.005 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.006 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.006 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/network.incoming.bytes volume: 18894 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b0029a2-d1ab-4a02-9791-a815cb4a05f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 18894, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b3-9fb63acc-0415-4cd3-83e3-e90b080ebdce-tap555081c9-85', 'timestamp': '2025-10-02T12:45:17.006657', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'tap555081c9-85', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:8e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap555081c9-85'}, 'message_id': 'a5d7b8a0-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.642385667, 'message_signature': 'dbd5f46cd50537b60568a9393731b61406e30f8128587b98c4347bb8fb726773'}]}, 'timestamp': '2025-10-02 12:45:17.007017', '_unique_id': '479103bbfe36477da3e53c4910f4bb1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.007 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.008 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.008 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/network.outgoing.packets volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0cad098-f360-4e0f-8a2c-6d8c2e657126', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 108, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b3-9fb63acc-0415-4cd3-83e3-e90b080ebdce-tap555081c9-85', 'timestamp': '2025-10-02T12:45:17.008454', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'tap555081c9-85', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:8e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap555081c9-85'}, 'message_id': 'a5d7fcca-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.642385667, 'message_signature': 'dbc71468cb772738667eb3fb8ce14648c0758779f48f9e399f869775f60733c1'}]}, 'timestamp': '2025-10-02 12:45:17.008784', '_unique_id': 'd1f6cdaf9a5e4da0962c051f5d16775f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.009 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.010 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.010 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.010 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260>]
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.010 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.010 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/memory.usage volume: 42.484375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4a4e431-918f-4834-8b52-ec22ed638ff9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.484375, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'timestamp': '2025-10-02T12:45:17.010755', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'a5d85724-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.668661193, 'message_signature': '2beda6180b7bc86e1aadc3a94fd8d98f92b910f12068a5598e91355e53cd6b36'}]}, 'timestamp': '2025-10-02 12:45:17.011063', '_unique_id': 'b722d2a9a34b4cda9d31130b37675969'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.011 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.012 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10241c93-be06-4f99-8edf-03711da77de0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b3-9fb63acc-0415-4cd3-83e3-e90b080ebdce-tap555081c9-85', 'timestamp': '2025-10-02T12:45:17.012489', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'tap555081c9-85', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:8e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap555081c9-85'}, 'message_id': 'a5d89a72-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.642385667, 'message_signature': '55a3a81dd39e8186e4750190a8bea9f94483cff5154b5d583adda621a6874c83'}]}, 'timestamp': '2025-10-02 12:45:17.012821', '_unique_id': '9ee15f10393a4f69b8cb5fd06d040484'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.013 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.014 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.014 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.read.bytes volume: 30837248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.014 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cc6ce93-7dfe-4f08-af74-e25290f605b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30837248, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-vda', 'timestamp': '2025-10-02T12:45:17.014279', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5d8e040-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.606190123, 'message_signature': '30c4e18734a353c1f932c63927a7d9e6c2e8a9f5331b12cd99b13b23e919d5ec'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-sda', 'timestamp': '2025-10-02T12:45:17.014279', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5d8ec20-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.606190123, 'message_signature': 'a43afacc5511b5396fdb98e29060b53d8fd48516f34614fd79307d430063cd9b'}]}, 'timestamp': '2025-10-02 12:45:17.014875', '_unique_id': '04254db7140f4e7ea003d021d1118f59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.015 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.016 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.016 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260>]
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.016 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/network.outgoing.bytes volume: 15440 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '168d298a-7eec-4666-af05-5a4ff24f18c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 15440, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b3-9fb63acc-0415-4cd3-83e3-e90b080ebdce-tap555081c9-85', 'timestamp': '2025-10-02T12:45:17.016925', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'tap555081c9-85', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:8e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap555081c9-85'}, 'message_id': 'a5d94800-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.642385667, 'message_signature': '29aee395d400ba1ef06eb2f5261914f3693a6c617c41b3cdaa32cb6c14660aa9'}]}, 'timestamp': '2025-10-02 12:45:17.017239', '_unique_id': '3c59d0ee659043d7856c060b15a2e916'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.017 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.018 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.018 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e44ee15-5d68-4de4-822f-5270de1938b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': 'instance-000000b3-9fb63acc-0415-4cd3-83e3-e90b080ebdce-tap555081c9-85', 'timestamp': '2025-10-02T12:45:17.018690', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'tap555081c9-85', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:8e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap555081c9-85'}, 'message_id': 'a5d98dba-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.642385667, 'message_signature': '2603d9c02f561f82eb8336b1526b43d3f082136796da3f8aa19d76ea61d58008'}]}, 'timestamp': '2025-10-02 12:45:17.019025', '_unique_id': 'c164d57b8ba64aab9498d5c31e6333ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.019 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.020 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.020 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.write.bytes volume: 73121792 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.020 12 DEBUG ceilometer.compute.pollsters [-] 9fb63acc-0415-4cd3-83e3-e90b080ebdce/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5111c94-3729-499e-ae5a-73dd8b08a003', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73121792, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-vda', 'timestamp': '2025-10-02T12:45:17.020457', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5d9d194-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.606190123, 'message_signature': '3f80f04f81c1e613d729a3d9ead21cc0faa726cb3ffac43c22ddc1293604e3d8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_name': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_name': None, 'resource_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce-sda', 'timestamp': '2025-10-02T12:45:17.020457', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260', 'name': 'instance-000000b3', 'instance_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'instance_type': 'm1.nano', 'host': '5504e26544a3f6f2cc2fd19ef74ad7b2f199c2e081485bca1707b6f5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5d9ddd8-9f8d-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7110.606190123, 'message_signature': 'd9e77ea0ef8c79d25be5ea51cec036c8afa0b9feceeb9627b55d949e1c9a98d2'}]}, 'timestamp': '2025-10-02 12:45:17.021060', '_unique_id': 'd1510c9819fc4ff58de6b07a25dc5dff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:45:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:45:17.021 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:45:18 np0005466012 nova_compute[192063]: 2025-10-02 12:45:18.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:18 np0005466012 nova_compute[192063]: 2025-10-02 12:45:18.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:18 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:18Z|00721|binding|INFO|Releasing lport 1603bf36-dd98-4360-86e1-f96da48d6efd from this chassis (sb_readonly=0)
Oct  2 08:45:18 np0005466012 nova_compute[192063]: 2025-10-02 12:45:18.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.739 2 DEBUG nova.compute.manager [req-a00f19f4-984d-4649-8c96-a944bc9e2164 req-792b1ca4-a46a-4a1e-bb74-698b1aa93f64 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received event network-changed-555081c9-856e-4461-954c-839e380351df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.739 2 DEBUG nova.compute.manager [req-a00f19f4-984d-4649-8c96-a944bc9e2164 req-792b1ca4-a46a-4a1e-bb74-698b1aa93f64 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Refreshing instance network info cache due to event network-changed-555081c9-856e-4461-954c-839e380351df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.740 2 DEBUG oslo_concurrency.lockutils [req-a00f19f4-984d-4649-8c96-a944bc9e2164 req-792b1ca4-a46a-4a1e-bb74-698b1aa93f64 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.740 2 DEBUG oslo_concurrency.lockutils [req-a00f19f4-984d-4649-8c96-a944bc9e2164 req-792b1ca4-a46a-4a1e-bb74-698b1aa93f64 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.740 2 DEBUG nova.network.neutron [req-a00f19f4-984d-4649-8c96-a944bc9e2164 req-792b1ca4-a46a-4a1e-bb74-698b1aa93f64 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Refreshing network info cache for port 555081c9-856e-4461-954c-839e380351df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.873 2 DEBUG oslo_concurrency.lockutils [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.873 2 DEBUG oslo_concurrency.lockutils [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.873 2 DEBUG oslo_concurrency.lockutils [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.874 2 DEBUG oslo_concurrency.lockutils [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.874 2 DEBUG oslo_concurrency.lockutils [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.885 2 INFO nova.compute.manager [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Terminating instance#033[00m
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.899 2 DEBUG nova.compute.manager [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:45:20 np0005466012 kernel: tap555081c9-85 (unregistering): left promiscuous mode
Oct  2 08:45:20 np0005466012 NetworkManager[51207]: <info>  [1759409120.9205] device (tap555081c9-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:45:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:20Z|00722|binding|INFO|Releasing lport 555081c9-856e-4461-954c-839e380351df from this chassis (sb_readonly=0)
Oct  2 08:45:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:20Z|00723|binding|INFO|Setting lport 555081c9-856e-4461-954c-839e380351df down in Southbound
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:20 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:20Z|00724|binding|INFO|Removing iface tap555081c9-85 ovn-installed in OVS
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:20.936 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:8e:90 10.100.0.10'], port_security=['fa:16:3e:8e:8e:90 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22233733-d0b0-4cf4-92ea-672ceac870ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a829918-051f-4369-a187-9d7b48de1d0d fcec2fcb-4394-44c1-a474-f605cfffa19d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e315cbae-ba84-4a8d-94d7-bde2402f66e9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=555081c9-856e-4461-954c-839e380351df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:45:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:20.938 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 555081c9-856e-4461-954c-839e380351df in datapath 22233733-d0b0-4cf4-92ea-672ceac870ca unbound from our chassis#033[00m
Oct  2 08:45:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:20.939 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 22233733-d0b0-4cf4-92ea-672ceac870ca, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:45:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:20.942 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d3d30ee8-2973-4c59-a4c6-8a3e9c4f3f66]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:20.943 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca namespace which is not needed anymore#033[00m
Oct  2 08:45:20 np0005466012 nova_compute[192063]: 2025-10-02 12:45:20.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:20 np0005466012 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000b3.scope: Deactivated successfully.
Oct  2 08:45:20 np0005466012 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000b3.scope: Consumed 16.006s CPU time.
Oct  2 08:45:20 np0005466012 systemd-machined[152114]: Machine qemu-80-instance-000000b3 terminated.
Oct  2 08:45:21 np0005466012 podman[251721]: 2025-10-02 12:45:21.008983281 +0000 UTC m=+0.063991061 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:45:21 np0005466012 podman[251717]: 2025-10-02 12:45:21.009680881 +0000 UTC m=+0.068255028 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid)
Oct  2 08:45:21 np0005466012 neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca[251114]: [NOTICE]   (251118) : haproxy version is 2.8.14-c23fe91
Oct  2 08:45:21 np0005466012 neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca[251114]: [NOTICE]   (251118) : path to executable is /usr/sbin/haproxy
Oct  2 08:45:21 np0005466012 neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca[251114]: [WARNING]  (251118) : Exiting Master process...
Oct  2 08:45:21 np0005466012 neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca[251114]: [WARNING]  (251118) : Exiting Master process...
Oct  2 08:45:21 np0005466012 neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca[251114]: [ALERT]    (251118) : Current worker (251120) exited with code 143 (Terminated)
Oct  2 08:45:21 np0005466012 neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca[251114]: [WARNING]  (251118) : All workers exited. Exiting... (0)
Oct  2 08:45:21 np0005466012 systemd[1]: libpod-fdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248.scope: Deactivated successfully.
Oct  2 08:45:21 np0005466012 podman[251779]: 2025-10-02 12:45:21.075961263 +0000 UTC m=+0.041618313 container died fdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:45:21 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248-userdata-shm.mount: Deactivated successfully.
Oct  2 08:45:21 np0005466012 systemd[1]: var-lib-containers-storage-overlay-6a2adcff13a07fc879c47b577827924f1a7d83cb24c7f465ad6bf8bc9c9d5fbc-merged.mount: Deactivated successfully.
Oct  2 08:45:21 np0005466012 kernel: tap555081c9-85: entered promiscuous mode
Oct  2 08:45:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:21Z|00725|binding|INFO|Claiming lport 555081c9-856e-4461-954c-839e380351df for this chassis.
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:21Z|00726|binding|INFO|555081c9-856e-4461-954c-839e380351df: Claiming fa:16:3e:8e:8e:90 10.100.0.10
Oct  2 08:45:21 np0005466012 kernel: tap555081c9-85 (unregistering): left promiscuous mode
Oct  2 08:45:21 np0005466012 podman[251779]: 2025-10-02 12:45:21.12140154 +0000 UTC m=+0.087058570 container cleanup fdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:45:21 np0005466012 NetworkManager[51207]: <info>  [1759409121.1230] manager: (tap555081c9-85): new Tun device (/org/freedesktop/NetworkManager/Devices/342)
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.124 2 DEBUG nova.compute.manager [req-0334bb89-0ab1-4ab2-bf09-1792cc728965 req-a2dd2883-6bf0-4dbf-aaee-321ce3da3e90 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received event network-vif-unplugged-555081c9-856e-4461-954c-839e380351df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.124 2 DEBUG oslo_concurrency.lockutils [req-0334bb89-0ab1-4ab2-bf09-1792cc728965 req-a2dd2883-6bf0-4dbf-aaee-321ce3da3e90 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.125 2 DEBUG oslo_concurrency.lockutils [req-0334bb89-0ab1-4ab2-bf09-1792cc728965 req-a2dd2883-6bf0-4dbf-aaee-321ce3da3e90 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.125 2 DEBUG oslo_concurrency.lockutils [req-0334bb89-0ab1-4ab2-bf09-1792cc728965 req-a2dd2883-6bf0-4dbf-aaee-321ce3da3e90 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.125 2 DEBUG nova.compute.manager [req-0334bb89-0ab1-4ab2-bf09-1792cc728965 req-a2dd2883-6bf0-4dbf-aaee-321ce3da3e90 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] No waiting events found dispatching network-vif-unplugged-555081c9-856e-4461-954c-839e380351df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.125 2 DEBUG nova.compute.manager [req-0334bb89-0ab1-4ab2-bf09-1792cc728965 req-a2dd2883-6bf0-4dbf-aaee-321ce3da3e90 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received event network-vif-unplugged-555081c9-856e-4461-954c-839e380351df for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.129 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:8e:90 10.100.0.10'], port_security=['fa:16:3e:8e:8e:90 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22233733-d0b0-4cf4-92ea-672ceac870ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a829918-051f-4369-a187-9d7b48de1d0d fcec2fcb-4394-44c1-a474-f605cfffa19d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e315cbae-ba84-4a8d-94d7-bde2402f66e9, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=555081c9-856e-4461-954c-839e380351df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:45:21 np0005466012 systemd[1]: libpod-conmon-fdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248.scope: Deactivated successfully.
Oct  2 08:45:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:21Z|00727|binding|INFO|Setting lport 555081c9-856e-4461-954c-839e380351df ovn-installed in OVS
Oct  2 08:45:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:21Z|00728|binding|INFO|Setting lport 555081c9-856e-4461-954c-839e380351df up in Southbound
Oct  2 08:45:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:21Z|00729|binding|INFO|Releasing lport 555081c9-856e-4461-954c-839e380351df from this chassis (sb_readonly=1)
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:21Z|00730|if_status|INFO|Dropped 1 log messages in last 1393 seconds (most recently, 1393 seconds ago) due to excessive rate
Oct  2 08:45:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:21Z|00731|if_status|INFO|Not setting lport 555081c9-856e-4461-954c-839e380351df down as sb is readonly
Oct  2 08:45:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:21Z|00732|binding|INFO|Removing iface tap555081c9-85 ovn-installed in OVS
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:21Z|00733|binding|INFO|Releasing lport 555081c9-856e-4461-954c-839e380351df from this chassis (sb_readonly=0)
Oct  2 08:45:21 np0005466012 ovn_controller[94284]: 2025-10-02T12:45:21Z|00734|binding|INFO|Setting lport 555081c9-856e-4461-954c-839e380351df down in Southbound
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.149 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:8e:90 10.100.0.10'], port_security=['fa:16:3e:8e:8e:90 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9fb63acc-0415-4cd3-83e3-e90b080ebdce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22233733-d0b0-4cf4-92ea-672ceac870ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a829918-051f-4369-a187-9d7b48de1d0d fcec2fcb-4394-44c1-a474-f605cfffa19d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e315cbae-ba84-4a8d-94d7-bde2402f66e9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=555081c9-856e-4461-954c-839e380351df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.187 2 INFO nova.virt.libvirt.driver [-] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Instance destroyed successfully.#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.189 2 DEBUG nova.objects.instance [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'resources' on Instance uuid 9fb63acc-0415-4cd3-83e3-e90b080ebdce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:45:21 np0005466012 podman[251812]: 2025-10-02 12:45:21.195845465 +0000 UTC m=+0.047218856 container remove fdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.204 2 DEBUG nova.virt.libvirt.vif [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:43:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-440768260',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ac',id=179,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBsU7axu6K0ttkyt58Oe7H69HarowC1Z/KycD7YkRVEhtB0exOJVQoE9wmWsvBsBydwCp/7CFEB2lgZnOsTyjcv0xdeMoagqDLiiz0dNIOFUm8UmcCkDJyem1g0M/sbuQA==',key_name='tempest-TestSecurityGroupsBasicOps-1428281449',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:43:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-yz7t5j9w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:43:39Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=9fb63acc-0415-4cd3-83e3-e90b080ebdce,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.204 2 DEBUG nova.network.os_vif_util [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.203 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[44d66b71-91e6-4692-836b-e9544e01f7ee]: (4, ('Thu Oct  2 12:45:21 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca (fdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248)\nfdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248\nThu Oct  2 12:45:21 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca (fdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248)\nfdf8543ea434db23f5d2b5d78c7877622a73eafbead466ee0063c0ce058d8248\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.205 2 DEBUG nova.network.os_vif_util [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8e:8e:90,bridge_name='br-int',has_traffic_filtering=True,id=555081c9-856e-4461-954c-839e380351df,network=Network(22233733-d0b0-4cf4-92ea-672ceac870ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap555081c9-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.206 2 DEBUG os_vif [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:8e:90,bridge_name='br-int',has_traffic_filtering=True,id=555081c9-856e-4461-954c-839e380351df,network=Network(22233733-d0b0-4cf4-92ea-672ceac870ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap555081c9-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.205 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8fc917-355d-4363-88f0-855aa796c79e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.206 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22233733-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:21 np0005466012 kernel: tap22233733-d0: left promiscuous mode
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.210 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap555081c9-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.226 2 INFO os_vif [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:8e:90,bridge_name='br-int',has_traffic_filtering=True,id=555081c9-856e-4461-954c-839e380351df,network=Network(22233733-d0b0-4cf4-92ea-672ceac870ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap555081c9-85')#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.227 2 INFO nova.virt.libvirt.driver [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Deleting instance files /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce_del#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.226 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3685f71c-0e49-4f3b-b08b-683cb5b0c1ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.227 2 INFO nova.virt.libvirt.driver [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Deletion of /var/lib/nova/instances/9fb63acc-0415-4cd3-83e3-e90b080ebdce_del complete#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.252 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[5a0c87f2-76c4-4ec2-b720-7521c1f5c7b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.253 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f971330a-727f-4e8e-bb61-ac1364db311b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.269 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[af1e4ed4-0af4-4e60-8772-a829c2f344b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 701248, 'reachable_time': 27300, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251840, 'error': None, 'target': 'ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.275 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-22233733-d0b0-4cf4-92ea-672ceac870ca deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.275 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[0e6eff25-776e-4933-b0b1-81a41a4dd07e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:21 np0005466012 systemd[1]: run-netns-ovnmeta\x2d22233733\x2dd0b0\x2d4cf4\x2d92ea\x2d672ceac870ca.mount: Deactivated successfully.
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.277 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 555081c9-856e-4461-954c-839e380351df in datapath 22233733-d0b0-4cf4-92ea-672ceac870ca unbound from our chassis#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.277 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 22233733-d0b0-4cf4-92ea-672ceac870ca, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.278 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[2efcb5d9-e779-4e7a-9d6e-acc87cf26f4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.279 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 555081c9-856e-4461-954c-839e380351df in datapath 22233733-d0b0-4cf4-92ea-672ceac870ca unbound from our chassis#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.280 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 22233733-d0b0-4cf4-92ea-672ceac870ca, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:45:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:45:21.280 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[d79bf925-e3fd-486a-b1bd-1eb16e4af097]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.294 2 INFO nova.compute.manager [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.295 2 DEBUG oslo.service.loopingcall [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.295 2 DEBUG nova.compute.manager [-] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:45:21 np0005466012 nova_compute[192063]: 2025-10-02 12:45:21.295 2 DEBUG nova.network.neutron [-] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:45:22 np0005466012 nova_compute[192063]: 2025-10-02 12:45:22.030 2 DEBUG nova.network.neutron [-] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:45:22 np0005466012 nova_compute[192063]: 2025-10-02 12:45:22.051 2 INFO nova.compute.manager [-] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Took 0.76 seconds to deallocate network for instance.#033[00m
Oct  2 08:45:22 np0005466012 nova_compute[192063]: 2025-10-02 12:45:22.138 2 DEBUG oslo_concurrency.lockutils [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:22 np0005466012 nova_compute[192063]: 2025-10-02 12:45:22.139 2 DEBUG oslo_concurrency.lockutils [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:22 np0005466012 nova_compute[192063]: 2025-10-02 12:45:22.239 2 DEBUG nova.compute.provider_tree [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:45:22 np0005466012 nova_compute[192063]: 2025-10-02 12:45:22.264 2 DEBUG nova.scheduler.client.report [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:45:22 np0005466012 nova_compute[192063]: 2025-10-02 12:45:22.287 2 DEBUG oslo_concurrency.lockutils [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:22 np0005466012 nova_compute[192063]: 2025-10-02 12:45:22.346 2 INFO nova.scheduler.client.report [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Deleted allocations for instance 9fb63acc-0415-4cd3-83e3-e90b080ebdce#033[00m
Oct  2 08:45:22 np0005466012 nova_compute[192063]: 2025-10-02 12:45:22.435 2 DEBUG oslo_concurrency.lockutils [None req-0cf3ac0e-702c-489b-bdd1-c171575a86c4 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.562s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:22 np0005466012 nova_compute[192063]: 2025-10-02 12:45:22.550 2 DEBUG nova.network.neutron [req-a00f19f4-984d-4649-8c96-a944bc9e2164 req-792b1ca4-a46a-4a1e-bb74-698b1aa93f64 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Updated VIF entry in instance network info cache for port 555081c9-856e-4461-954c-839e380351df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:45:22 np0005466012 nova_compute[192063]: 2025-10-02 12:45:22.551 2 DEBUG nova.network.neutron [req-a00f19f4-984d-4649-8c96-a944bc9e2164 req-792b1ca4-a46a-4a1e-bb74-698b1aa93f64 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Updating instance_info_cache with network_info: [{"id": "555081c9-856e-4461-954c-839e380351df", "address": "fa:16:3e:8e:8e:90", "network": {"id": "22233733-d0b0-4cf4-92ea-672ceac870ca", "bridge": "br-int", "label": "tempest-network-smoke--2002423694", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap555081c9-85", "ovs_interfaceid": "555081c9-856e-4461-954c-839e380351df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:45:22 np0005466012 nova_compute[192063]: 2025-10-02 12:45:22.569 2 DEBUG oslo_concurrency.lockutils [req-a00f19f4-984d-4649-8c96-a944bc9e2164 req-792b1ca4-a46a-4a1e-bb74-698b1aa93f64 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-9fb63acc-0415-4cd3-83e3-e90b080ebdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:45:22 np0005466012 nova_compute[192063]: 2025-10-02 12:45:22.849 2 DEBUG nova.compute.manager [req-3d06a9a8-5539-47d5-ba7a-e4221fe5d89a req-10b9f355-0918-465f-90a4-ce3af2a4b591 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received event network-vif-deleted-555081c9-856e-4461-954c-839e380351df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.334 2 DEBUG nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received event network-vif-plugged-555081c9-856e-4461-954c-839e380351df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.334 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.334 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.335 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.335 2 DEBUG nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] No waiting events found dispatching network-vif-plugged-555081c9-856e-4461-954c-839e380351df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.335 2 WARNING nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received unexpected event network-vif-plugged-555081c9-856e-4461-954c-839e380351df for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.335 2 DEBUG nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received event network-vif-plugged-555081c9-856e-4461-954c-839e380351df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.335 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.336 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.336 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.336 2 DEBUG nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] No waiting events found dispatching network-vif-plugged-555081c9-856e-4461-954c-839e380351df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.336 2 WARNING nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received unexpected event network-vif-plugged-555081c9-856e-4461-954c-839e380351df for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.336 2 DEBUG nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received event network-vif-plugged-555081c9-856e-4461-954c-839e380351df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.337 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.337 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.337 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.337 2 DEBUG nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] No waiting events found dispatching network-vif-plugged-555081c9-856e-4461-954c-839e380351df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.337 2 WARNING nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received unexpected event network-vif-plugged-555081c9-856e-4461-954c-839e380351df for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.338 2 DEBUG nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received event network-vif-unplugged-555081c9-856e-4461-954c-839e380351df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.338 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.338 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.338 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.338 2 DEBUG nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] No waiting events found dispatching network-vif-unplugged-555081c9-856e-4461-954c-839e380351df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.339 2 WARNING nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received unexpected event network-vif-unplugged-555081c9-856e-4461-954c-839e380351df for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.339 2 DEBUG nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received event network-vif-plugged-555081c9-856e-4461-954c-839e380351df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.339 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.339 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.339 2 DEBUG oslo_concurrency.lockutils [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "9fb63acc-0415-4cd3-83e3-e90b080ebdce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.340 2 DEBUG nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] No waiting events found dispatching network-vif-plugged-555081c9-856e-4461-954c-839e380351df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.340 2 WARNING nova.compute.manager [req-fb7ee4ad-a0f1-44a7-b196-d44ef6bd1459 req-44355ddf-4c15-48ea-a33c-f810f5e2f8a2 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Received unexpected event network-vif-plugged-555081c9-856e-4461-954c-839e380351df for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:45:23 np0005466012 nova_compute[192063]: 2025-10-02 12:45:23.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:26 np0005466012 nova_compute[192063]: 2025-10-02 12:45:26.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:26 np0005466012 nova_compute[192063]: 2025-10-02 12:45:26.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:26 np0005466012 nova_compute[192063]: 2025-10-02 12:45:26.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:28 np0005466012 nova_compute[192063]: 2025-10-02 12:45:28.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:31 np0005466012 nova_compute[192063]: 2025-10-02 12:45:31.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:33 np0005466012 nova_compute[192063]: 2025-10-02 12:45:33.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:34 np0005466012 podman[251845]: 2025-10-02 12:45:34.150663109 +0000 UTC m=+0.066019637 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 08:45:34 np0005466012 podman[251847]: 2025-10-02 12:45:34.154620176 +0000 UTC m=+0.066778157 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:45:34 np0005466012 podman[251846]: 2025-10-02 12:45:34.1712919 +0000 UTC m=+0.086098923 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:45:34 np0005466012 podman[251848]: 2025-10-02 12:45:34.216552641 +0000 UTC m=+0.120119509 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:45:36 np0005466012 nova_compute[192063]: 2025-10-02 12:45:36.186 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409121.1840372, 9fb63acc-0415-4cd3-83e3-e90b080ebdce => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:45:36 np0005466012 nova_compute[192063]: 2025-10-02 12:45:36.186 2 INFO nova.compute.manager [-] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:45:36 np0005466012 nova_compute[192063]: 2025-10-02 12:45:36.205 2 DEBUG nova.compute.manager [None req-e752e9bc-9303-489e-b9c8-c4f9ff1f942a - - - - - -] [instance: 9fb63acc-0415-4cd3-83e3-e90b080ebdce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:45:36 np0005466012 nova_compute[192063]: 2025-10-02 12:45:36.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:36 np0005466012 nova_compute[192063]: 2025-10-02 12:45:36.899 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:38 np0005466012 nova_compute[192063]: 2025-10-02 12:45:38.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:39 np0005466012 nova_compute[192063]: 2025-10-02 12:45:39.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:41 np0005466012 nova_compute[192063]: 2025-10-02 12:45:41.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:41 np0005466012 nova_compute[192063]: 2025-10-02 12:45:41.838 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:42 np0005466012 nova_compute[192063]: 2025-10-02 12:45:42.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:43 np0005466012 nova_compute[192063]: 2025-10-02 12:45:43.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:46 np0005466012 nova_compute[192063]: 2025-10-02 12:45:46.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:47 np0005466012 podman[251933]: 2025-10-02 12:45:47.157125749 +0000 UTC m=+0.072672717 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:45:47 np0005466012 podman[251934]: 2025-10-02 12:45:47.208432285 +0000 UTC m=+0.109418357 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  2 08:45:48 np0005466012 nova_compute[192063]: 2025-10-02 12:45:48.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:50 np0005466012 nova_compute[192063]: 2025-10-02 12:45:50.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:50 np0005466012 nova_compute[192063]: 2025-10-02 12:45:50.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:51 np0005466012 podman[251978]: 2025-10-02 12:45:51.140609708 +0000 UTC m=+0.053718252 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:45:51 np0005466012 podman[251977]: 2025-10-02 12:45:51.158726921 +0000 UTC m=+0.075114394 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251001, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:45:51 np0005466012 nova_compute[192063]: 2025-10-02 12:45:51.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:53 np0005466012 nova_compute[192063]: 2025-10-02 12:45:53.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:53 np0005466012 nova_compute[192063]: 2025-10-02 12:45:53.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:53 np0005466012 nova_compute[192063]: 2025-10-02 12:45:53.846 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:53 np0005466012 nova_compute[192063]: 2025-10-02 12:45:53.847 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:53 np0005466012 nova_compute[192063]: 2025-10-02 12:45:53.847 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:53 np0005466012 nova_compute[192063]: 2025-10-02 12:45:53.848 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:45:53 np0005466012 nova_compute[192063]: 2025-10-02 12:45:53.997 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:45:53 np0005466012 nova_compute[192063]: 2025-10-02 12:45:53.999 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5720MB free_disk=73.2425651550293GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:45:53 np0005466012 nova_compute[192063]: 2025-10-02 12:45:53.999 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:45:54 np0005466012 nova_compute[192063]: 2025-10-02 12:45:53.999 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:45:54 np0005466012 nova_compute[192063]: 2025-10-02 12:45:54.064 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:45:54 np0005466012 nova_compute[192063]: 2025-10-02 12:45:54.065 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:45:54 np0005466012 nova_compute[192063]: 2025-10-02 12:45:54.150 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:45:54 np0005466012 nova_compute[192063]: 2025-10-02 12:45:54.163 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:45:54 np0005466012 nova_compute[192063]: 2025-10-02 12:45:54.178 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:45:54 np0005466012 nova_compute[192063]: 2025-10-02 12:45:54.179 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:45:56 np0005466012 nova_compute[192063]: 2025-10-02 12:45:56.180 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:56 np0005466012 nova_compute[192063]: 2025-10-02 12:45:56.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:57 np0005466012 nova_compute[192063]: 2025-10-02 12:45:57.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:57 np0005466012 nova_compute[192063]: 2025-10-02 12:45:57.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:45:57 np0005466012 nova_compute[192063]: 2025-10-02 12:45:57.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:45:57 np0005466012 nova_compute[192063]: 2025-10-02 12:45:57.836 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:45:58 np0005466012 nova_compute[192063]: 2025-10-02 12:45:58.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:45:59 np0005466012 nova_compute[192063]: 2025-10-02 12:45:59.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:45:59 np0005466012 nova_compute[192063]: 2025-10-02 12:45:59.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:46:01 np0005466012 nova_compute[192063]: 2025-10-02 12:46:01.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:02.162 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:02.162 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:02.163 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:03 np0005466012 nova_compute[192063]: 2025-10-02 12:46:03.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:05 np0005466012 podman[252025]: 2025-10-02 12:46:05.135177246 +0000 UTC m=+0.053897367 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct  2 08:46:05 np0005466012 podman[252024]: 2025-10-02 12:46:05.143673827 +0000 UTC m=+0.063971621 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:46:05 np0005466012 podman[252027]: 2025-10-02 12:46:05.16841733 +0000 UTC m=+0.081265261 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:46:05 np0005466012 podman[252026]: 2025-10-02 12:46:05.168683217 +0000 UTC m=+0.083073901 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:46:06 np0005466012 nova_compute[192063]: 2025-10-02 12:46:06.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:08 np0005466012 nova_compute[192063]: 2025-10-02 12:46:08.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:09 np0005466012 ovn_controller[94284]: 2025-10-02T12:46:09Z|00735|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Oct  2 08:46:11 np0005466012 nova_compute[192063]: 2025-10-02 12:46:11.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:13 np0005466012 nova_compute[192063]: 2025-10-02 12:46:13.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:16 np0005466012 nova_compute[192063]: 2025-10-02 12:46:16.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:18 np0005466012 podman[252108]: 2025-10-02 12:46:18.140014409 +0000 UTC m=+0.056427595 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Oct  2 08:46:18 np0005466012 podman[252109]: 2025-10-02 12:46:18.150653178 +0000 UTC m=+0.060001612 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=)
Oct  2 08:46:18 np0005466012 nova_compute[192063]: 2025-10-02 12:46:18.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:18 np0005466012 nova_compute[192063]: 2025-10-02 12:46:18.911 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:18 np0005466012 nova_compute[192063]: 2025-10-02 12:46:18.912 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:18 np0005466012 nova_compute[192063]: 2025-10-02 12:46:18.934 2 DEBUG nova.compute.manager [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.087 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.087 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.094 2 DEBUG nova.virt.hardware [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.094 2 INFO nova.compute.claims [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.211 2 DEBUG nova.compute.provider_tree [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.228 2 DEBUG nova.scheduler.client.report [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.256 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.256 2 DEBUG nova.compute.manager [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.311 2 DEBUG nova.compute.manager [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.311 2 DEBUG nova.network.neutron [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.336 2 INFO nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.365 2 DEBUG nova.compute.manager [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.574 2 DEBUG nova.policy [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.703 2 DEBUG nova.compute.manager [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.704 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.705 2 INFO nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Creating image(s)#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.705 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "/var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.706 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "/var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.706 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "/var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.721 2 DEBUG oslo_concurrency.processutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.776 2 DEBUG oslo_concurrency.processutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.778 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.779 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.796 2 DEBUG oslo_concurrency.processutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.861 2 DEBUG oslo_concurrency.processutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.862 2 DEBUG oslo_concurrency.processutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.894 2 DEBUG oslo_concurrency.processutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.895 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.895 2 DEBUG oslo_concurrency.processutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.955 2 DEBUG oslo_concurrency.processutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.956 2 DEBUG nova.virt.disk.api [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Checking if we can resize image /var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:46:19 np0005466012 nova_compute[192063]: 2025-10-02 12:46:19.957 2 DEBUG oslo_concurrency.processutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:20 np0005466012 nova_compute[192063]: 2025-10-02 12:46:20.018 2 DEBUG oslo_concurrency.processutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:20 np0005466012 nova_compute[192063]: 2025-10-02 12:46:20.020 2 DEBUG nova.virt.disk.api [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Cannot resize image /var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:46:20 np0005466012 nova_compute[192063]: 2025-10-02 12:46:20.020 2 DEBUG nova.objects.instance [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'migration_context' on Instance uuid 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:46:20 np0005466012 nova_compute[192063]: 2025-10-02 12:46:20.203 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:46:20 np0005466012 nova_compute[192063]: 2025-10-02 12:46:20.203 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Ensure instance console log exists: /var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:46:20 np0005466012 nova_compute[192063]: 2025-10-02 12:46:20.204 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:20 np0005466012 nova_compute[192063]: 2025-10-02 12:46:20.204 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:20 np0005466012 nova_compute[192063]: 2025-10-02 12:46:20.204 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:20.447 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:46:20 np0005466012 nova_compute[192063]: 2025-10-02 12:46:20.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:20 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:20.449 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:46:21 np0005466012 nova_compute[192063]: 2025-10-02 12:46:21.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:21 np0005466012 nova_compute[192063]: 2025-10-02 12:46:21.627 2 DEBUG nova.network.neutron [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Successfully created port: cdf6286f-21ac-44f2-91ce-bd52882502e6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:46:22 np0005466012 podman[252164]: 2025-10-02 12:46:22.147252424 +0000 UTC m=+0.047215495 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:46:22 np0005466012 podman[252163]: 2025-10-02 12:46:22.154451579 +0000 UTC m=+0.057418662 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:46:22 np0005466012 nova_compute[192063]: 2025-10-02 12:46:22.338 2 DEBUG nova.network.neutron [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Successfully updated port: cdf6286f-21ac-44f2-91ce-bd52882502e6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:46:22 np0005466012 nova_compute[192063]: 2025-10-02 12:46:22.353 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "refresh_cache-2ade6d5e-55bd-4ef0-9c58-a67e941ceee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:46:22 np0005466012 nova_compute[192063]: 2025-10-02 12:46:22.354 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquired lock "refresh_cache-2ade6d5e-55bd-4ef0-9c58-a67e941ceee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:46:22 np0005466012 nova_compute[192063]: 2025-10-02 12:46:22.354 2 DEBUG nova.network.neutron [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:46:22 np0005466012 nova_compute[192063]: 2025-10-02 12:46:22.436 2 DEBUG nova.compute.manager [req-061f34b5-099c-4642-97a1-aa97fd7d106d req-aa0b2b61-0f97-4327-a466-f4ad390a884e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Received event network-changed-cdf6286f-21ac-44f2-91ce-bd52882502e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:46:22 np0005466012 nova_compute[192063]: 2025-10-02 12:46:22.436 2 DEBUG nova.compute.manager [req-061f34b5-099c-4642-97a1-aa97fd7d106d req-aa0b2b61-0f97-4327-a466-f4ad390a884e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Refreshing instance network info cache due to event network-changed-cdf6286f-21ac-44f2-91ce-bd52882502e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:46:22 np0005466012 nova_compute[192063]: 2025-10-02 12:46:22.436 2 DEBUG oslo_concurrency.lockutils [req-061f34b5-099c-4642-97a1-aa97fd7d106d req-aa0b2b61-0f97-4327-a466-f4ad390a884e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-2ade6d5e-55bd-4ef0-9c58-a67e941ceee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:46:22 np0005466012 nova_compute[192063]: 2025-10-02 12:46:22.489 2 DEBUG nova.network.neutron [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:46:23 np0005466012 nova_compute[192063]: 2025-10-02 12:46:23.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.607 2 DEBUG nova.network.neutron [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Updating instance_info_cache with network_info: [{"id": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "address": "fa:16:3e:e0:e6:6e", "network": {"id": "85e4aed1-4716-45af-bcd8-38b9aeff1c42", "bridge": "br-int", "label": "tempest-network-smoke--13042790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdf6286f-21", "ovs_interfaceid": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.628 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Releasing lock "refresh_cache-2ade6d5e-55bd-4ef0-9c58-a67e941ceee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.629 2 DEBUG nova.compute.manager [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Instance network_info: |[{"id": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "address": "fa:16:3e:e0:e6:6e", "network": {"id": "85e4aed1-4716-45af-bcd8-38b9aeff1c42", "bridge": "br-int", "label": "tempest-network-smoke--13042790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdf6286f-21", "ovs_interfaceid": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.630 2 DEBUG oslo_concurrency.lockutils [req-061f34b5-099c-4642-97a1-aa97fd7d106d req-aa0b2b61-0f97-4327-a466-f4ad390a884e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-2ade6d5e-55bd-4ef0-9c58-a67e941ceee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.630 2 DEBUG nova.network.neutron [req-061f34b5-099c-4642-97a1-aa97fd7d106d req-aa0b2b61-0f97-4327-a466-f4ad390a884e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Refreshing network info cache for port cdf6286f-21ac-44f2-91ce-bd52882502e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.636 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Start _get_guest_xml network_info=[{"id": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "address": "fa:16:3e:e0:e6:6e", "network": {"id": "85e4aed1-4716-45af-bcd8-38b9aeff1c42", "bridge": "br-int", "label": "tempest-network-smoke--13042790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdf6286f-21", "ovs_interfaceid": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.641 2 WARNING nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.646 2 DEBUG nova.virt.libvirt.host [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.648 2 DEBUG nova.virt.libvirt.host [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.653 2 DEBUG nova.virt.libvirt.host [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.654 2 DEBUG nova.virt.libvirt.host [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.655 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.656 2 DEBUG nova.virt.hardware [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.657 2 DEBUG nova.virt.hardware [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.658 2 DEBUG nova.virt.hardware [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.658 2 DEBUG nova.virt.hardware [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.659 2 DEBUG nova.virt.hardware [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.659 2 DEBUG nova.virt.hardware [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.660 2 DEBUG nova.virt.hardware [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.660 2 DEBUG nova.virt.hardware [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.661 2 DEBUG nova.virt.hardware [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.662 2 DEBUG nova.virt.hardware [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.662 2 DEBUG nova.virt.hardware [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.671 2 DEBUG nova.virt.libvirt.vif [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:46:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-0-4392686',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-0-4392686',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ge',id=182,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGsUKfvQsRFH/GldSVzED6JnM2R8DeZMSLqFM+7ZoEbCSUSgEpS2XwQTay0eRWx3t/E5S4rEWdCjCoc+0nrAH+n3s9z8s5WA+sL/sdupqrDO9IWm9qn8ROfjJ4EtbzYHtg==',key_name='tempest-TestSecurityGroupsBasicOps-880121214',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-s6u8yqen',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:46:19Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=2ade6d5e-55bd-4ef0-9c58-a67e941ceee3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "address": "fa:16:3e:e0:e6:6e", "network": {"id": "85e4aed1-4716-45af-bcd8-38b9aeff1c42", "bridge": "br-int", "label": "tempest-network-smoke--13042790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdf6286f-21", "ovs_interfaceid": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.672 2 DEBUG nova.network.os_vif_util [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "address": "fa:16:3e:e0:e6:6e", "network": {"id": "85e4aed1-4716-45af-bcd8-38b9aeff1c42", "bridge": "br-int", "label": "tempest-network-smoke--13042790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdf6286f-21", "ovs_interfaceid": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.673 2 DEBUG nova.network.os_vif_util [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:e6:6e,bridge_name='br-int',has_traffic_filtering=True,id=cdf6286f-21ac-44f2-91ce-bd52882502e6,network=Network(85e4aed1-4716-45af-bcd8-38b9aeff1c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdf6286f-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.674 2 DEBUG nova.objects.instance [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.696 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  <uuid>2ade6d5e-55bd-4ef0-9c58-a67e941ceee3</uuid>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  <name>instance-000000b6</name>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-0-4392686</nova:name>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:46:26</nova:creationTime>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:46:26 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:        <nova:user uuid="2d2b4a2da57543ef88e44ae28ad61647">tempest-TestSecurityGroupsBasicOps-1020134341-project-member</nova:user>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:        <nova:project uuid="575f3d227ab24f2daa62e65e14a4cd9c">tempest-TestSecurityGroupsBasicOps-1020134341</nova:project>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:        <nova:port uuid="cdf6286f-21ac-44f2-91ce-bd52882502e6">
Oct  2 08:46:26 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <entry name="serial">2ade6d5e-55bd-4ef0-9c58-a67e941ceee3</entry>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <entry name="uuid">2ade6d5e-55bd-4ef0-9c58-a67e941ceee3</entry>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk.config"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:e0:e6:6e"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <target dev="tapcdf6286f-21"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/console.log" append="off"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:46:26 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:46:26 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:46:26 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:46:26 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.698 2 DEBUG nova.compute.manager [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Preparing to wait for external event network-vif-plugged-cdf6286f-21ac-44f2-91ce-bd52882502e6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.699 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.699 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.700 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.701 2 DEBUG nova.virt.libvirt.vif [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:46:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-0-4392686',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-0-4392686',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ge',id=182,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGsUKfvQsRFH/GldSVzED6JnM2R8DeZMSLqFM+7ZoEbCSUSgEpS2XwQTay0eRWx3t/E5S4rEWdCjCoc+0nrAH+n3s9z8s5WA+sL/sdupqrDO9IWm9qn8ROfjJ4EtbzYHtg==',key_name='tempest-TestSecurityGroupsBasicOps-880121214',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-s6u8yqen',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:46:19Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=2ade6d5e-55bd-4ef0-9c58-a67e941ceee3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "address": "fa:16:3e:e0:e6:6e", "network": {"id": "85e4aed1-4716-45af-bcd8-38b9aeff1c42", "bridge": "br-int", "label": "tempest-network-smoke--13042790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdf6286f-21", "ovs_interfaceid": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.701 2 DEBUG nova.network.os_vif_util [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "address": "fa:16:3e:e0:e6:6e", "network": {"id": "85e4aed1-4716-45af-bcd8-38b9aeff1c42", "bridge": "br-int", "label": "tempest-network-smoke--13042790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdf6286f-21", "ovs_interfaceid": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.702 2 DEBUG nova.network.os_vif_util [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:e6:6e,bridge_name='br-int',has_traffic_filtering=True,id=cdf6286f-21ac-44f2-91ce-bd52882502e6,network=Network(85e4aed1-4716-45af-bcd8-38b9aeff1c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdf6286f-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.703 2 DEBUG os_vif [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:e6:6e,bridge_name='br-int',has_traffic_filtering=True,id=cdf6286f-21ac-44f2-91ce-bd52882502e6,network=Network(85e4aed1-4716-45af-bcd8-38b9aeff1c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdf6286f-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.705 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.705 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.709 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcdf6286f-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.710 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcdf6286f-21, col_values=(('external_ids', {'iface-id': 'cdf6286f-21ac-44f2-91ce-bd52882502e6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:e6:6e', 'vm-uuid': '2ade6d5e-55bd-4ef0-9c58-a67e941ceee3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:26 np0005466012 NetworkManager[51207]: <info>  [1759409186.7136] manager: (tapcdf6286f-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/343)
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.721 2 INFO os_vif [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:e6:6e,bridge_name='br-int',has_traffic_filtering=True,id=cdf6286f-21ac-44f2-91ce-bd52882502e6,network=Network(85e4aed1-4716-45af-bcd8-38b9aeff1c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdf6286f-21')#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.796 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.797 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.797 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No VIF found with MAC fa:16:3e:e0:e6:6e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:46:26 np0005466012 nova_compute[192063]: 2025-10-02 12:46:26.798 2 INFO nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Using config drive#033[00m
Oct  2 08:46:27 np0005466012 nova_compute[192063]: 2025-10-02 12:46:27.712 2 INFO nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Creating config drive at /var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk.config#033[00m
Oct  2 08:46:27 np0005466012 nova_compute[192063]: 2025-10-02 12:46:27.716 2 DEBUG oslo_concurrency.processutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx9txw65h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:46:27 np0005466012 nova_compute[192063]: 2025-10-02 12:46:27.855 2 DEBUG oslo_concurrency.processutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx9txw65h" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:46:27 np0005466012 kernel: tapcdf6286f-21: entered promiscuous mode
Oct  2 08:46:27 np0005466012 NetworkManager[51207]: <info>  [1759409187.9297] manager: (tapcdf6286f-21): new Tun device (/org/freedesktop/NetworkManager/Devices/344)
Oct  2 08:46:27 np0005466012 nova_compute[192063]: 2025-10-02 12:46:27.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:46:27Z|00736|binding|INFO|Claiming lport cdf6286f-21ac-44f2-91ce-bd52882502e6 for this chassis.
Oct  2 08:46:27 np0005466012 ovn_controller[94284]: 2025-10-02T12:46:27Z|00737|binding|INFO|cdf6286f-21ac-44f2-91ce-bd52882502e6: Claiming fa:16:3e:e0:e6:6e 10.100.0.5
Oct  2 08:46:27 np0005466012 nova_compute[192063]: 2025-10-02 12:46:27.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:27 np0005466012 nova_compute[192063]: 2025-10-02 12:46:27.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:27 np0005466012 NetworkManager[51207]: <info>  [1759409187.9404] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/345)
Oct  2 08:46:27 np0005466012 NetworkManager[51207]: <info>  [1759409187.9413] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Oct  2 08:46:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:27.948 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:e6:6e 10.100.0.5'], port_security=['fa:16:3e:e0:e6:6e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ade6d5e-55bd-4ef0-9c58-a67e941ceee3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85e4aed1-4716-45af-bcd8-38b9aeff1c42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e1be73b2-0596-4634-bbd8-c2bd6eae245c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=170a325d-4ee6-4f9e-99b6-aa2be81235b5, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=cdf6286f-21ac-44f2-91ce-bd52882502e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:46:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:27.949 103246 INFO neutron.agent.ovn.metadata.agent [-] Port cdf6286f-21ac-44f2-91ce-bd52882502e6 in datapath 85e4aed1-4716-45af-bcd8-38b9aeff1c42 bound to our chassis#033[00m
Oct  2 08:46:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:27.950 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 85e4aed1-4716-45af-bcd8-38b9aeff1c42#033[00m
Oct  2 08:46:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:27.968 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4c65ecbd-e2a9-4fdf-a53b-838e1fbd697e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:27.969 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap85e4aed1-41 in ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:46:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:27.971 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap85e4aed1-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:46:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:27.971 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a0c1fa49-241e-4260-9a01-278137687c09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:27.973 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[58855265-f595-43e8-a284-cd2b42c9c24b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:27 np0005466012 systemd-udevd[252224]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:46:27 np0005466012 systemd-machined[152114]: New machine qemu-81-instance-000000b6.
Oct  2 08:46:27 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:27.988 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[a16bd2ec-5137-42f0-a55f-e0f417e1cc24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:46:27 np0005466012 NetworkManager[51207]: <info>  [1759409187.9933] device (tapcdf6286f-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:46:27 np0005466012 NetworkManager[51207]: <info>  [1759409187.9944] device (tapcdf6286f-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.012 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[67ae15c1-5922-46fb-a2dc-eb7f50890ade]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:46:28 np0005466012 systemd[1]: Started Virtual Machine qemu-81-instance-000000b6.
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:46:28Z|00738|binding|INFO|Setting lport cdf6286f-21ac-44f2-91ce-bd52882502e6 ovn-installed in OVS
Oct  2 08:46:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:46:28Z|00739|binding|INFO|Setting lport cdf6286f-21ac-44f2-91ce-bd52882502e6 up in Southbound
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.049 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[5439e813-c1ed-4ad0-8111-76424f279e13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:46:28 np0005466012 NetworkManager[51207]: <info>  [1759409188.0566] manager: (tap85e4aed1-40): new Veth device (/org/freedesktop/NetworkManager/Devices/347)
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.057 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6505388d-01d3-4ec2-8c56-37799ef22690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.087 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[29330f94-1b22-4843-a575-e8fdab201336]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.090 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[76d97658-6546-49e9-9d21-41aaf042035a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:46:28 np0005466012 NetworkManager[51207]: <info>  [1759409188.1133] device (tap85e4aed1-40): carrier: link connected
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.117 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[054e25e7-ee56-4f05-b01e-c52643177831]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.134 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3b5599da-c37e-4267-a11e-5569c89f594e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85e4aed1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c6:bd:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 224], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718172, 'reachable_time': 31754, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252256, 'error': None, 'target': 'ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.151 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[93f97705-9d6a-445c-82ca-1eec7224309d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec6:bd06'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 718172, 'tstamp': 718172}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252257, 'error': None, 'target': 'ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.171 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e76e5995-2260-4356-b26d-6257437f1187]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85e4aed1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c6:bd:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 224], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718172, 'reachable_time': 31754, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252258, 'error': None, 'target': 'ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.199 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aca12f4f-21bd-4f4c-b2f4-48af7aeaf8d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.250 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c950f000-3495-4647-8ec4-600e176d0eb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.251 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85e4aed1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.251 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.252 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85e4aed1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:46:28 np0005466012 NetworkManager[51207]: <info>  [1759409188.2544] manager: (tap85e4aed1-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/348)
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:28 np0005466012 kernel: tap85e4aed1-40: entered promiscuous mode
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.259 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap85e4aed1-40, col_values=(('external_ids', {'iface-id': 'cc53a2f5-12aa-4c54-8e29-196f3f838f7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:28 np0005466012 ovn_controller[94284]: 2025-10-02T12:46:28Z|00740|binding|INFO|Releasing lport cc53a2f5-12aa-4c54-8e29-196f3f838f7d from this chassis (sb_readonly=0)
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.274 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/85e4aed1-4716-45af-bcd8-38b9aeff1c42.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/85e4aed1-4716-45af-bcd8-38b9aeff1c42.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.274 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7b070764-59c8-444c-ba52-1d01ae0e8767]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.275 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-85e4aed1-4716-45af-bcd8-38b9aeff1c42
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/85e4aed1-4716-45af-bcd8-38b9aeff1c42.pid.haproxy
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 85e4aed1-4716-45af-bcd8-38b9aeff1c42
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  2 08:46:28 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:28.276 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42', 'env', 'PROCESS_TAG=haproxy-85e4aed1-4716-45af-bcd8-38b9aeff1c42', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/85e4aed1-4716-45af-bcd8-38b9aeff1c42.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  2 08:46:28 np0005466012 podman[252290]: 2025-10-02 12:46:28.601151052 +0000 UTC m=+0.022514963 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:46:28 np0005466012 podman[252290]: 2025-10-02 12:46:28.701503741 +0000 UTC m=+0.122867642 container create f708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:46:28 np0005466012 systemd[1]: Started libpod-conmon-f708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1.scope.
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.804 2 DEBUG nova.compute.manager [req-cdfbb3de-eeba-4e05-b5e6-c176db2b3ff2 req-9d40d96f-fde0-49a5-8436-f114c76f9ea6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Received event network-vif-plugged-cdf6286f-21ac-44f2-91ce-bd52882502e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.805 2 DEBUG oslo_concurrency.lockutils [req-cdfbb3de-eeba-4e05-b5e6-c176db2b3ff2 req-9d40d96f-fde0-49a5-8436-f114c76f9ea6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.805 2 DEBUG oslo_concurrency.lockutils [req-cdfbb3de-eeba-4e05-b5e6-c176db2b3ff2 req-9d40d96f-fde0-49a5-8436-f114c76f9ea6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.806 2 DEBUG oslo_concurrency.lockutils [req-cdfbb3de-eeba-4e05-b5e6-c176db2b3ff2 req-9d40d96f-fde0-49a5-8436-f114c76f9ea6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.806 2 DEBUG nova.compute.manager [req-cdfbb3de-eeba-4e05-b5e6-c176db2b3ff2 req-9d40d96f-fde0-49a5-8436-f114c76f9ea6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Processing event network-vif-plugged-cdf6286f-21ac-44f2-91ce-bd52882502e6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:28 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:46:28 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab528e6dc5a1a3814c6d5cf1006ce90c3a4c81c097abda639f885a17a63b870f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:46:28 np0005466012 podman[252290]: 2025-10-02 12:46:28.841860719 +0000 UTC m=+0.263224620 container init f708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:46:28 np0005466012 podman[252290]: 2025-10-02 12:46:28.847602476 +0000 UTC m=+0.268966357 container start f708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:46:28 np0005466012 neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42[252307]: [NOTICE]   (252317) : New worker (252320) forked
Oct  2 08:46:28 np0005466012 neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42[252307]: [NOTICE]   (252317) : Loading success.
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.931 2 DEBUG nova.network.neutron [req-061f34b5-099c-4642-97a1-aa97fd7d106d req-aa0b2b61-0f97-4327-a466-f4ad390a884e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Updated VIF entry in instance network info cache for port cdf6286f-21ac-44f2-91ce-bd52882502e6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.932 2 DEBUG nova.network.neutron [req-061f34b5-099c-4642-97a1-aa97fd7d106d req-aa0b2b61-0f97-4327-a466-f4ad390a884e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Updating instance_info_cache with network_info: [{"id": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "address": "fa:16:3e:e0:e6:6e", "network": {"id": "85e4aed1-4716-45af-bcd8-38b9aeff1c42", "bridge": "br-int", "label": "tempest-network-smoke--13042790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdf6286f-21", "ovs_interfaceid": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:46:28 np0005466012 nova_compute[192063]: 2025-10-02 12:46:28.950 2 DEBUG oslo_concurrency.lockutils [req-061f34b5-099c-4642-97a1-aa97fd7d106d req-aa0b2b61-0f97-4327-a466-f4ad390a884e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-2ade6d5e-55bd-4ef0-9c58-a67e941ceee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.276 2 DEBUG nova.compute.manager [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.278 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409189.2759068, 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.278 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] VM Started (Lifecycle Event)
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.281 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.284 2 INFO nova.virt.libvirt.driver [-] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Instance spawned successfully.
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.285 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.329 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.330 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.330 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.331 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.331 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.332 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.332 2 DEBUG nova.virt.libvirt.driver [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.336 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.395 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.396 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409189.2761033, 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.396 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.416 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.420 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409189.2804425, 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.420 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.441 2 INFO nova.compute.manager [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Took 9.74 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.441 2 DEBUG nova.compute.manager [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.443 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.448 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:46:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:29.451 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.472 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.525 2 INFO nova.compute.manager [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Took 10.52 seconds to build instance.#033[00m
Oct  2 08:46:29 np0005466012 nova_compute[192063]: 2025-10-02 12:46:29.540 2 DEBUG oslo_concurrency.lockutils [None req-df2e7e29-ce6b-4b9a-a67c-b44166148c99 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:30 np0005466012 nova_compute[192063]: 2025-10-02 12:46:30.899 2 DEBUG nova.compute.manager [req-b79d1e21-4ac4-416f-bcab-86707c1b52e6 req-0dcda8a8-099a-45f2-9ba4-947710aed368 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Received event network-vif-plugged-cdf6286f-21ac-44f2-91ce-bd52882502e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:46:30 np0005466012 nova_compute[192063]: 2025-10-02 12:46:30.899 2 DEBUG oslo_concurrency.lockutils [req-b79d1e21-4ac4-416f-bcab-86707c1b52e6 req-0dcda8a8-099a-45f2-9ba4-947710aed368 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:30 np0005466012 nova_compute[192063]: 2025-10-02 12:46:30.900 2 DEBUG oslo_concurrency.lockutils [req-b79d1e21-4ac4-416f-bcab-86707c1b52e6 req-0dcda8a8-099a-45f2-9ba4-947710aed368 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:30 np0005466012 nova_compute[192063]: 2025-10-02 12:46:30.900 2 DEBUG oslo_concurrency.lockutils [req-b79d1e21-4ac4-416f-bcab-86707c1b52e6 req-0dcda8a8-099a-45f2-9ba4-947710aed368 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:30 np0005466012 nova_compute[192063]: 2025-10-02 12:46:30.900 2 DEBUG nova.compute.manager [req-b79d1e21-4ac4-416f-bcab-86707c1b52e6 req-0dcda8a8-099a-45f2-9ba4-947710aed368 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] No waiting events found dispatching network-vif-plugged-cdf6286f-21ac-44f2-91ce-bd52882502e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:46:30 np0005466012 nova_compute[192063]: 2025-10-02 12:46:30.900 2 WARNING nova.compute.manager [req-b79d1e21-4ac4-416f-bcab-86707c1b52e6 req-0dcda8a8-099a-45f2-9ba4-947710aed368 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Received unexpected event network-vif-plugged-cdf6286f-21ac-44f2-91ce-bd52882502e6 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:46:31 np0005466012 nova_compute[192063]: 2025-10-02 12:46:31.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:33 np0005466012 nova_compute[192063]: 2025-10-02 12:46:33.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:36 np0005466012 podman[252331]: 2025-10-02 12:46:36.141180224 +0000 UTC m=+0.056342203 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:46:36 np0005466012 podman[252330]: 2025-10-02 12:46:36.157363785 +0000 UTC m=+0.064307241 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:46:36 np0005466012 podman[252332]: 2025-10-02 12:46:36.168183109 +0000 UTC m=+0.080217153 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:46:36 np0005466012 podman[252333]: 2025-10-02 12:46:36.215792014 +0000 UTC m=+0.122871233 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:46:36 np0005466012 nova_compute[192063]: 2025-10-02 12:46:36.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:36 np0005466012 nova_compute[192063]: 2025-10-02 12:46:36.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:46:38 np0005466012 nova_compute[192063]: 2025-10-02 12:46:38.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:41 np0005466012 nova_compute[192063]: 2025-10-02 12:46:41.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:42 np0005466012 nova_compute[192063]: 2025-10-02 12:46:42.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:46:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:46:43Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e0:e6:6e 10.100.0.5
Oct  2 08:46:43 np0005466012 ovn_controller[94284]: 2025-10-02T12:46:43Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e0:e6:6e 10.100.0.5
Oct  2 08:46:43 np0005466012 nova_compute[192063]: 2025-10-02 12:46:43.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:44 np0005466012 nova_compute[192063]: 2025-10-02 12:46:44.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:46:46 np0005466012 nova_compute[192063]: 2025-10-02 12:46:46.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:48 np0005466012 nova_compute[192063]: 2025-10-02 12:46:48.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:49 np0005466012 podman[252428]: 2025-10-02 12:46:49.132160073 +0000 UTC m=+0.050176185 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Oct  2 08:46:49 np0005466012 podman[252427]: 2025-10-02 12:46:49.135821113 +0000 UTC m=+0.054572946 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:46:49 np0005466012 nova_compute[192063]: 2025-10-02 12:46:49.820 2 DEBUG oslo_concurrency.lockutils [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:46:49 np0005466012 nova_compute[192063]: 2025-10-02 12:46:49.820 2 DEBUG oslo_concurrency.lockutils [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:46:49 np0005466012 nova_compute[192063]: 2025-10-02 12:46:49.821 2 DEBUG oslo_concurrency.lockutils [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:46:49 np0005466012 nova_compute[192063]: 2025-10-02 12:46:49.821 2 DEBUG oslo_concurrency.lockutils [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:46:49 np0005466012 nova_compute[192063]: 2025-10-02 12:46:49.821 2 DEBUG oslo_concurrency.lockutils [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:46:49 np0005466012 nova_compute[192063]: 2025-10-02 12:46:49.835 2 INFO nova.compute.manager [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Terminating instance
Oct  2 08:46:49 np0005466012 nova_compute[192063]: 2025-10-02 12:46:49.845 2 DEBUG nova.compute.manager [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  2 08:46:49 np0005466012 kernel: tapcdf6286f-21 (unregistering): left promiscuous mode
Oct  2 08:46:49 np0005466012 NetworkManager[51207]: <info>  [1759409209.8692] device (tapcdf6286f-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:46:49 np0005466012 ovn_controller[94284]: 2025-10-02T12:46:49Z|00741|binding|INFO|Releasing lport cdf6286f-21ac-44f2-91ce-bd52882502e6 from this chassis (sb_readonly=0)
Oct  2 08:46:49 np0005466012 ovn_controller[94284]: 2025-10-02T12:46:49Z|00742|binding|INFO|Setting lport cdf6286f-21ac-44f2-91ce-bd52882502e6 down in Southbound
Oct  2 08:46:49 np0005466012 nova_compute[192063]: 2025-10-02 12:46:49.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:49 np0005466012 ovn_controller[94284]: 2025-10-02T12:46:49Z|00743|binding|INFO|Removing iface tapcdf6286f-21 ovn-installed in OVS
Oct  2 08:46:49 np0005466012 nova_compute[192063]: 2025-10-02 12:46:49.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:49 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:49.888 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:e6:6e 10.100.0.5'], port_security=['fa:16:3e:e0:e6:6e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ade6d5e-55bd-4ef0-9c58-a67e941ceee3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85e4aed1-4716-45af-bcd8-38b9aeff1c42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e1be73b2-0596-4634-bbd8-c2bd6eae245c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=170a325d-4ee6-4f9e-99b6-aa2be81235b5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=cdf6286f-21ac-44f2-91ce-bd52882502e6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  2 08:46:49 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:49.889 103246 INFO neutron.agent.ovn.metadata.agent [-] Port cdf6286f-21ac-44f2-91ce-bd52882502e6 in datapath 85e4aed1-4716-45af-bcd8-38b9aeff1c42 unbound from our chassis
Oct  2 08:46:49 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:49.890 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 85e4aed1-4716-45af-bcd8-38b9aeff1c42, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  2 08:46:49 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:49.892 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[20657605-39b4-4358-953c-e3bb5e8f3fae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  2 08:46:49 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:49.892 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42 namespace which is not needed anymore
Oct  2 08:46:49 np0005466012 nova_compute[192063]: 2025-10-02 12:46:49.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:46:49 np0005466012 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000b6.scope: Deactivated successfully.
Oct  2 08:46:49 np0005466012 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000b6.scope: Consumed 14.065s CPU time.
Oct  2 08:46:49 np0005466012 systemd-machined[152114]: Machine qemu-81-instance-000000b6 terminated.
Oct  2 08:46:50 np0005466012 neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42[252307]: [NOTICE]   (252317) : haproxy version is 2.8.14-c23fe91
Oct  2 08:46:50 np0005466012 neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42[252307]: [NOTICE]   (252317) : path to executable is /usr/sbin/haproxy
Oct  2 08:46:50 np0005466012 neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42[252307]: [WARNING]  (252317) : Exiting Master process...
Oct  2 08:46:50 np0005466012 neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42[252307]: [ALERT]    (252317) : Current worker (252320) exited with code 143 (Terminated)
Oct  2 08:46:50 np0005466012 neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42[252307]: [WARNING]  (252317) : All workers exited. Exiting... (0)
Oct  2 08:46:50 np0005466012 systemd[1]: libpod-f708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1.scope: Deactivated successfully.
Oct  2 08:46:50 np0005466012 conmon[252307]: conmon f708422ea67ac6e63c57 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1.scope/container/memory.events
Oct  2 08:46:50 np0005466012 podman[252490]: 2025-10-02 12:46:50.021344352 +0000 UTC m=+0.043566747 container died f708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:46:50 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1-userdata-shm.mount: Deactivated successfully.
Oct  2 08:46:50 np0005466012 systemd[1]: var-lib-containers-storage-overlay-ab528e6dc5a1a3814c6d5cf1006ce90c3a4c81c097abda639f885a17a63b870f-merged.mount: Deactivated successfully.
Oct  2 08:46:50 np0005466012 podman[252490]: 2025-10-02 12:46:50.078473125 +0000 UTC m=+0.100695520 container cleanup f708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct  2 08:46:50 np0005466012 systemd[1]: libpod-conmon-f708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1.scope: Deactivated successfully.
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.104 2 INFO nova.virt.libvirt.driver [-] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Instance destroyed successfully.#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.105 2 DEBUG nova.objects.instance [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'resources' on Instance uuid 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.129 2 DEBUG nova.virt.libvirt.vif [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:46:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-0-4392686',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-gen-0-4392686',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ge',id=182,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGsUKfvQsRFH/GldSVzED6JnM2R8DeZMSLqFM+7ZoEbCSUSgEpS2XwQTay0eRWx3t/E5S4rEWdCjCoc+0nrAH+n3s9z8s5WA+sL/sdupqrDO9IWm9qn8ROfjJ4EtbzYHtg==',key_name='tempest-TestSecurityGroupsBasicOps-880121214',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:46:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-s6u8yqen',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:46:29Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=2ade6d5e-55bd-4ef0-9c58-a67e941ceee3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "address": "fa:16:3e:e0:e6:6e", "network": {"id": "85e4aed1-4716-45af-bcd8-38b9aeff1c42", "bridge": "br-int", "label": "tempest-network-smoke--13042790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdf6286f-21", "ovs_interfaceid": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.129 2 DEBUG nova.network.os_vif_util [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "address": "fa:16:3e:e0:e6:6e", "network": {"id": "85e4aed1-4716-45af-bcd8-38b9aeff1c42", "bridge": "br-int", "label": "tempest-network-smoke--13042790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcdf6286f-21", "ovs_interfaceid": "cdf6286f-21ac-44f2-91ce-bd52882502e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.130 2 DEBUG nova.network.os_vif_util [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:e6:6e,bridge_name='br-int',has_traffic_filtering=True,id=cdf6286f-21ac-44f2-91ce-bd52882502e6,network=Network(85e4aed1-4716-45af-bcd8-38b9aeff1c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdf6286f-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.130 2 DEBUG os_vif [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:e6:6e,bridge_name='br-int',has_traffic_filtering=True,id=cdf6286f-21ac-44f2-91ce-bd52882502e6,network=Network(85e4aed1-4716-45af-bcd8-38b9aeff1c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdf6286f-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.132 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcdf6286f-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.137 2 INFO os_vif [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:e6:6e,bridge_name='br-int',has_traffic_filtering=True,id=cdf6286f-21ac-44f2-91ce-bd52882502e6,network=Network(85e4aed1-4716-45af-bcd8-38b9aeff1c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcdf6286f-21')#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.138 2 INFO nova.virt.libvirt.driver [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Deleting instance files /var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3_del#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.138 2 INFO nova.virt.libvirt.driver [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Deletion of /var/lib/nova/instances/2ade6d5e-55bd-4ef0-9c58-a67e941ceee3_del complete#033[00m
Oct  2 08:46:50 np0005466012 podman[252535]: 2025-10-02 12:46:50.148189292 +0000 UTC m=+0.049231880 container remove f708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct  2 08:46:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:50.153 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[53b2b9ec-ec57-4b8c-b1fb-9f0d67fa9b31]: (4, ('Thu Oct  2 12:46:49 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42 (f708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1)\nf708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1\nThu Oct  2 12:46:50 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42 (f708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1)\nf708422ea67ac6e63c576a9ca3690fc4fa33ec64aa9f13d1d4d01fb97f7f6ae1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:50.154 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[f861f145-9c5a-480e-9fe6-c70d4936a55a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:50.155 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85e4aed1-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:50 np0005466012 kernel: tap85e4aed1-40: left promiscuous mode
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.170 2 DEBUG nova.compute.manager [req-485512de-cd0f-4787-a67e-7258a189ea08 req-a30926b3-ff34-474d-bc7f-1cb270e7d89e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Received event network-vif-unplugged-cdf6286f-21ac-44f2-91ce-bd52882502e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.171 2 DEBUG oslo_concurrency.lockutils [req-485512de-cd0f-4787-a67e-7258a189ea08 req-a30926b3-ff34-474d-bc7f-1cb270e7d89e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:50.170 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[aea69160-6bad-49ad-8999-c92cccb8947c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.171 2 DEBUG oslo_concurrency.lockutils [req-485512de-cd0f-4787-a67e-7258a189ea08 req-a30926b3-ff34-474d-bc7f-1cb270e7d89e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.171 2 DEBUG oslo_concurrency.lockutils [req-485512de-cd0f-4787-a67e-7258a189ea08 req-a30926b3-ff34-474d-bc7f-1cb270e7d89e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.171 2 DEBUG nova.compute.manager [req-485512de-cd0f-4787-a67e-7258a189ea08 req-a30926b3-ff34-474d-bc7f-1cb270e7d89e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] No waiting events found dispatching network-vif-unplugged-cdf6286f-21ac-44f2-91ce-bd52882502e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.172 2 DEBUG nova.compute.manager [req-485512de-cd0f-4787-a67e-7258a189ea08 req-a30926b3-ff34-474d-bc7f-1cb270e7d89e 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Received event network-vif-unplugged-cdf6286f-21ac-44f2-91ce-bd52882502e6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:50.201 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fd286e3b-71a2-47a6-97c9-b87a830c0823]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:50.202 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c3971601-15e4-4ce1-87c5-b529a6bd7fc3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:50.216 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[187ff3d5-4541-4a16-8d64-2d6ac5afe839]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718165, 'reachable_time': 18635, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252553, 'error': None, 'target': 'ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:50.222 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-85e4aed1-4716-45af-bcd8-38b9aeff1c42 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:46:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:46:50.222 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[ce1a6e86-4eea-4432-b51e-b1d8f0e14216]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:46:50 np0005466012 systemd[1]: run-netns-ovnmeta\x2d85e4aed1\x2d4716\x2d45af\x2dbcd8\x2d38b9aeff1c42.mount: Deactivated successfully.
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.225 2 INFO nova.compute.manager [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.226 2 DEBUG oslo.service.loopingcall [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.226 2 DEBUG nova.compute.manager [-] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.226 2 DEBUG nova.network.neutron [-] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.863 2 DEBUG nova.network.neutron [-] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.890 2 INFO nova.compute.manager [-] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Took 0.66 seconds to deallocate network for instance.#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.955 2 DEBUG nova.compute.manager [req-71f3478e-50ac-43d8-80db-de7df3065ec7 req-5033df7d-cf01-4cd6-8dc2-853613d619b6 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Received event network-vif-deleted-cdf6286f-21ac-44f2-91ce-bd52882502e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.960 2 DEBUG oslo_concurrency.lockutils [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:50 np0005466012 nova_compute[192063]: 2025-10-02 12:46:50.960 2 DEBUG oslo_concurrency.lockutils [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:51 np0005466012 nova_compute[192063]: 2025-10-02 12:46:51.022 2 DEBUG nova.compute.provider_tree [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:46:51 np0005466012 nova_compute[192063]: 2025-10-02 12:46:51.034 2 DEBUG nova.scheduler.client.report [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:46:51 np0005466012 nova_compute[192063]: 2025-10-02 12:46:51.053 2 DEBUG oslo_concurrency.lockutils [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:51 np0005466012 nova_compute[192063]: 2025-10-02 12:46:51.138 2 INFO nova.scheduler.client.report [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Deleted allocations for instance 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3#033[00m
Oct  2 08:46:51 np0005466012 nova_compute[192063]: 2025-10-02 12:46:51.209 2 DEBUG oslo_concurrency.lockutils [None req-8575ad6f-bf42-4f9e-9374-0839eed54b2e 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.388s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:51 np0005466012 nova_compute[192063]: 2025-10-02 12:46:51.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:52 np0005466012 nova_compute[192063]: 2025-10-02 12:46:52.302 2 DEBUG nova.compute.manager [req-77526dcb-e88e-4dc4-94e4-3e4e845a2686 req-2098d3c2-d0fa-4bb7-87fb-d1840f5f1d5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Received event network-vif-plugged-cdf6286f-21ac-44f2-91ce-bd52882502e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:46:52 np0005466012 nova_compute[192063]: 2025-10-02 12:46:52.303 2 DEBUG oslo_concurrency.lockutils [req-77526dcb-e88e-4dc4-94e4-3e4e845a2686 req-2098d3c2-d0fa-4bb7-87fb-d1840f5f1d5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:52 np0005466012 nova_compute[192063]: 2025-10-02 12:46:52.303 2 DEBUG oslo_concurrency.lockutils [req-77526dcb-e88e-4dc4-94e4-3e4e845a2686 req-2098d3c2-d0fa-4bb7-87fb-d1840f5f1d5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:52 np0005466012 nova_compute[192063]: 2025-10-02 12:46:52.303 2 DEBUG oslo_concurrency.lockutils [req-77526dcb-e88e-4dc4-94e4-3e4e845a2686 req-2098d3c2-d0fa-4bb7-87fb-d1840f5f1d5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "2ade6d5e-55bd-4ef0-9c58-a67e941ceee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:52 np0005466012 nova_compute[192063]: 2025-10-02 12:46:52.303 2 DEBUG nova.compute.manager [req-77526dcb-e88e-4dc4-94e4-3e4e845a2686 req-2098d3c2-d0fa-4bb7-87fb-d1840f5f1d5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] No waiting events found dispatching network-vif-plugged-cdf6286f-21ac-44f2-91ce-bd52882502e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:46:52 np0005466012 nova_compute[192063]: 2025-10-02 12:46:52.304 2 WARNING nova.compute.manager [req-77526dcb-e88e-4dc4-94e4-3e4e845a2686 req-2098d3c2-d0fa-4bb7-87fb-d1840f5f1d5b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Received unexpected event network-vif-plugged-cdf6286f-21ac-44f2-91ce-bd52882502e6 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:46:52 np0005466012 nova_compute[192063]: 2025-10-02 12:46:52.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:53 np0005466012 podman[252555]: 2025-10-02 12:46:53.142711118 +0000 UTC m=+0.054421331 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:46:53 np0005466012 podman[252554]: 2025-10-02 12:46:53.142678267 +0000 UTC m=+0.058154343 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:46:53 np0005466012 nova_compute[192063]: 2025-10-02 12:46:53.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:55 np0005466012 nova_compute[192063]: 2025-10-02 12:46:55.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:55 np0005466012 nova_compute[192063]: 2025-10-02 12:46:55.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:55 np0005466012 nova_compute[192063]: 2025-10-02 12:46:55.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:55 np0005466012 nova_compute[192063]: 2025-10-02 12:46:55.862 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:55 np0005466012 nova_compute[192063]: 2025-10-02 12:46:55.863 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:55 np0005466012 nova_compute[192063]: 2025-10-02 12:46:55.863 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:55 np0005466012 nova_compute[192063]: 2025-10-02 12:46:55.863 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.023 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.026 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5707MB free_disk=73.24492263793945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.026 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.026 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.086 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.086 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.101 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.120 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.120 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.135 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.160 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.181 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.195 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.217 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:46:56 np0005466012 nova_compute[192063]: 2025-10-02 12:46:56.217 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:46:57 np0005466012 nova_compute[192063]: 2025-10-02 12:46:57.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:57 np0005466012 nova_compute[192063]: 2025-10-02 12:46:57.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:46:58 np0005466012 nova_compute[192063]: 2025-10-02 12:46:58.219 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:46:58 np0005466012 nova_compute[192063]: 2025-10-02 12:46:58.219 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:46:58 np0005466012 nova_compute[192063]: 2025-10-02 12:46:58.219 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:46:58 np0005466012 nova_compute[192063]: 2025-10-02 12:46:58.235 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:46:58 np0005466012 nova_compute[192063]: 2025-10-02 12:46:58.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:00 np0005466012 nova_compute[192063]: 2025-10-02 12:47:00.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:01 np0005466012 nova_compute[192063]: 2025-10-02 12:47:01.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:01 np0005466012 nova_compute[192063]: 2025-10-02 12:47:01.821 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:47:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:02.162 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:02.163 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:02.163 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:02 np0005466012 nova_compute[192063]: 2025-10-02 12:47:02.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:03 np0005466012 nova_compute[192063]: 2025-10-02 12:47:03.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:05 np0005466012 nova_compute[192063]: 2025-10-02 12:47:05.103 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409210.101495, 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:47:05 np0005466012 nova_compute[192063]: 2025-10-02 12:47:05.103 2 INFO nova.compute.manager [-] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:47:05 np0005466012 nova_compute[192063]: 2025-10-02 12:47:05.125 2 DEBUG nova.compute.manager [None req-bca41505-3f8a-4e84-812f-2bd1ce5e4125 - - - - - -] [instance: 2ade6d5e-55bd-4ef0-9c58-a67e941ceee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:47:05 np0005466012 nova_compute[192063]: 2025-10-02 12:47:05.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:07 np0005466012 podman[252600]: 2025-10-02 12:47:07.143997379 +0000 UTC m=+0.049513808 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:47:07 np0005466012 podman[252598]: 2025-10-02 12:47:07.149525779 +0000 UTC m=+0.067916938 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:47:07 np0005466012 podman[252599]: 2025-10-02 12:47:07.156497889 +0000 UTC m=+0.074233550 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  2 08:47:07 np0005466012 podman[252601]: 2025-10-02 12:47:07.175549837 +0000 UTC m=+0.081925729 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  2 08:47:08 np0005466012 nova_compute[192063]: 2025-10-02 12:47:08.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:10 np0005466012 nova_compute[192063]: 2025-10-02 12:47:10.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:13 np0005466012 nova_compute[192063]: 2025-10-02 12:47:13.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:15 np0005466012 nova_compute[192063]: 2025-10-02 12:47:15.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:47:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:47:17 np0005466012 nova_compute[192063]: 2025-10-02 12:47:17.979 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:17 np0005466012 nova_compute[192063]: 2025-10-02 12:47:17.980 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.025 2 DEBUG nova.compute.manager [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.188 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.188 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.198 2 DEBUG nova.virt.hardware [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.199 2 INFO nova.compute.claims [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.488 2 DEBUG nova.compute.provider_tree [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.524 2 DEBUG nova.scheduler.client.report [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.723 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.723 2 DEBUG nova.compute.manager [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.846 2 DEBUG nova.compute.manager [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.846 2 DEBUG nova.network.neutron [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.874 2 INFO nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:47:18 np0005466012 nova_compute[192063]: 2025-10-02 12:47:18.907 2 DEBUG nova.compute.manager [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.434 2 DEBUG nova.compute.manager [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.435 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.435 2 INFO nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Creating image(s)#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.436 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "/var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.436 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "/var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.437 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "/var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.449 2 DEBUG oslo_concurrency.processutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.508 2 DEBUG oslo_concurrency.processutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.509 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.509 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.520 2 DEBUG oslo_concurrency.processutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.577 2 DEBUG oslo_concurrency.processutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.578 2 DEBUG oslo_concurrency.processutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.612 2 DEBUG oslo_concurrency.processutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.613 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.613 2 DEBUG oslo_concurrency.processutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.677 2 DEBUG nova.policy [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2d2b4a2da57543ef88e44ae28ad61647', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.680 2 DEBUG oslo_concurrency.processutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.680 2 DEBUG nova.virt.disk.api [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Checking if we can resize image /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.681 2 DEBUG oslo_concurrency.processutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.737 2 DEBUG oslo_concurrency.processutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.738 2 DEBUG nova.virt.disk.api [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Cannot resize image /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.738 2 DEBUG nova.objects.instance [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'migration_context' on Instance uuid e6d26f23-1d78-4f7b-adde-1387f16aaeed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.760 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.760 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Ensure instance console log exists: /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.761 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.761 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:19 np0005466012 nova_compute[192063]: 2025-10-02 12:47:19.761 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:20 np0005466012 podman[252694]: 2025-10-02 12:47:20.144594489 +0000 UTC m=+0.060107807 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Oct  2 08:47:20 np0005466012 nova_compute[192063]: 2025-10-02 12:47:20.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:20 np0005466012 podman[252693]: 2025-10-02 12:47:20.175464798 +0000 UTC m=+0.094382168 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  2 08:47:20 np0005466012 nova_compute[192063]: 2025-10-02 12:47:20.433 2 DEBUG nova.network.neutron [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Successfully created port: 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:47:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:21.743 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:47:21 np0005466012 nova_compute[192063]: 2025-10-02 12:47:21.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:21 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:21.744 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:47:21 np0005466012 nova_compute[192063]: 2025-10-02 12:47:21.783 2 DEBUG nova.network.neutron [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Successfully updated port: 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:47:21 np0005466012 nova_compute[192063]: 2025-10-02 12:47:21.801 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "refresh_cache-e6d26f23-1d78-4f7b-adde-1387f16aaeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:47:21 np0005466012 nova_compute[192063]: 2025-10-02 12:47:21.801 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquired lock "refresh_cache-e6d26f23-1d78-4f7b-adde-1387f16aaeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:47:21 np0005466012 nova_compute[192063]: 2025-10-02 12:47:21.801 2 DEBUG nova.network.neutron [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:47:21 np0005466012 nova_compute[192063]: 2025-10-02 12:47:21.888 2 DEBUG nova.compute.manager [req-9d2a4464-2d39-4f76-ac6c-7b13f40b62c0 req-ee1b7a47-ab15-46f7-ba8b-67e259775ad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Received event network-changed-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:47:21 np0005466012 nova_compute[192063]: 2025-10-02 12:47:21.889 2 DEBUG nova.compute.manager [req-9d2a4464-2d39-4f76-ac6c-7b13f40b62c0 req-ee1b7a47-ab15-46f7-ba8b-67e259775ad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Refreshing instance network info cache due to event network-changed-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:47:21 np0005466012 nova_compute[192063]: 2025-10-02 12:47:21.889 2 DEBUG oslo_concurrency.lockutils [req-9d2a4464-2d39-4f76-ac6c-7b13f40b62c0 req-ee1b7a47-ab15-46f7-ba8b-67e259775ad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-e6d26f23-1d78-4f7b-adde-1387f16aaeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:47:21 np0005466012 nova_compute[192063]: 2025-10-02 12:47:21.926 2 DEBUG nova.network.neutron [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:47:23 np0005466012 nova_compute[192063]: 2025-10-02 12:47:23.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:24 np0005466012 podman[252733]: 2025-10-02 12:47:24.139666711 +0000 UTC m=+0.050931906 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:47:24 np0005466012 podman[252732]: 2025-10-02 12:47:24.146534168 +0000 UTC m=+0.058576085 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid)
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.616 2 DEBUG nova.network.neutron [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Updating instance_info_cache with network_info: [{"id": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "address": "fa:16:3e:bf:8c:f1", "network": {"id": "4198d545-60c1-41ba-a17b-a91177c26fff", "bridge": "br-int", "label": "tempest-network-smoke--1086452248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85ae1b5d-fe", "ovs_interfaceid": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.732 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Releasing lock "refresh_cache-e6d26f23-1d78-4f7b-adde-1387f16aaeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.732 2 DEBUG nova.compute.manager [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Instance network_info: |[{"id": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "address": "fa:16:3e:bf:8c:f1", "network": {"id": "4198d545-60c1-41ba-a17b-a91177c26fff", "bridge": "br-int", "label": "tempest-network-smoke--1086452248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85ae1b5d-fe", "ovs_interfaceid": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.733 2 DEBUG oslo_concurrency.lockutils [req-9d2a4464-2d39-4f76-ac6c-7b13f40b62c0 req-ee1b7a47-ab15-46f7-ba8b-67e259775ad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-e6d26f23-1d78-4f7b-adde-1387f16aaeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.733 2 DEBUG nova.network.neutron [req-9d2a4464-2d39-4f76-ac6c-7b13f40b62c0 req-ee1b7a47-ab15-46f7-ba8b-67e259775ad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Refreshing network info cache for port 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.735 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Start _get_guest_xml network_info=[{"id": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "address": "fa:16:3e:bf:8c:f1", "network": {"id": "4198d545-60c1-41ba-a17b-a91177c26fff", "bridge": "br-int", "label": "tempest-network-smoke--1086452248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85ae1b5d-fe", "ovs_interfaceid": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.741 2 WARNING nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.746 2 DEBUG nova.virt.libvirt.host [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.747 2 DEBUG nova.virt.libvirt.host [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.755 2 DEBUG nova.virt.libvirt.host [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.755 2 DEBUG nova.virt.libvirt.host [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.757 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.757 2 DEBUG nova.virt.hardware [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.757 2 DEBUG nova.virt.hardware [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.757 2 DEBUG nova.virt.hardware [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.758 2 DEBUG nova.virt.hardware [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.758 2 DEBUG nova.virt.hardware [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.758 2 DEBUG nova.virt.hardware [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.758 2 DEBUG nova.virt.hardware [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.759 2 DEBUG nova.virt.hardware [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.759 2 DEBUG nova.virt.hardware [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.759 2 DEBUG nova.virt.hardware [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.759 2 DEBUG nova.virt.hardware [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.763 2 DEBUG nova.virt.libvirt.vif [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:47:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-294961228',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-294961228',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ac',id=183,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMmIHzI7f5NkeR5wWa61rMo9tbPcN91tFpOGed+RslBJVNKID2goKHiH+DxnBZjgvWNRTusEL4R4MJdMOsVO3CDBF8QprS49jKe89ioFjf2s7IRZnrqEDh4O0tau1eVScQ==',key_name='tempest-TestSecurityGroupsBasicOps-1950075361',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-i96lq4ly',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:47:19Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=e6d26f23-1d78-4f7b-adde-1387f16aaeed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "address": "fa:16:3e:bf:8c:f1", "network": {"id": "4198d545-60c1-41ba-a17b-a91177c26fff", "bridge": "br-int", "label": "tempest-network-smoke--1086452248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85ae1b5d-fe", "ovs_interfaceid": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.764 2 DEBUG nova.network.os_vif_util [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "address": "fa:16:3e:bf:8c:f1", "network": {"id": "4198d545-60c1-41ba-a17b-a91177c26fff", "bridge": "br-int", "label": "tempest-network-smoke--1086452248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85ae1b5d-fe", "ovs_interfaceid": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.764 2 DEBUG nova.network.os_vif_util [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:8c:f1,bridge_name='br-int',has_traffic_filtering=True,id=85ae1b5d-fe3a-443e-ade0-fdd2150d9e40,network=Network(4198d545-60c1-41ba-a17b-a91177c26fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85ae1b5d-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.765 2 DEBUG nova.objects.instance [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'pci_devices' on Instance uuid e6d26f23-1d78-4f7b-adde-1387f16aaeed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.780 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  <uuid>e6d26f23-1d78-4f7b-adde-1387f16aaeed</uuid>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  <name>instance-000000b7</name>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-294961228</nova:name>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:47:24</nova:creationTime>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:47:24 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:        <nova:user uuid="2d2b4a2da57543ef88e44ae28ad61647">tempest-TestSecurityGroupsBasicOps-1020134341-project-member</nova:user>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:        <nova:project uuid="575f3d227ab24f2daa62e65e14a4cd9c">tempest-TestSecurityGroupsBasicOps-1020134341</nova:project>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:        <nova:port uuid="85ae1b5d-fe3a-443e-ade0-fdd2150d9e40">
Oct  2 08:47:24 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <entry name="serial">e6d26f23-1d78-4f7b-adde-1387f16aaeed</entry>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <entry name="uuid">e6d26f23-1d78-4f7b-adde-1387f16aaeed</entry>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk.config"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:bf:8c:f1"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <target dev="tap85ae1b5d-fe"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/console.log" append="off"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:47:24 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:47:24 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:47:24 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:47:24 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.781 2 DEBUG nova.compute.manager [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Preparing to wait for external event network-vif-plugged-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.782 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.782 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.782 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.783 2 DEBUG nova.virt.libvirt.vif [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T12:47:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-294961228',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-294961228',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ac',id=183,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMmIHzI7f5NkeR5wWa61rMo9tbPcN91tFpOGed+RslBJVNKID2goKHiH+DxnBZjgvWNRTusEL4R4MJdMOsVO3CDBF8QprS49jKe89ioFjf2s7IRZnrqEDh4O0tau1eVScQ==',key_name='tempest-TestSecurityGroupsBasicOps-1950075361',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-i96lq4ly',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:47:19Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=e6d26f23-1d78-4f7b-adde-1387f16aaeed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "address": "fa:16:3e:bf:8c:f1", "network": {"id": "4198d545-60c1-41ba-a17b-a91177c26fff", "bridge": "br-int", "label": "tempest-network-smoke--1086452248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85ae1b5d-fe", "ovs_interfaceid": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.783 2 DEBUG nova.network.os_vif_util [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "address": "fa:16:3e:bf:8c:f1", "network": {"id": "4198d545-60c1-41ba-a17b-a91177c26fff", "bridge": "br-int", "label": "tempest-network-smoke--1086452248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85ae1b5d-fe", "ovs_interfaceid": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.784 2 DEBUG nova.network.os_vif_util [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:8c:f1,bridge_name='br-int',has_traffic_filtering=True,id=85ae1b5d-fe3a-443e-ade0-fdd2150d9e40,network=Network(4198d545-60c1-41ba-a17b-a91177c26fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85ae1b5d-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.784 2 DEBUG os_vif [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:8c:f1,bridge_name='br-int',has_traffic_filtering=True,id=85ae1b5d-fe3a-443e-ade0-fdd2150d9e40,network=Network(4198d545-60c1-41ba-a17b-a91177c26fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85ae1b5d-fe') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.785 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.786 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.789 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85ae1b5d-fe, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.789 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap85ae1b5d-fe, col_values=(('external_ids', {'iface-id': '85ae1b5d-fe3a-443e-ade0-fdd2150d9e40', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bf:8c:f1', 'vm-uuid': 'e6d26f23-1d78-4f7b-adde-1387f16aaeed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:24 np0005466012 NetworkManager[51207]: <info>  [1759409244.7927] manager: (tap85ae1b5d-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/349)
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.798 2 INFO os_vif [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:8c:f1,bridge_name='br-int',has_traffic_filtering=True,id=85ae1b5d-fe3a-443e-ade0-fdd2150d9e40,network=Network(4198d545-60c1-41ba-a17b-a91177c26fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85ae1b5d-fe')#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.849 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.850 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.850 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] No VIF found with MAC fa:16:3e:bf:8c:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:47:24 np0005466012 nova_compute[192063]: 2025-10-02 12:47:24.851 2 INFO nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Using config drive#033[00m
Oct  2 08:47:25 np0005466012 nova_compute[192063]: 2025-10-02 12:47:25.736 2 INFO nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Creating config drive at /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk.config#033[00m
Oct  2 08:47:25 np0005466012 nova_compute[192063]: 2025-10-02 12:47:25.741 2 DEBUG oslo_concurrency.processutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp24ad1t6u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:47:25 np0005466012 nova_compute[192063]: 2025-10-02 12:47:25.865 2 DEBUG oslo_concurrency.processutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp24ad1t6u" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:25 np0005466012 kernel: tap85ae1b5d-fe: entered promiscuous mode
Oct  2 08:47:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:47:25Z|00744|binding|INFO|Claiming lport 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 for this chassis.
Oct  2 08:47:25 np0005466012 nova_compute[192063]: 2025-10-02 12:47:25.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:47:25Z|00745|binding|INFO|85ae1b5d-fe3a-443e-ade0-fdd2150d9e40: Claiming fa:16:3e:bf:8c:f1 10.100.0.5
Oct  2 08:47:25 np0005466012 NetworkManager[51207]: <info>  [1759409245.9236] manager: (tap85ae1b5d-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/350)
Oct  2 08:47:25 np0005466012 nova_compute[192063]: 2025-10-02 12:47:25.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:25 np0005466012 nova_compute[192063]: 2025-10-02 12:47:25.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:25 np0005466012 systemd-udevd[252795]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:47:25 np0005466012 systemd-machined[152114]: New machine qemu-82-instance-000000b7.
Oct  2 08:47:25 np0005466012 NetworkManager[51207]: <info>  [1759409245.9650] device (tap85ae1b5d-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:47:25 np0005466012 NetworkManager[51207]: <info>  [1759409245.9662] device (tap85ae1b5d-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:47:25 np0005466012 nova_compute[192063]: 2025-10-02 12:47:25.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:25 np0005466012 ovn_controller[94284]: 2025-10-02T12:47:25Z|00746|binding|INFO|Setting lport 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 ovn-installed in OVS
Oct  2 08:47:25 np0005466012 nova_compute[192063]: 2025-10-02 12:47:25.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:25 np0005466012 systemd[1]: Started Virtual Machine qemu-82-instance-000000b7.
Oct  2 08:47:26 np0005466012 ovn_controller[94284]: 2025-10-02T12:47:26Z|00747|binding|INFO|Setting lport 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 up in Southbound
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.069 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:8c:f1 10.100.0.5'], port_security=['fa:16:3e:bf:8c:f1 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e6d26f23-1d78-4f7b-adde-1387f16aaeed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4198d545-60c1-41ba-a17b-a91177c26fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5f95d8b7-9744-4cdc-a87e-5a8413c44948 ad6e0ec7-e922-4419-8f49-680ad830e0b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=21e7452e-fef8-4fbe-94d5-963a56929892, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=85ae1b5d-fe3a-443e-ade0-fdd2150d9e40) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.070 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 in datapath 4198d545-60c1-41ba-a17b-a91177c26fff bound to our chassis#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.071 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4198d545-60c1-41ba-a17b-a91177c26fff#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.082 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[38cbf00f-b0cb-4187-8f0e-161582144e67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.083 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4198d545-61 in ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.085 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4198d545-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.085 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[3e850c72-4957-4409-a2f4-bebc5ae5fa86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.086 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[eb7dcb6e-1f52-4552-8e0b-ff5bf95803a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.098 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[0e519c30-2ff4-4a98-a116-dbec6ebe46b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.114 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[031ead86-1bc2-4228-a27a-0550e3a5a648]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.143 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[91834e39-0ddf-486c-95d6-e976bf1ab571]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 NetworkManager[51207]: <info>  [1759409246.1513] manager: (tap4198d545-60): new Veth device (/org/freedesktop/NetworkManager/Devices/351)
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.152 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7897004a-c03e-4e9a-ace1-cff8c6d30e2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.186 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e17e8c-1fe1-4685-b960-1ab9005e313b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.190 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[b77425c9-b14f-4a5b-ad6a-7b29eb0d5f7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 NetworkManager[51207]: <info>  [1759409246.2094] device (tap4198d545-60): carrier: link connected
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.213 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[3c75e3cc-2493-464c-a34f-aabee311c830]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.231 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9f4360e5-c9b7-4cdf-a113-59d1e378f50d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4198d545-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:ac:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 723982, 'reachable_time': 42462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252829, 'error': None, 'target': 'ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.244 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ec4ec5cb-74a3-439a-a80a-99afaf5afad8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:ac99'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 723982, 'tstamp': 723982}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252830, 'error': None, 'target': 'ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.364 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[81cc6216-772b-450e-844a-eb682631e67b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4198d545-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:ac:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 723982, 'reachable_time': 42462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252831, 'error': None, 'target': 'ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.397 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[13e7d530-da0d-4704-a7d9-1d50a3421c7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.452 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6af08730-14f9-460d-b3bb-72ad93585a8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.453 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4198d545-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.453 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.454 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4198d545-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:26 np0005466012 nova_compute[192063]: 2025-10-02 12:47:26.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:26 np0005466012 NetworkManager[51207]: <info>  [1759409246.4564] manager: (tap4198d545-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/352)
Oct  2 08:47:26 np0005466012 kernel: tap4198d545-60: entered promiscuous mode
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.459 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4198d545-60, col_values=(('external_ids', {'iface-id': 'ffc1b5db-a867-4861-8586-8c95deb82f3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:26 np0005466012 ovn_controller[94284]: 2025-10-02T12:47:26Z|00748|binding|INFO|Releasing lport ffc1b5db-a867-4861-8586-8c95deb82f3b from this chassis (sb_readonly=0)
Oct  2 08:47:26 np0005466012 nova_compute[192063]: 2025-10-02 12:47:26.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:26 np0005466012 nova_compute[192063]: 2025-10-02 12:47:26.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.474 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4198d545-60c1-41ba-a17b-a91177c26fff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4198d545-60c1-41ba-a17b-a91177c26fff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.475 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8c7223c6-1894-4a0c-b1f4-cb47584a9ef7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.476 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-4198d545-60c1-41ba-a17b-a91177c26fff
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/4198d545-60c1-41ba-a17b-a91177c26fff.pid.haproxy
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 4198d545-60c1-41ba-a17b-a91177c26fff
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:47:26 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:26.477 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff', 'env', 'PROCESS_TAG=haproxy-4198d545-60c1-41ba-a17b-a91177c26fff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4198d545-60c1-41ba-a17b-a91177c26fff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:47:26 np0005466012 nova_compute[192063]: 2025-10-02 12:47:26.861 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409246.8610518, e6d26f23-1d78-4f7b-adde-1387f16aaeed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:47:26 np0005466012 nova_compute[192063]: 2025-10-02 12:47:26.862 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] VM Started (Lifecycle Event)#033[00m
Oct  2 08:47:26 np0005466012 podman[252870]: 2025-10-02 12:47:26.801822767 +0000 UTC m=+0.022472032 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:47:27 np0005466012 podman[252870]: 2025-10-02 12:47:27.143209144 +0000 UTC m=+0.363858409 container create 847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:47:27 np0005466012 systemd[1]: Started libpod-conmon-847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28.scope.
Oct  2 08:47:27 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:47:27 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f3b1a05e75c80995f6976703c4b800bd1201718ef70efc43311eecf66b10fc9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:47:27 np0005466012 podman[252870]: 2025-10-02 12:47:27.462043358 +0000 UTC m=+0.682692643 container init 847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:47:27 np0005466012 podman[252870]: 2025-10-02 12:47:27.467863796 +0000 UTC m=+0.688513101 container start 847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:47:27 np0005466012 neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff[252885]: [NOTICE]   (252889) : New worker (252891) forked
Oct  2 08:47:27 np0005466012 neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff[252885]: [NOTICE]   (252889) : Loading success.
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.916 2 DEBUG nova.compute.manager [req-7b7eef77-b91b-4820-a5de-6cbda9805346 req-11ee5416-9fbe-4a5f-b692-0492e3f5878a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Received event network-vif-plugged-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.917 2 DEBUG oslo_concurrency.lockutils [req-7b7eef77-b91b-4820-a5de-6cbda9805346 req-11ee5416-9fbe-4a5f-b692-0492e3f5878a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.917 2 DEBUG oslo_concurrency.lockutils [req-7b7eef77-b91b-4820-a5de-6cbda9805346 req-11ee5416-9fbe-4a5f-b692-0492e3f5878a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.917 2 DEBUG oslo_concurrency.lockutils [req-7b7eef77-b91b-4820-a5de-6cbda9805346 req-11ee5416-9fbe-4a5f-b692-0492e3f5878a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.918 2 DEBUG nova.compute.manager [req-7b7eef77-b91b-4820-a5de-6cbda9805346 req-11ee5416-9fbe-4a5f-b692-0492e3f5878a 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Processing event network-vif-plugged-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.918 2 DEBUG nova.compute.manager [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.922 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.926 2 INFO nova.virt.libvirt.driver [-] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Instance spawned successfully.#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.926 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.954 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.954 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.955 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.955 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.956 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.956 2 DEBUG nova.virt.libvirt.driver [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.967 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.970 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.999 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.999 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409246.8614426, e6d26f23-1d78-4f7b-adde-1387f16aaeed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:47:27 np0005466012 nova_compute[192063]: 2025-10-02 12:47:27.999 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:47:28 np0005466012 nova_compute[192063]: 2025-10-02 12:47:28.075 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:47:28 np0005466012 nova_compute[192063]: 2025-10-02 12:47:28.079 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409247.9221427, e6d26f23-1d78-4f7b-adde-1387f16aaeed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:47:28 np0005466012 nova_compute[192063]: 2025-10-02 12:47:28.079 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:47:28 np0005466012 nova_compute[192063]: 2025-10-02 12:47:28.226 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:47:28 np0005466012 nova_compute[192063]: 2025-10-02 12:47:28.231 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:47:28 np0005466012 nova_compute[192063]: 2025-10-02 12:47:28.288 2 INFO nova.compute.manager [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Took 8.85 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:47:28 np0005466012 nova_compute[192063]: 2025-10-02 12:47:28.288 2 DEBUG nova.compute.manager [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:47:28 np0005466012 nova_compute[192063]: 2025-10-02 12:47:28.302 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:47:28 np0005466012 nova_compute[192063]: 2025-10-02 12:47:28.799 2 DEBUG nova.network.neutron [req-9d2a4464-2d39-4f76-ac6c-7b13f40b62c0 req-ee1b7a47-ab15-46f7-ba8b-67e259775ad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Updated VIF entry in instance network info cache for port 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:47:28 np0005466012 nova_compute[192063]: 2025-10-02 12:47:28.800 2 DEBUG nova.network.neutron [req-9d2a4464-2d39-4f76-ac6c-7b13f40b62c0 req-ee1b7a47-ab15-46f7-ba8b-67e259775ad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Updating instance_info_cache with network_info: [{"id": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "address": "fa:16:3e:bf:8c:f1", "network": {"id": "4198d545-60c1-41ba-a17b-a91177c26fff", "bridge": "br-int", "label": "tempest-network-smoke--1086452248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85ae1b5d-fe", "ovs_interfaceid": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:47:28 np0005466012 nova_compute[192063]: 2025-10-02 12:47:28.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:28 np0005466012 nova_compute[192063]: 2025-10-02 12:47:28.868 2 INFO nova.compute.manager [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Took 10.74 seconds to build instance.
Oct  2 08:47:28 np0005466012 nova_compute[192063]: 2025-10-02 12:47:28.881 2 DEBUG oslo_concurrency.lockutils [req-9d2a4464-2d39-4f76-ac6c-7b13f40b62c0 req-ee1b7a47-ab15-46f7-ba8b-67e259775ad1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-e6d26f23-1d78-4f7b-adde-1387f16aaeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:47:29 np0005466012 nova_compute[192063]: 2025-10-02 12:47:29.159 2 DEBUG oslo_concurrency.lockutils [None req-c2d4aedf-9ce3-484d-8796-99f91961b005 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:47:29 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:29.746 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  2 08:47:29 np0005466012 nova_compute[192063]: 2025-10-02 12:47:29.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:30 np0005466012 nova_compute[192063]: 2025-10-02 12:47:30.131 2 DEBUG nova.compute.manager [req-12bc72af-597c-4d44-9a3e-ab043b69585c req-eaa057b4-f41f-442e-843e-d31450c3e3d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Received event network-vif-plugged-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:47:30 np0005466012 nova_compute[192063]: 2025-10-02 12:47:30.131 2 DEBUG oslo_concurrency.lockutils [req-12bc72af-597c-4d44-9a3e-ab043b69585c req-eaa057b4-f41f-442e-843e-d31450c3e3d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:47:30 np0005466012 nova_compute[192063]: 2025-10-02 12:47:30.131 2 DEBUG oslo_concurrency.lockutils [req-12bc72af-597c-4d44-9a3e-ab043b69585c req-eaa057b4-f41f-442e-843e-d31450c3e3d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:47:30 np0005466012 nova_compute[192063]: 2025-10-02 12:47:30.132 2 DEBUG oslo_concurrency.lockutils [req-12bc72af-597c-4d44-9a3e-ab043b69585c req-eaa057b4-f41f-442e-843e-d31450c3e3d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:47:30 np0005466012 nova_compute[192063]: 2025-10-02 12:47:30.132 2 DEBUG nova.compute.manager [req-12bc72af-597c-4d44-9a3e-ab043b69585c req-eaa057b4-f41f-442e-843e-d31450c3e3d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] No waiting events found dispatching network-vif-plugged-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  2 08:47:30 np0005466012 nova_compute[192063]: 2025-10-02 12:47:30.133 2 WARNING nova.compute.manager [req-12bc72af-597c-4d44-9a3e-ab043b69585c req-eaa057b4-f41f-442e-843e-d31450c3e3d1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Received unexpected event network-vif-plugged-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 for instance with vm_state active and task_state None.
Oct  2 08:47:33 np0005466012 nova_compute[192063]: 2025-10-02 12:47:33.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:34 np0005466012 nova_compute[192063]: 2025-10-02 12:47:34.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:34 np0005466012 NetworkManager[51207]: <info>  [1759409254.0593] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Oct  2 08:47:34 np0005466012 NetworkManager[51207]: <info>  [1759409254.0602] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Oct  2 08:47:34 np0005466012 ovn_controller[94284]: 2025-10-02T12:47:34Z|00749|binding|INFO|Releasing lport ffc1b5db-a867-4861-8586-8c95deb82f3b from this chassis (sb_readonly=0)
Oct  2 08:47:34 np0005466012 ovn_controller[94284]: 2025-10-02T12:47:34Z|00750|binding|INFO|Releasing lport ffc1b5db-a867-4861-8586-8c95deb82f3b from this chassis (sb_readonly=0)
Oct  2 08:47:34 np0005466012 nova_compute[192063]: 2025-10-02 12:47:34.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:34 np0005466012 nova_compute[192063]: 2025-10-02 12:47:34.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:34 np0005466012 nova_compute[192063]: 2025-10-02 12:47:34.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:34 np0005466012 nova_compute[192063]: 2025-10-02 12:47:34.934 2 DEBUG nova.compute.manager [req-6078f8e0-12dc-46f2-b635-7a06c7864371 req-d9e4d362-8abf-4c6d-98b5-71dd3b543b69 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Received event network-changed-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  2 08:47:34 np0005466012 nova_compute[192063]: 2025-10-02 12:47:34.935 2 DEBUG nova.compute.manager [req-6078f8e0-12dc-46f2-b635-7a06c7864371 req-d9e4d362-8abf-4c6d-98b5-71dd3b543b69 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Refreshing instance network info cache due to event network-changed-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  2 08:47:34 np0005466012 nova_compute[192063]: 2025-10-02 12:47:34.935 2 DEBUG oslo_concurrency.lockutils [req-6078f8e0-12dc-46f2-b635-7a06c7864371 req-d9e4d362-8abf-4c6d-98b5-71dd3b543b69 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-e6d26f23-1d78-4f7b-adde-1387f16aaeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  2 08:47:34 np0005466012 nova_compute[192063]: 2025-10-02 12:47:34.936 2 DEBUG oslo_concurrency.lockutils [req-6078f8e0-12dc-46f2-b635-7a06c7864371 req-d9e4d362-8abf-4c6d-98b5-71dd3b543b69 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-e6d26f23-1d78-4f7b-adde-1387f16aaeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  2 08:47:34 np0005466012 nova_compute[192063]: 2025-10-02 12:47:34.936 2 DEBUG nova.network.neutron [req-6078f8e0-12dc-46f2-b635-7a06c7864371 req-d9e4d362-8abf-4c6d-98b5-71dd3b543b69 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Refreshing network info cache for port 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  2 08:47:36 np0005466012 nova_compute[192063]: 2025-10-02 12:47:36.619 2 DEBUG nova.network.neutron [req-6078f8e0-12dc-46f2-b635-7a06c7864371 req-d9e4d362-8abf-4c6d-98b5-71dd3b543b69 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Updated VIF entry in instance network info cache for port 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  2 08:47:36 np0005466012 nova_compute[192063]: 2025-10-02 12:47:36.620 2 DEBUG nova.network.neutron [req-6078f8e0-12dc-46f2-b635-7a06c7864371 req-d9e4d362-8abf-4c6d-98b5-71dd3b543b69 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Updating instance_info_cache with network_info: [{"id": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "address": "fa:16:3e:bf:8c:f1", "network": {"id": "4198d545-60c1-41ba-a17b-a91177c26fff", "bridge": "br-int", "label": "tempest-network-smoke--1086452248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85ae1b5d-fe", "ovs_interfaceid": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  2 08:47:36 np0005466012 nova_compute[192063]: 2025-10-02 12:47:36.682 2 DEBUG oslo_concurrency.lockutils [req-6078f8e0-12dc-46f2-b635-7a06c7864371 req-d9e4d362-8abf-4c6d-98b5-71dd3b543b69 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-e6d26f23-1d78-4f7b-adde-1387f16aaeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  2 08:47:36 np0005466012 nova_compute[192063]: 2025-10-02 12:47:36.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:47:38 np0005466012 podman[252902]: 2025-10-02 12:47:38.154349258 +0000 UTC m=+0.063315594 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 08:47:38 np0005466012 podman[252903]: 2025-10-02 12:47:38.157666068 +0000 UTC m=+0.066495050 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:47:38 np0005466012 podman[252904]: 2025-10-02 12:47:38.192895535 +0000 UTC m=+0.093475363 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 08:47:38 np0005466012 podman[252901]: 2025-10-02 12:47:38.194857859 +0000 UTC m=+0.100597967 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:47:38 np0005466012 nova_compute[192063]: 2025-10-02 12:47:38.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:39 np0005466012 nova_compute[192063]: 2025-10-02 12:47:39.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:43 np0005466012 nova_compute[192063]: 2025-10-02 12:47:43.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:47:43 np0005466012 nova_compute[192063]: 2025-10-02 12:47:43.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:44 np0005466012 ovn_controller[94284]: 2025-10-02T12:47:44Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bf:8c:f1 10.100.0.5
Oct  2 08:47:44 np0005466012 ovn_controller[94284]: 2025-10-02T12:47:44Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bf:8c:f1 10.100.0.5
Oct  2 08:47:44 np0005466012 nova_compute[192063]: 2025-10-02 12:47:44.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:46 np0005466012 nova_compute[192063]: 2025-10-02 12:47:46.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:47:48 np0005466012 nova_compute[192063]: 2025-10-02 12:47:48.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:49 np0005466012 nova_compute[192063]: 2025-10-02 12:47:49.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:51 np0005466012 podman[253004]: 2025-10-02 12:47:51.145917202 +0000 UTC m=+0.062152761 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, config_id=edpm, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Oct  2 08:47:51 np0005466012 podman[253003]: 2025-10-02 12:47:51.151788152 +0000 UTC m=+0.067782005 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:47:52 np0005466012 nova_compute[192063]: 2025-10-02 12:47:52.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:47:52 np0005466012 nova_compute[192063]: 2025-10-02 12:47:52.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:47:53 np0005466012 nova_compute[192063]: 2025-10-02 12:47:53.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:54 np0005466012 nova_compute[192063]: 2025-10-02 12:47:54.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:47:55 np0005466012 podman[253047]: 2025-10-02 12:47:55.139559528 +0000 UTC m=+0.045436667 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:47:55 np0005466012 podman[253046]: 2025-10-02 12:47:55.141551952 +0000 UTC m=+0.051730629 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:47:55 np0005466012 nova_compute[192063]: 2025-10-02 12:47:55.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:47:55 np0005466012 nova_compute[192063]: 2025-10-02 12:47:55.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:47:55 np0005466012 nova_compute[192063]: 2025-10-02 12:47:55.852 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:47:55 np0005466012 nova_compute[192063]: 2025-10-02 12:47:55.852 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:47:55 np0005466012 nova_compute[192063]: 2025-10-02 12:47:55.852 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:47:55 np0005466012 nova_compute[192063]: 2025-10-02 12:47:55.852 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  2 08:47:55 np0005466012 nova_compute[192063]: 2025-10-02 12:47:55.907 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:47:55 np0005466012 nova_compute[192063]: 2025-10-02 12:47:55.966 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  2 08:47:55 np0005466012 nova_compute[192063]: 2025-10-02 12:47:55.967 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.023 2 DEBUG oslo_concurrency.processutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.170 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.171 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5535MB free_disk=73.21586990356445GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.172 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.172 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.269 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Instance e6d26f23-1d78-4f7b-adde-1387f16aaeed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.270 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.270 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.318 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.331 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.353 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.353 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.790 2 DEBUG nova.compute.manager [req-fc78883c-18bf-477b-8044-1b5b4647768d req-4d38df02-42d7-4b79-894f-be8a3e90e068 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Received event network-changed-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.791 2 DEBUG nova.compute.manager [req-fc78883c-18bf-477b-8044-1b5b4647768d req-4d38df02-42d7-4b79-894f-be8a3e90e068 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Refreshing instance network info cache due to event network-changed-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.791 2 DEBUG oslo_concurrency.lockutils [req-fc78883c-18bf-477b-8044-1b5b4647768d req-4d38df02-42d7-4b79-894f-be8a3e90e068 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-e6d26f23-1d78-4f7b-adde-1387f16aaeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.791 2 DEBUG oslo_concurrency.lockutils [req-fc78883c-18bf-477b-8044-1b5b4647768d req-4d38df02-42d7-4b79-894f-be8a3e90e068 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-e6d26f23-1d78-4f7b-adde-1387f16aaeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.791 2 DEBUG nova.network.neutron [req-fc78883c-18bf-477b-8044-1b5b4647768d req-4d38df02-42d7-4b79-894f-be8a3e90e068 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Refreshing network info cache for port 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.875 2 DEBUG oslo_concurrency.lockutils [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.875 2 DEBUG oslo_concurrency.lockutils [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.876 2 DEBUG oslo_concurrency.lockutils [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.876 2 DEBUG oslo_concurrency.lockutils [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.876 2 DEBUG oslo_concurrency.lockutils [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.890 2 INFO nova.compute.manager [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Terminating instance#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.903 2 DEBUG nova.compute.manager [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:47:56 np0005466012 kernel: tap85ae1b5d-fe (unregistering): left promiscuous mode
Oct  2 08:47:56 np0005466012 NetworkManager[51207]: <info>  [1759409276.9334] device (tap85ae1b5d-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:47:56 np0005466012 ovn_controller[94284]: 2025-10-02T12:47:56Z|00751|binding|INFO|Releasing lport 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 from this chassis (sb_readonly=0)
Oct  2 08:47:56 np0005466012 ovn_controller[94284]: 2025-10-02T12:47:56Z|00752|binding|INFO|Setting lport 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 down in Southbound
Oct  2 08:47:56 np0005466012 ovn_controller[94284]: 2025-10-02T12:47:56Z|00753|binding|INFO|Removing iface tap85ae1b5d-fe ovn-installed in OVS
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:56.954 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:8c:f1 10.100.0.5'], port_security=['fa:16:3e:bf:8c:f1 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e6d26f23-1d78-4f7b-adde-1387f16aaeed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4198d545-60c1-41ba-a17b-a91177c26fff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '575f3d227ab24f2daa62e65e14a4cd9c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5f95d8b7-9744-4cdc-a87e-5a8413c44948 ad6e0ec7-e922-4419-8f49-680ad830e0b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=21e7452e-fef8-4fbe-94d5-963a56929892, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=85ae1b5d-fe3a-443e-ade0-fdd2150d9e40) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:47:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:56.955 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 in datapath 4198d545-60c1-41ba-a17b-a91177c26fff unbound from our chassis#033[00m
Oct  2 08:47:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:56.956 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4198d545-60c1-41ba-a17b-a91177c26fff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:47:56 np0005466012 nova_compute[192063]: 2025-10-02 12:47:56.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:56.959 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[8d3fdf3b-c4da-41c0-9fae-2a5a304a7c7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:56.961 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff namespace which is not needed anymore#033[00m
Oct  2 08:47:57 np0005466012 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000b7.scope: Deactivated successfully.
Oct  2 08:47:57 np0005466012 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000b7.scope: Consumed 14.967s CPU time.
Oct  2 08:47:57 np0005466012 systemd-machined[152114]: Machine qemu-82-instance-000000b7 terminated.
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.178 2 INFO nova.virt.libvirt.driver [-] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Instance destroyed successfully.#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.180 2 DEBUG nova.objects.instance [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lazy-loading 'resources' on Instance uuid e6d26f23-1d78-4f7b-adde-1387f16aaeed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:47:57 np0005466012 neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff[252885]: [NOTICE]   (252889) : haproxy version is 2.8.14-c23fe91
Oct  2 08:47:57 np0005466012 neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff[252885]: [NOTICE]   (252889) : path to executable is /usr/sbin/haproxy
Oct  2 08:47:57 np0005466012 neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff[252885]: [WARNING]  (252889) : Exiting Master process...
Oct  2 08:47:57 np0005466012 neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff[252885]: [WARNING]  (252889) : Exiting Master process...
Oct  2 08:47:57 np0005466012 neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff[252885]: [ALERT]    (252889) : Current worker (252891) exited with code 143 (Terminated)
Oct  2 08:47:57 np0005466012 neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff[252885]: [WARNING]  (252889) : All workers exited. Exiting... (0)
Oct  2 08:47:57 np0005466012 systemd[1]: libpod-847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28.scope: Deactivated successfully.
Oct  2 08:47:57 np0005466012 podman[253121]: 2025-10-02 12:47:57.204875448 +0000 UTC m=+0.150387802 container died 847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.218 2 DEBUG nova.virt.libvirt.vif [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:47:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-294961228',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1020134341-access_point-294961228',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1020134341-ac',id=183,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMmIHzI7f5NkeR5wWa61rMo9tbPcN91tFpOGed+RslBJVNKID2goKHiH+DxnBZjgvWNRTusEL4R4MJdMOsVO3CDBF8QprS49jKe89ioFjf2s7IRZnrqEDh4O0tau1eVScQ==',key_name='tempest-TestSecurityGroupsBasicOps-1950075361',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:47:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='575f3d227ab24f2daa62e65e14a4cd9c',ramdisk_id='',reservation_id='r-i96lq4ly',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1020134341',owner_user_name='tempest-TestSecurityGroupsBasicOps-1020134341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:47:28Z,user_data=None,user_id='2d2b4a2da57543ef88e44ae28ad61647',uuid=e6d26f23-1d78-4f7b-adde-1387f16aaeed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "address": "fa:16:3e:bf:8c:f1", "network": {"id": "4198d545-60c1-41ba-a17b-a91177c26fff", "bridge": "br-int", "label": "tempest-network-smoke--1086452248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85ae1b5d-fe", "ovs_interfaceid": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.219 2 DEBUG nova.network.os_vif_util [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converting VIF {"id": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "address": "fa:16:3e:bf:8c:f1", "network": {"id": "4198d545-60c1-41ba-a17b-a91177c26fff", "bridge": "br-int", "label": "tempest-network-smoke--1086452248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85ae1b5d-fe", "ovs_interfaceid": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.220 2 DEBUG nova.network.os_vif_util [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bf:8c:f1,bridge_name='br-int',has_traffic_filtering=True,id=85ae1b5d-fe3a-443e-ade0-fdd2150d9e40,network=Network(4198d545-60c1-41ba-a17b-a91177c26fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85ae1b5d-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.221 2 DEBUG os_vif [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:8c:f1,bridge_name='br-int',has_traffic_filtering=True,id=85ae1b5d-fe3a-443e-ade0-fdd2150d9e40,network=Network(4198d545-60c1-41ba-a17b-a91177c26fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85ae1b5d-fe') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.223 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85ae1b5d-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.230 2 INFO os_vif [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:8c:f1,bridge_name='br-int',has_traffic_filtering=True,id=85ae1b5d-fe3a-443e-ade0-fdd2150d9e40,network=Network(4198d545-60c1-41ba-a17b-a91177c26fff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85ae1b5d-fe')#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.231 2 INFO nova.virt.libvirt.driver [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Deleting instance files /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed_del#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.231 2 INFO nova.virt.libvirt.driver [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Deletion of /var/lib/nova/instances/e6d26f23-1d78-4f7b-adde-1387f16aaeed_del complete#033[00m
Oct  2 08:47:57 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28-userdata-shm.mount: Deactivated successfully.
Oct  2 08:47:57 np0005466012 systemd[1]: var-lib-containers-storage-overlay-7f3b1a05e75c80995f6976703c4b800bd1201718ef70efc43311eecf66b10fc9-merged.mount: Deactivated successfully.
Oct  2 08:47:57 np0005466012 podman[253121]: 2025-10-02 12:47:57.275958661 +0000 UTC m=+0.221471015 container cleanup 847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:47:57 np0005466012 systemd[1]: libpod-conmon-847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28.scope: Deactivated successfully.
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.301 2 INFO nova.compute.manager [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.302 2 DEBUG oslo.service.loopingcall [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.302 2 DEBUG nova.compute.manager [-] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.302 2 DEBUG nova.network.neutron [-] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:47:57 np0005466012 podman[253167]: 2025-10-02 12:47:57.350426227 +0000 UTC m=+0.042545658 container remove 847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:47:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:57.355 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[accf6df3-a2fb-4cdc-9dca-d89074e569d5]: (4, ('Thu Oct  2 12:47:57 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff (847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28)\n847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28\nThu Oct  2 12:47:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff (847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28)\n847a5a4b9935d5fef8fa21c564c9ed57457cfa8a0b70e620c19081a6efd50b28\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:57.357 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[bee81d2a-c694-4c61-970a-85c65c8bd481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:57.358 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4198d545-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:57 np0005466012 kernel: tap4198d545-60: left promiscuous mode
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:57.406 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[056cd840-026a-4f7a-a359-17ea5f4d1e30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:57.439 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4aa19b79-b63d-4cf4-8444-c1c54703ade5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:57.440 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[11a53479-8a23-466c-8dba-0fe8d8d9cd41]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:57.454 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[19d501ad-a63b-422c-b04a-04947c216431]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 723975, 'reachable_time': 42733, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253181, 'error': None, 'target': 'ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:57.458 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4198d545-60c1-41ba-a17b-a91177c26fff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:47:57 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:57.458 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[9f530c9f-8696-4a71-83f7-cf67d885f610]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:47:57 np0005466012 systemd[1]: run-netns-ovnmeta\x2d4198d545\x2d60c1\x2d41ba\x2da17b\x2da91177c26fff.mount: Deactivated successfully.
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.806 2 DEBUG nova.compute.manager [req-d476a97d-c485-4bb6-97e0-c55abb3df139 req-9b85b607-3531-4680-b388-9ab9f0185ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Received event network-vif-unplugged-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.806 2 DEBUG oslo_concurrency.lockutils [req-d476a97d-c485-4bb6-97e0-c55abb3df139 req-9b85b607-3531-4680-b388-9ab9f0185ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.807 2 DEBUG oslo_concurrency.lockutils [req-d476a97d-c485-4bb6-97e0-c55abb3df139 req-9b85b607-3531-4680-b388-9ab9f0185ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.807 2 DEBUG oslo_concurrency.lockutils [req-d476a97d-c485-4bb6-97e0-c55abb3df139 req-9b85b607-3531-4680-b388-9ab9f0185ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.807 2 DEBUG nova.compute.manager [req-d476a97d-c485-4bb6-97e0-c55abb3df139 req-9b85b607-3531-4680-b388-9ab9f0185ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] No waiting events found dispatching network-vif-unplugged-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:47:57 np0005466012 nova_compute[192063]: 2025-10-02 12:47:57.807 2 DEBUG nova.compute.manager [req-d476a97d-c485-4bb6-97e0-c55abb3df139 req-9b85b607-3531-4680-b388-9ab9f0185ed1 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Received event network-vif-unplugged-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:47:58 np0005466012 nova_compute[192063]: 2025-10-02 12:47:58.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.353 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.353 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.354 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.553 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.553 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:47:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:59.675 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:47:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:59.677 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:47:59 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:47:59.679 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.724 2 DEBUG nova.network.neutron [-] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.758 2 INFO nova.compute.manager [-] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Took 2.46 seconds to deallocate network for instance.#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.834 2 DEBUG nova.compute.manager [req-2e91e610-7de0-4f27-becc-8670ca805b6a req-d14b1b22-a9da-4fc8-aff4-92a20909b680 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Received event network-vif-deleted-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.911 2 DEBUG nova.compute.manager [req-8aea0397-0b64-41bc-80be-c428c523fb97 req-6fb16466-a79e-4462-a3f5-280ff6f2cc86 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Received event network-vif-plugged-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.911 2 DEBUG oslo_concurrency.lockutils [req-8aea0397-0b64-41bc-80be-c428c523fb97 req-6fb16466-a79e-4462-a3f5-280ff6f2cc86 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.911 2 DEBUG oslo_concurrency.lockutils [req-8aea0397-0b64-41bc-80be-c428c523fb97 req-6fb16466-a79e-4462-a3f5-280ff6f2cc86 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.911 2 DEBUG oslo_concurrency.lockutils [req-8aea0397-0b64-41bc-80be-c428c523fb97 req-6fb16466-a79e-4462-a3f5-280ff6f2cc86 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.912 2 DEBUG nova.compute.manager [req-8aea0397-0b64-41bc-80be-c428c523fb97 req-6fb16466-a79e-4462-a3f5-280ff6f2cc86 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] No waiting events found dispatching network-vif-plugged-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.912 2 WARNING nova.compute.manager [req-8aea0397-0b64-41bc-80be-c428c523fb97 req-6fb16466-a79e-4462-a3f5-280ff6f2cc86 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Received unexpected event network-vif-plugged-85ae1b5d-fe3a-443e-ade0-fdd2150d9e40 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.918 2 DEBUG oslo_concurrency.lockutils [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.918 2 DEBUG oslo_concurrency.lockutils [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.964 2 DEBUG nova.compute.provider_tree [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:47:59 np0005466012 nova_compute[192063]: 2025-10-02 12:47:59.987 2 DEBUG nova.scheduler.client.report [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:48:00 np0005466012 nova_compute[192063]: 2025-10-02 12:48:00.021 2 DEBUG oslo_concurrency.lockutils [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:00 np0005466012 nova_compute[192063]: 2025-10-02 12:48:00.092 2 INFO nova.scheduler.client.report [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Deleted allocations for instance e6d26f23-1d78-4f7b-adde-1387f16aaeed#033[00m
Oct  2 08:48:00 np0005466012 nova_compute[192063]: 2025-10-02 12:48:00.132 2 DEBUG nova.network.neutron [req-fc78883c-18bf-477b-8044-1b5b4647768d req-4d38df02-42d7-4b79-894f-be8a3e90e068 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Updated VIF entry in instance network info cache for port 85ae1b5d-fe3a-443e-ade0-fdd2150d9e40. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:48:00 np0005466012 nova_compute[192063]: 2025-10-02 12:48:00.133 2 DEBUG nova.network.neutron [req-fc78883c-18bf-477b-8044-1b5b4647768d req-4d38df02-42d7-4b79-894f-be8a3e90e068 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Updating instance_info_cache with network_info: [{"id": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "address": "fa:16:3e:bf:8c:f1", "network": {"id": "4198d545-60c1-41ba-a17b-a91177c26fff", "bridge": "br-int", "label": "tempest-network-smoke--1086452248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "575f3d227ab24f2daa62e65e14a4cd9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85ae1b5d-fe", "ovs_interfaceid": "85ae1b5d-fe3a-443e-ade0-fdd2150d9e40", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:48:00 np0005466012 nova_compute[192063]: 2025-10-02 12:48:00.209 2 DEBUG oslo_concurrency.lockutils [req-fc78883c-18bf-477b-8044-1b5b4647768d req-4d38df02-42d7-4b79-894f-be8a3e90e068 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-e6d26f23-1d78-4f7b-adde-1387f16aaeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:48:00 np0005466012 nova_compute[192063]: 2025-10-02 12:48:00.218 2 DEBUG oslo_concurrency.lockutils [None req-62d8d51f-5051-432a-abc8-7dbc0a075482 2d2b4a2da57543ef88e44ae28ad61647 575f3d227ab24f2daa62e65e14a4cd9c - - default default] Lock "e6d26f23-1d78-4f7b-adde-1387f16aaeed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.343s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:01 np0005466012 nova_compute[192063]: 2025-10-02 12:48:01.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:01 np0005466012 nova_compute[192063]: 2025-10-02 12:48:01.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:48:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:48:02.163 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:48:02.163 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:48:02.164 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:02 np0005466012 nova_compute[192063]: 2025-10-02 12:48:02.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:03 np0005466012 nova_compute[192063]: 2025-10-02 12:48:03.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:05 np0005466012 nova_compute[192063]: 2025-10-02 12:48:05.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:05 np0005466012 nova_compute[192063]: 2025-10-02 12:48:05.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:07 np0005466012 nova_compute[192063]: 2025-10-02 12:48:07.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:08 np0005466012 nova_compute[192063]: 2025-10-02 12:48:08.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:09 np0005466012 podman[253185]: 2025-10-02 12:48:09.149898735 +0000 UTC m=+0.060116366 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:48:09 np0005466012 podman[253183]: 2025-10-02 12:48:09.157631025 +0000 UTC m=+0.069957104 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm)
Oct  2 08:48:09 np0005466012 podman[253184]: 2025-10-02 12:48:09.1578025 +0000 UTC m=+0.072684648 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, 
config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:48:09 np0005466012 podman[253186]: 2025-10-02 12:48:09.179933782 +0000 UTC m=+0.085503197 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001)
Oct  2 08:48:12 np0005466012 nova_compute[192063]: 2025-10-02 12:48:12.175 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409277.1738827, e6d26f23-1d78-4f7b-adde-1387f16aaeed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:48:12 np0005466012 nova_compute[192063]: 2025-10-02 12:48:12.176 2 INFO nova.compute.manager [-] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:48:12 np0005466012 nova_compute[192063]: 2025-10-02 12:48:12.208 2 DEBUG nova.compute.manager [None req-e605a2ec-34c4-489a-9c21-00a872a5e497 - - - - - -] [instance: e6d26f23-1d78-4f7b-adde-1387f16aaeed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:48:12 np0005466012 nova_compute[192063]: 2025-10-02 12:48:12.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:13 np0005466012 nova_compute[192063]: 2025-10-02 12:48:13.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:17 np0005466012 nova_compute[192063]: 2025-10-02 12:48:17.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:18 np0005466012 nova_compute[192063]: 2025-10-02 12:48:18.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:22 np0005466012 podman[253266]: 2025-10-02 12:48:22.141044578 +0000 UTC m=+0.059450208 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 08:48:22 np0005466012 podman[253267]: 2025-10-02 12:48:22.172411152 +0000 UTC m=+0.080970274 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, distribution-scope=public, version=9.6, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350)
Oct  2 08:48:22 np0005466012 nova_compute[192063]: 2025-10-02 12:48:22.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:23 np0005466012 nova_compute[192063]: 2025-10-02 12:48:23.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:26 np0005466012 podman[253305]: 2025-10-02 12:48:26.1566379 +0000 UTC m=+0.074975971 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:48:26 np0005466012 podman[253304]: 2025-10-02 12:48:26.171665639 +0000 UTC m=+0.083933345 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:48:27 np0005466012 nova_compute[192063]: 2025-10-02 12:48:27.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:28 np0005466012 nova_compute[192063]: 2025-10-02 12:48:28.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:32 np0005466012 nova_compute[192063]: 2025-10-02 12:48:32.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:33 np0005466012 nova_compute[192063]: 2025-10-02 12:48:33.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:36 np0005466012 nova_compute[192063]: 2025-10-02 12:48:36.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:37 np0005466012 nova_compute[192063]: 2025-10-02 12:48:37.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:38 np0005466012 nova_compute[192063]: 2025-10-02 12:48:38.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:40 np0005466012 podman[253347]: 2025-10-02 12:48:40.131974185 +0000 UTC m=+0.048016128 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  2 08:48:40 np0005466012 podman[253346]: 2025-10-02 12:48:40.132496609 +0000 UTC m=+0.052365576 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  2 08:48:40 np0005466012 podman[253348]: 2025-10-02 12:48:40.164618482 +0000 UTC m=+0.076785369 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:48:40 np0005466012 podman[253349]: 2025-10-02 12:48:40.176379132 +0000 UTC m=+0.084078398 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:48:41 np0005466012 ovn_controller[94284]: 2025-10-02T12:48:41Z|00754|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  2 08:48:42 np0005466012 nova_compute[192063]: 2025-10-02 12:48:42.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:43 np0005466012 nova_compute[192063]: 2025-10-02 12:48:43.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:44 np0005466012 nova_compute[192063]: 2025-10-02 12:48:44.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:47 np0005466012 nova_compute[192063]: 2025-10-02 12:48:47.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:48 np0005466012 nova_compute[192063]: 2025-10-02 12:48:48.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:48 np0005466012 nova_compute[192063]: 2025-10-02 12:48:48.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:52 np0005466012 nova_compute[192063]: 2025-10-02 12:48:52.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:52 np0005466012 nova_compute[192063]: 2025-10-02 12:48:52.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:53 np0005466012 podman[253434]: 2025-10-02 12:48:53.154384688 +0000 UTC m=+0.063192309 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:48:53 np0005466012 podman[253435]: 2025-10-02 12:48:53.169051597 +0000 UTC m=+0.083405339 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git)
Oct  2 08:48:53 np0005466012 nova_compute[192063]: 2025-10-02 12:48:53.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:54 np0005466012 nova_compute[192063]: 2025-10-02 12:48:54.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:55 np0005466012 nova_compute[192063]: 2025-10-02 12:48:55.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:55 np0005466012 nova_compute[192063]: 2025-10-02 12:48:55.847 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:55 np0005466012 nova_compute[192063]: 2025-10-02 12:48:55.847 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:55 np0005466012 nova_compute[192063]: 2025-10-02 12:48:55.848 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:55 np0005466012 nova_compute[192063]: 2025-10-02 12:48:55.848 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:48:56 np0005466012 nova_compute[192063]: 2025-10-02 12:48:56.007 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:48:56 np0005466012 nova_compute[192063]: 2025-10-02 12:48:56.008 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5733MB free_disk=73.24486923217773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:48:56 np0005466012 nova_compute[192063]: 2025-10-02 12:48:56.008 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:48:56 np0005466012 nova_compute[192063]: 2025-10-02 12:48:56.008 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:48:56 np0005466012 nova_compute[192063]: 2025-10-02 12:48:56.079 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:48:56 np0005466012 nova_compute[192063]: 2025-10-02 12:48:56.079 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:48:56 np0005466012 nova_compute[192063]: 2025-10-02 12:48:56.105 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:48:56 np0005466012 nova_compute[192063]: 2025-10-02 12:48:56.130 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:48:56 np0005466012 nova_compute[192063]: 2025-10-02 12:48:56.167 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:48:56 np0005466012 nova_compute[192063]: 2025-10-02 12:48:56.168 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:48:57 np0005466012 podman[253476]: 2025-10-02 12:48:57.13505407 +0000 UTC m=+0.054326069 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:48:57 np0005466012 podman[253477]: 2025-10-02 12:48:57.15380977 +0000 UTC m=+0.063854648 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:48:57 np0005466012 nova_compute[192063]: 2025-10-02 12:48:57.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:58 np0005466012 nova_compute[192063]: 2025-10-02 12:48:58.168 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:58 np0005466012 nova_compute[192063]: 2025-10-02 12:48:58.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:48:59 np0005466012 nova_compute[192063]: 2025-10-02 12:48:59.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:48:59 np0005466012 nova_compute[192063]: 2025-10-02 12:48:59.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:48:59 np0005466012 nova_compute[192063]: 2025-10-02 12:48:59.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:48:59 np0005466012 nova_compute[192063]: 2025-10-02 12:48:59.848 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:49:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:49:02.163 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:49:02.164 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:49:02.164 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:02 np0005466012 nova_compute[192063]: 2025-10-02 12:49:02.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:02 np0005466012 nova_compute[192063]: 2025-10-02 12:49:02.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:02 np0005466012 nova_compute[192063]: 2025-10-02 12:49:02.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:49:03 np0005466012 nova_compute[192063]: 2025-10-02 12:49:03.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:49:05.517 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:49:05 np0005466012 nova_compute[192063]: 2025-10-02 12:49:05.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:05 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:49:05.519 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:49:05 np0005466012 nova_compute[192063]: 2025-10-02 12:49:05.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:07 np0005466012 nova_compute[192063]: 2025-10-02 12:49:07.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:08 np0005466012 nova_compute[192063]: 2025-10-02 12:49:08.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:49:10.521 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:49:11 np0005466012 podman[253517]: 2025-10-02 12:49:11.130530443 +0000 UTC m=+0.047899514 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:49:11 np0005466012 podman[253516]: 2025-10-02 12:49:11.134568923 +0000 UTC m=+0.055675345 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm)
Oct  2 08:49:11 np0005466012 podman[253518]: 2025-10-02 12:49:11.157645201 +0000 UTC m=+0.072636277 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:49:11 np0005466012 podman[253520]: 2025-10-02 12:49:11.163678205 +0000 UTC m=+0.076201064 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Oct  2 08:49:12 np0005466012 nova_compute[192063]: 2025-10-02 12:49:12.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:13 np0005466012 nova_compute[192063]: 2025-10-02 12:49:13.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:49:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:49:17 np0005466012 nova_compute[192063]: 2025-10-02 12:49:17.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:18 np0005466012 nova_compute[192063]: 2025-10-02 12:49:18.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:22 np0005466012 nova_compute[192063]: 2025-10-02 12:49:22.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:23 np0005466012 nova_compute[192063]: 2025-10-02 12:49:23.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:24 np0005466012 podman[253599]: 2025-10-02 12:49:24.145227635 +0000 UTC m=+0.059025636 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-type=git)
Oct  2 08:49:24 np0005466012 podman[253598]: 2025-10-02 12:49:24.145108842 +0000 UTC m=+0.062076029 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  2 08:49:27 np0005466012 nova_compute[192063]: 2025-10-02 12:49:27.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:28 np0005466012 podman[253637]: 2025-10-02 12:49:28.172832764 +0000 UTC m=+0.090675168 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:49:28 np0005466012 podman[253638]: 2025-10-02 12:49:28.178635242 +0000 UTC m=+0.091092080 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:49:28 np0005466012 nova_compute[192063]: 2025-10-02 12:49:28.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:32 np0005466012 nova_compute[192063]: 2025-10-02 12:49:32.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:33 np0005466012 nova_compute[192063]: 2025-10-02 12:49:33.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:37 np0005466012 nova_compute[192063]: 2025-10-02 12:49:37.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:37 np0005466012 nova_compute[192063]: 2025-10-02 12:49:37.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:38 np0005466012 nova_compute[192063]: 2025-10-02 12:49:38.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:42 np0005466012 podman[253684]: 2025-10-02 12:49:42.141360474 +0000 UTC m=+0.049375134 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:49:42 np0005466012 podman[253683]: 2025-10-02 12:49:42.154636056 +0000 UTC m=+0.063351095 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251001, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:49:42 np0005466012 podman[253685]: 2025-10-02 12:49:42.181114546 +0000 UTC m=+0.085631220 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:49:42 np0005466012 podman[253686]: 2025-10-02 12:49:42.210487035 +0000 UTC m=+0.112749309 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:49:42 np0005466012 nova_compute[192063]: 2025-10-02 12:49:42.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:43 np0005466012 nova_compute[192063]: 2025-10-02 12:49:43.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:45 np0005466012 nova_compute[192063]: 2025-10-02 12:49:45.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:47 np0005466012 nova_compute[192063]: 2025-10-02 12:49:47.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:48 np0005466012 nova_compute[192063]: 2025-10-02 12:49:48.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:48 np0005466012 nova_compute[192063]: 2025-10-02 12:49:48.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:52 np0005466012 nova_compute[192063]: 2025-10-02 12:49:52.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:52 np0005466012 nova_compute[192063]: 2025-10-02 12:49:52.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:53 np0005466012 nova_compute[192063]: 2025-10-02 12:49:53.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:53 np0005466012 nova_compute[192063]: 2025-10-02 12:49:53.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:49:54 np0005466012 nova_compute[192063]: 2025-10-02 12:49:53.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:54 np0005466012 nova_compute[192063]: 2025-10-02 12:49:54.857 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:55 np0005466012 podman[253768]: 2025-10-02 12:49:55.13263879 +0000 UTC m=+0.052220581 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:49:55 np0005466012 podman[253769]: 2025-10-02 12:49:55.14148975 +0000 UTC m=+0.057076593 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  2 08:49:56 np0005466012 nova_compute[192063]: 2025-10-02 12:49:56.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:56 np0005466012 nova_compute[192063]: 2025-10-02 12:49:56.854 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:56 np0005466012 nova_compute[192063]: 2025-10-02 12:49:56.855 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:56 np0005466012 nova_compute[192063]: 2025-10-02 12:49:56.855 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:56 np0005466012 nova_compute[192063]: 2025-10-02 12:49:56.855 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:49:57 np0005466012 nova_compute[192063]: 2025-10-02 12:49:57.000 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:49:57 np0005466012 nova_compute[192063]: 2025-10-02 12:49:57.001 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5738MB free_disk=73.24534225463867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:49:57 np0005466012 nova_compute[192063]: 2025-10-02 12:49:57.001 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:49:57 np0005466012 nova_compute[192063]: 2025-10-02 12:49:57.001 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:49:57 np0005466012 nova_compute[192063]: 2025-10-02 12:49:57.091 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:49:57 np0005466012 nova_compute[192063]: 2025-10-02 12:49:57.092 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:49:57 np0005466012 nova_compute[192063]: 2025-10-02 12:49:57.116 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:49:57 np0005466012 nova_compute[192063]: 2025-10-02 12:49:57.139 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:49:57 np0005466012 nova_compute[192063]: 2025-10-02 12:49:57.140 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:49:57 np0005466012 nova_compute[192063]: 2025-10-02 12:49:57.140 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:49:57 np0005466012 nova_compute[192063]: 2025-10-02 12:49:57.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:59 np0005466012 nova_compute[192063]: 2025-10-02 12:49:59.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:49:59 np0005466012 nova_compute[192063]: 2025-10-02 12:49:59.140 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:49:59 np0005466012 podman[253809]: 2025-10-02 12:49:59.1613901 +0000 UTC m=+0.082682830 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid)
Oct  2 08:49:59 np0005466012 podman[253810]: 2025-10-02 12:49:59.161428522 +0000 UTC m=+0.079600747 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:50:00 np0005466012 nova_compute[192063]: 2025-10-02 12:50:00.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:00 np0005466012 nova_compute[192063]: 2025-10-02 12:50:00.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:50:00 np0005466012 nova_compute[192063]: 2025-10-02 12:50:00.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:50:01 np0005466012 nova_compute[192063]: 2025-10-02 12:50:01.199 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:50:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:50:02.165 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:50:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:50:02.165 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:50:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:50:02.165 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:50:02 np0005466012 nova_compute[192063]: 2025-10-02 12:50:02.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:03 np0005466012 nova_compute[192063]: 2025-10-02 12:50:03.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:03 np0005466012 nova_compute[192063]: 2025-10-02 12:50:03.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:50:04 np0005466012 nova_compute[192063]: 2025-10-02 12:50:04.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:07 np0005466012 nova_compute[192063]: 2025-10-02 12:50:07.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:09 np0005466012 nova_compute[192063]: 2025-10-02 12:50:09.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:12 np0005466012 nova_compute[192063]: 2025-10-02 12:50:12.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:13 np0005466012 podman[253854]: 2025-10-02 12:50:13.138598756 +0000 UTC m=+0.056177789 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:50:13 np0005466012 podman[253853]: 2025-10-02 12:50:13.144836266 +0000 UTC m=+0.064464655 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001)
Oct  2 08:50:13 np0005466012 podman[253855]: 2025-10-02 12:50:13.166563647 +0000 UTC m=+0.080264575 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:50:13 np0005466012 podman[253856]: 2025-10-02 12:50:13.188452683 +0000 UTC m=+0.093552597 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Oct  2 08:50:14 np0005466012 nova_compute[192063]: 2025-10-02 12:50:14.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:14 np0005466012 nova_compute[192063]: 2025-10-02 12:50:14.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:50:14.436 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:50:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:50:14.437 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:50:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:50:15.439 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:50:17 np0005466012 nova_compute[192063]: 2025-10-02 12:50:17.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:18 np0005466012 nova_compute[192063]: 2025-10-02 12:50:18.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:18 np0005466012 nova_compute[192063]: 2025-10-02 12:50:18.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:50:18 np0005466012 nova_compute[192063]: 2025-10-02 12:50:18.931 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:50:19 np0005466012 nova_compute[192063]: 2025-10-02 12:50:19.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:22 np0005466012 nova_compute[192063]: 2025-10-02 12:50:22.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:24 np0005466012 nova_compute[192063]: 2025-10-02 12:50:24.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:26 np0005466012 podman[253937]: 2025-10-02 12:50:26.163672111 +0000 UTC m=+0.075410632 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:50:26 np0005466012 podman[253938]: 2025-10-02 12:50:26.173279073 +0000 UTC m=+0.072985616 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Oct  2 08:50:27 np0005466012 nova_compute[192063]: 2025-10-02 12:50:27.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:29 np0005466012 nova_compute[192063]: 2025-10-02 12:50:29.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:30 np0005466012 podman[253976]: 2025-10-02 12:50:30.128640446 +0000 UTC m=+0.040667538 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:50:30 np0005466012 podman[253975]: 2025-10-02 12:50:30.135530133 +0000 UTC m=+0.049880488 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:50:32 np0005466012 nova_compute[192063]: 2025-10-02 12:50:32.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:34 np0005466012 nova_compute[192063]: 2025-10-02 12:50:34.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:37 np0005466012 nova_compute[192063]: 2025-10-02 12:50:37.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:37 np0005466012 nova_compute[192063]: 2025-10-02 12:50:37.931 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:39 np0005466012 nova_compute[192063]: 2025-10-02 12:50:39.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:42 np0005466012 nova_compute[192063]: 2025-10-02 12:50:42.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:44 np0005466012 nova_compute[192063]: 2025-10-02 12:50:44.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:44 np0005466012 podman[254019]: 2025-10-02 12:50:44.145242464 +0000 UTC m=+0.061339260 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:50:44 np0005466012 podman[254020]: 2025-10-02 12:50:44.166740979 +0000 UTC m=+0.080021508 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:50:44 np0005466012 podman[254021]: 2025-10-02 12:50:44.185349864 +0000 UTC m=+0.094966293 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:50:44 np0005466012 podman[254018]: 2025-10-02 12:50:44.192029957 +0000 UTC m=+0.111202347 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:50:46 np0005466012 nova_compute[192063]: 2025-10-02 12:50:46.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:47 np0005466012 nova_compute[192063]: 2025-10-02 12:50:47.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:48 np0005466012 nova_compute[192063]: 2025-10-02 12:50:48.928 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:49 np0005466012 nova_compute[192063]: 2025-10-02 12:50:49.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:49 np0005466012 nova_compute[192063]: 2025-10-02 12:50:49.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:52 np0005466012 nova_compute[192063]: 2025-10-02 12:50:52.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:53 np0005466012 nova_compute[192063]: 2025-10-02 12:50:53.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:54 np0005466012 nova_compute[192063]: 2025-10-02 12:50:54.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:56 np0005466012 nova_compute[192063]: 2025-10-02 12:50:56.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:56 np0005466012 nova_compute[192063]: 2025-10-02 12:50:56.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:50:56 np0005466012 nova_compute[192063]: 2025-10-02 12:50:56.974 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:50:56 np0005466012 nova_compute[192063]: 2025-10-02 12:50:56.974 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:50:56 np0005466012 nova_compute[192063]: 2025-10-02 12:50:56.974 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:50:56 np0005466012 nova_compute[192063]: 2025-10-02 12:50:56.975 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:50:57 np0005466012 nova_compute[192063]: 2025-10-02 12:50:57.126 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:50:57 np0005466012 nova_compute[192063]: 2025-10-02 12:50:57.127 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5736MB free_disk=73.24534225463867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:50:57 np0005466012 nova_compute[192063]: 2025-10-02 12:50:57.127 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:50:57 np0005466012 nova_compute[192063]: 2025-10-02 12:50:57.128 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:50:57 np0005466012 podman[254104]: 2025-10-02 12:50:57.148493687 +0000 UTC m=+0.060890178 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 08:50:57 np0005466012 podman[254105]: 2025-10-02 12:50:57.155794546 +0000 UTC m=+0.062596735 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, container_name=openstack_network_exporter)
Oct  2 08:50:57 np0005466012 nova_compute[192063]: 2025-10-02 12:50:57.212 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:50:57 np0005466012 nova_compute[192063]: 2025-10-02 12:50:57.212 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:50:57 np0005466012 nova_compute[192063]: 2025-10-02 12:50:57.295 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:50:57 np0005466012 nova_compute[192063]: 2025-10-02 12:50:57.318 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:50:57 np0005466012 nova_compute[192063]: 2025-10-02 12:50:57.319 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:50:57 np0005466012 nova_compute[192063]: 2025-10-02 12:50:57.320 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:50:57 np0005466012 nova_compute[192063]: 2025-10-02 12:50:57.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:59 np0005466012 nova_compute[192063]: 2025-10-02 12:50:59.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:50:59 np0005466012 nova_compute[192063]: 2025-10-02 12:50:59.320 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:00 np0005466012 nova_compute[192063]: 2025-10-02 12:51:00.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:00 np0005466012 nova_compute[192063]: 2025-10-02 12:51:00.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:51:00 np0005466012 nova_compute[192063]: 2025-10-02 12:51:00.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:51:01 np0005466012 podman[254145]: 2025-10-02 12:51:01.123230618 +0000 UTC m=+0.040571385 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:51:01 np0005466012 podman[254144]: 2025-10-02 12:51:01.129611221 +0000 UTC m=+0.050654649 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible)
Oct  2 08:51:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:51:02.166 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:51:02.167 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:51:02.167 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:02 np0005466012 nova_compute[192063]: 2025-10-02 12:51:02.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:04 np0005466012 nova_compute[192063]: 2025-10-02 12:51:04.000 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:51:04 np0005466012 nova_compute[192063]: 2025-10-02 12:51:04.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:05 np0005466012 nova_compute[192063]: 2025-10-02 12:51:05.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:05 np0005466012 nova_compute[192063]: 2025-10-02 12:51:05.821 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:51:06 np0005466012 nova_compute[192063]: 2025-10-02 12:51:06.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:07 np0005466012 nova_compute[192063]: 2025-10-02 12:51:07.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:09 np0005466012 nova_compute[192063]: 2025-10-02 12:51:09.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:12 np0005466012 nova_compute[192063]: 2025-10-02 12:51:12.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:14 np0005466012 nova_compute[192063]: 2025-10-02 12:51:14.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:15 np0005466012 podman[254188]: 2025-10-02 12:51:15.152639832 +0000 UTC m=+0.066394718 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, org.label-schema.license=GPLv2)
Oct  2 08:51:15 np0005466012 podman[254189]: 2025-10-02 12:51:15.156461395 +0000 UTC m=+0.063524929 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:51:15 np0005466012 podman[254191]: 2025-10-02 12:51:15.184995761 +0000 UTC m=+0.087666385 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:51:15 np0005466012 podman[254190]: 2025-10-02 12:51:15.189771452 +0000 UTC m=+0.083949685 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:51:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:51:17 np0005466012 nova_compute[192063]: 2025-10-02 12:51:17.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:19 np0005466012 nova_compute[192063]: 2025-10-02 12:51:19.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:22 np0005466012 nova_compute[192063]: 2025-10-02 12:51:22.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:24 np0005466012 nova_compute[192063]: 2025-10-02 12:51:24.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:27 np0005466012 nova_compute[192063]: 2025-10-02 12:51:27.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:28 np0005466012 podman[254272]: 2025-10-02 12:51:28.152626535 +0000 UTC m=+0.064524286 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:51:28 np0005466012 podman[254273]: 2025-10-02 12:51:28.154923368 +0000 UTC m=+0.061675709 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm, architecture=x86_64, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6)
Oct  2 08:51:29 np0005466012 nova_compute[192063]: 2025-10-02 12:51:29.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:32 np0005466012 podman[254315]: 2025-10-02 12:51:32.152938231 +0000 UTC m=+0.063363094 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:51:32 np0005466012 podman[254314]: 2025-10-02 12:51:32.161490023 +0000 UTC m=+0.073345765 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 08:51:32 np0005466012 nova_compute[192063]: 2025-10-02 12:51:32.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:34 np0005466012 nova_compute[192063]: 2025-10-02 12:51:34.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:51:37.273 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:51:37 np0005466012 nova_compute[192063]: 2025-10-02 12:51:37.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:37 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:51:37.274 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:51:37 np0005466012 nova_compute[192063]: 2025-10-02 12:51:37.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:39 np0005466012 nova_compute[192063]: 2025-10-02 12:51:39.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:39 np0005466012 nova_compute[192063]: 2025-10-02 12:51:39.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:42 np0005466012 nova_compute[192063]: 2025-10-02 12:51:42.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:44 np0005466012 nova_compute[192063]: 2025-10-02 12:51:44.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:44 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:51:44.276 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:51:46 np0005466012 podman[254353]: 2025-10-02 12:51:46.146193594 +0000 UTC m=+0.067600701 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:51:46 np0005466012 podman[254355]: 2025-10-02 12:51:46.179076663 +0000 UTC m=+0.088938190 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:51:46 np0005466012 podman[254354]: 2025-10-02 12:51:46.179077884 +0000 UTC m=+0.092794758 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:51:46 np0005466012 podman[254356]: 2025-10-02 12:51:46.191273991 +0000 UTC m=+0.099151223 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:51:47 np0005466012 nova_compute[192063]: 2025-10-02 12:51:47.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:49 np0005466012 nova_compute[192063]: 2025-10-02 12:51:49.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:49 np0005466012 nova_compute[192063]: 2025-10-02 12:51:49.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:50 np0005466012 nova_compute[192063]: 2025-10-02 12:51:50.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:52 np0005466012 nova_compute[192063]: 2025-10-02 12:51:52.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:54 np0005466012 nova_compute[192063]: 2025-10-02 12:51:54.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:54 np0005466012 nova_compute[192063]: 2025-10-02 12:51:54.816 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:56 np0005466012 nova_compute[192063]: 2025-10-02 12:51:56.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:56 np0005466012 nova_compute[192063]: 2025-10-02 12:51:56.907 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:56 np0005466012 nova_compute[192063]: 2025-10-02 12:51:56.908 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:56 np0005466012 nova_compute[192063]: 2025-10-02 12:51:56.908 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:56 np0005466012 nova_compute[192063]: 2025-10-02 12:51:56.908 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.039 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.040 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5729MB free_disk=73.24532699584961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.040 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.041 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.108 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.109 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.277 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.369 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.370 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.406 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.442 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.475 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.505 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.508 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.508 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.468s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:51:57 np0005466012 nova_compute[192063]: 2025-10-02 12:51:57.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:59 np0005466012 podman[254444]: 2025-10-02 12:51:59.146644038 +0000 UTC m=+0.059890867 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7)
Oct  2 08:51:59 np0005466012 podman[254443]: 2025-10-02 12:51:59.184480135 +0000 UTC m=+0.097982461 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:51:59 np0005466012 nova_compute[192063]: 2025-10-02 12:51:59.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:51:59 np0005466012 nova_compute[192063]: 2025-10-02 12:51:59.509 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:51:59 np0005466012 nova_compute[192063]: 2025-10-02 12:51:59.509 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:00 np0005466012 nova_compute[192063]: 2025-10-02 12:52:00.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:00 np0005466012 nova_compute[192063]: 2025-10-02 12:52:00.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:52:00 np0005466012 nova_compute[192063]: 2025-10-02 12:52:00.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:52:00 np0005466012 nova_compute[192063]: 2025-10-02 12:52:00.887 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:52:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:52:02.168 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:52:02.168 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:52:02.169 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:02 np0005466012 nova_compute[192063]: 2025-10-02 12:52:02.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:03 np0005466012 podman[254486]: 2025-10-02 12:52:03.138311247 +0000 UTC m=+0.059333342 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:52:03 np0005466012 podman[254487]: 2025-10-02 12:52:03.13954026 +0000 UTC m=+0.052565794 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:52:04 np0005466012 nova_compute[192063]: 2025-10-02 12:52:04.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:05 np0005466012 nova_compute[192063]: 2025-10-02 12:52:05.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:05 np0005466012 nova_compute[192063]: 2025-10-02 12:52:05.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:52:07 np0005466012 nova_compute[192063]: 2025-10-02 12:52:07.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:09 np0005466012 nova_compute[192063]: 2025-10-02 12:52:09.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:12 np0005466012 nova_compute[192063]: 2025-10-02 12:52:12.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:14 np0005466012 nova_compute[192063]: 2025-10-02 12:52:14.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:17 np0005466012 podman[254542]: 2025-10-02 12:52:17.14760054 +0000 UTC m=+0.056338159 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:52:17 np0005466012 podman[254548]: 2025-10-02 12:52:17.165400472 +0000 UTC m=+0.067139188 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:52:17 np0005466012 podman[254541]: 2025-10-02 12:52:17.173782114 +0000 UTC m=+0.088169749 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:52:17 np0005466012 podman[254554]: 2025-10-02 12:52:17.223640993 +0000 UTC m=+0.118058386 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:52:17 np0005466012 nova_compute[192063]: 2025-10-02 12:52:17.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:19 np0005466012 nova_compute[192063]: 2025-10-02 12:52:19.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:22 np0005466012 nova_compute[192063]: 2025-10-02 12:52:22.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:24 np0005466012 nova_compute[192063]: 2025-10-02 12:52:24.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:25 np0005466012 nova_compute[192063]: 2025-10-02 12:52:25.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:52:25.481 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:52:25 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:52:25.483 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:52:27 np0005466012 nova_compute[192063]: 2025-10-02 12:52:27.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:29 np0005466012 nova_compute[192063]: 2025-10-02 12:52:29.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:30 np0005466012 podman[254622]: 2025-10-02 12:52:30.138910374 +0000 UTC m=+0.051914727 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Oct  2 08:52:30 np0005466012 podman[254621]: 2025-10-02 12:52:30.150457973 +0000 UTC m=+0.067506608 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:52:31 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:52:31.485 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:52:32 np0005466012 nova_compute[192063]: 2025-10-02 12:52:32.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:34 np0005466012 podman[254662]: 2025-10-02 12:52:34.147538891 +0000 UTC m=+0.062390056 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:52:34 np0005466012 podman[254661]: 2025-10-02 12:52:34.183304741 +0000 UTC m=+0.101193140 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:52:34 np0005466012 nova_compute[192063]: 2025-10-02 12:52:34.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:37 np0005466012 nova_compute[192063]: 2025-10-02 12:52:37.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:39 np0005466012 nova_compute[192063]: 2025-10-02 12:52:39.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:40 np0005466012 nova_compute[192063]: 2025-10-02 12:52:40.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:42 np0005466012 nova_compute[192063]: 2025-10-02 12:52:42.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:44 np0005466012 nova_compute[192063]: 2025-10-02 12:52:44.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:47 np0005466012 nova_compute[192063]: 2025-10-02 12:52:47.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:48 np0005466012 podman[254706]: 2025-10-02 12:52:48.151429996 +0000 UTC m=+0.050292912 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible)
Oct  2 08:52:48 np0005466012 podman[254705]: 2025-10-02 12:52:48.151518278 +0000 UTC m=+0.055067353 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 08:52:48 np0005466012 podman[254707]: 2025-10-02 12:52:48.156391983 +0000 UTC m=+0.053901251 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:52:48 np0005466012 podman[254708]: 2025-10-02 12:52:48.189475838 +0000 UTC m=+0.077736811 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:52:49 np0005466012 nova_compute[192063]: 2025-10-02 12:52:49.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:50 np0005466012 nova_compute[192063]: 2025-10-02 12:52:50.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:51 np0005466012 nova_compute[192063]: 2025-10-02 12:52:51.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:52 np0005466012 nova_compute[192063]: 2025-10-02 12:52:52.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:54 np0005466012 nova_compute[192063]: 2025-10-02 12:52:54.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:56 np0005466012 nova_compute[192063]: 2025-10-02 12:52:56.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:56 np0005466012 nova_compute[192063]: 2025-10-02 12:52:56.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:52:57 np0005466012 nova_compute[192063]: 2025-10-02 12:52:57.086 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:57 np0005466012 nova_compute[192063]: 2025-10-02 12:52:57.087 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:57 np0005466012 nova_compute[192063]: 2025-10-02 12:52:57.087 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:57 np0005466012 nova_compute[192063]: 2025-10-02 12:52:57.087 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:52:57 np0005466012 nova_compute[192063]: 2025-10-02 12:52:57.217 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:52:57 np0005466012 nova_compute[192063]: 2025-10-02 12:52:57.218 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5743MB free_disk=73.22969818115234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:52:57 np0005466012 nova_compute[192063]: 2025-10-02 12:52:57.218 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:52:57 np0005466012 nova_compute[192063]: 2025-10-02 12:52:57.219 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:52:57 np0005466012 nova_compute[192063]: 2025-10-02 12:52:57.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:52:58 np0005466012 nova_compute[192063]: 2025-10-02 12:52:58.230 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:52:58 np0005466012 nova_compute[192063]: 2025-10-02 12:52:58.230 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:52:58 np0005466012 nova_compute[192063]: 2025-10-02 12:52:58.291 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:52:58 np0005466012 nova_compute[192063]: 2025-10-02 12:52:58.428 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:52:58 np0005466012 nova_compute[192063]: 2025-10-02 12:52:58.430 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:52:58 np0005466012 nova_compute[192063]: 2025-10-02 12:52:58.430 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:52:59 np0005466012 nova_compute[192063]: 2025-10-02 12:52:59.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:00 np0005466012 nova_compute[192063]: 2025-10-02 12:53:00.431 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:00 np0005466012 nova_compute[192063]: 2025-10-02 12:53:00.432 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:01 np0005466012 podman[254792]: 2025-10-02 12:53:01.162653739 +0000 UTC m=+0.076376643 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7)
Oct  2 08:53:01 np0005466012 podman[254791]: 2025-10-02 12:53:01.162850835 +0000 UTC m=+0.072044394 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:53:01 np0005466012 nova_compute[192063]: 2025-10-02 12:53:01.824 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:01 np0005466012 nova_compute[192063]: 2025-10-02 12:53:01.825 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:53:01 np0005466012 nova_compute[192063]: 2025-10-02 12:53:01.825 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:53:01 np0005466012 nova_compute[192063]: 2025-10-02 12:53:01.850 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:53:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:02.170 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:02.170 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:02.170 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:02 np0005466012 nova_compute[192063]: 2025-10-02 12:53:02.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:04 np0005466012 nova_compute[192063]: 2025-10-02 12:53:04.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:05 np0005466012 podman[254836]: 2025-10-02 12:53:05.131430704 +0000 UTC m=+0.052376129 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid)
Oct  2 08:53:05 np0005466012 podman[254837]: 2025-10-02 12:53:05.150025739 +0000 UTC m=+0.058211961 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:53:07 np0005466012 nova_compute[192063]: 2025-10-02 12:53:07.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:07 np0005466012 nova_compute[192063]: 2025-10-02 12:53:07.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:07 np0005466012 nova_compute[192063]: 2025-10-02 12:53:07.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.284 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Acquiring lock "05e968d6-dd2c-4863-88ee-98c84f0714a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.284 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.308 2 DEBUG nova.compute.manager [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.444 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.445 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.450 2 DEBUG nova.virt.hardware [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.451 2 INFO nova.compute.claims [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.607 2 DEBUG nova.compute.provider_tree [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.638 2 DEBUG nova.scheduler.client.report [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.696 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.697 2 DEBUG nova.compute.manager [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.906 2 DEBUG nova.compute.manager [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.907 2 DEBUG nova.network.neutron [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.931 2 INFO nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  2 08:53:08 np0005466012 nova_compute[192063]: 2025-10-02 12:53:08.952 2 DEBUG nova.compute.manager [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.079 2 DEBUG nova.compute.manager [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.081 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.082 2 INFO nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Creating image(s)#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.083 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Acquiring lock "/var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.083 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "/var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.084 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "/var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.109 2 DEBUG oslo_concurrency.processutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.207 2 DEBUG oslo_concurrency.processutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.208 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Acquiring lock "068b233e8d7f49e215e2900dde7d25b776cad955" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.209 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.226 2 DEBUG oslo_concurrency.processutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.281 2 DEBUG oslo_concurrency.processutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.282 2 DEBUG oslo_concurrency.processutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.318 2 DEBUG oslo_concurrency.processutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955,backing_fmt=raw /var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.319 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "068b233e8d7f49e215e2900dde7d25b776cad955" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.319 2 DEBUG oslo_concurrency.processutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.376 2 DEBUG oslo_concurrency.processutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/068b233e8d7f49e215e2900dde7d25b776cad955 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.377 2 DEBUG nova.virt.disk.api [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Checking if we can resize image /var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.378 2 DEBUG oslo_concurrency.processutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.433 2 DEBUG oslo_concurrency.processutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.434 2 DEBUG nova.virt.disk.api [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Cannot resize image /var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.435 2 DEBUG nova.objects.instance [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lazy-loading 'migration_context' on Instance uuid 05e968d6-dd2c-4863-88ee-98c84f0714a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.456 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.457 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Ensure instance console log exists: /var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.457 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.458 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.458 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:09 np0005466012 nova_compute[192063]: 2025-10-02 12:53:09.834 2 DEBUG nova.policy [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  2 08:53:11 np0005466012 nova_compute[192063]: 2025-10-02 12:53:11.289 2 DEBUG nova.network.neutron [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Successfully created port: 4e9e40d8-e1ed-437a-b254-ed4613949c85 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  2 08:53:11 np0005466012 nova_compute[192063]: 2025-10-02 12:53:11.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:11 np0005466012 nova_compute[192063]: 2025-10-02 12:53:11.922 2 DEBUG nova.network.neutron [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Successfully updated port: 4e9e40d8-e1ed-437a-b254-ed4613949c85 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  2 08:53:11 np0005466012 nova_compute[192063]: 2025-10-02 12:53:11.939 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Acquiring lock "refresh_cache-05e968d6-dd2c-4863-88ee-98c84f0714a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:53:11 np0005466012 nova_compute[192063]: 2025-10-02 12:53:11.939 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Acquired lock "refresh_cache-05e968d6-dd2c-4863-88ee-98c84f0714a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:53:11 np0005466012 nova_compute[192063]: 2025-10-02 12:53:11.939 2 DEBUG nova.network.neutron [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  2 08:53:12 np0005466012 nova_compute[192063]: 2025-10-02 12:53:12.033 2 DEBUG nova.compute.manager [req-fa33dc50-e242-4275-8a03-62470a6e2997 req-44ab5ec6-207d-4947-b291-f2389054c857 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received event network-changed-4e9e40d8-e1ed-437a-b254-ed4613949c85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:53:12 np0005466012 nova_compute[192063]: 2025-10-02 12:53:12.036 2 DEBUG nova.compute.manager [req-fa33dc50-e242-4275-8a03-62470a6e2997 req-44ab5ec6-207d-4947-b291-f2389054c857 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Refreshing instance network info cache due to event network-changed-4e9e40d8-e1ed-437a-b254-ed4613949c85. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:53:12 np0005466012 nova_compute[192063]: 2025-10-02 12:53:12.036 2 DEBUG oslo_concurrency.lockutils [req-fa33dc50-e242-4275-8a03-62470a6e2997 req-44ab5ec6-207d-4947-b291-f2389054c857 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-05e968d6-dd2c-4863-88ee-98c84f0714a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:53:12 np0005466012 nova_compute[192063]: 2025-10-02 12:53:12.064 2 DEBUG nova.network.neutron [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  2 08:53:12 np0005466012 nova_compute[192063]: 2025-10-02 12:53:12.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.858 2 DEBUG nova.network.neutron [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Updating instance_info_cache with network_info: [{"id": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "address": "fa:16:3e:e1:89:0d", "network": {"id": "453b9882-4bf3-448c-bd9c-f6befe8aa4a6", "bridge": "br-int", "label": "tempest-TestServerBasicOps-2006130394-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449520c8b1da43f1b24a84c08baac03b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e9e40d8-e1", "ovs_interfaceid": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.900 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Releasing lock "refresh_cache-05e968d6-dd2c-4863-88ee-98c84f0714a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.901 2 DEBUG nova.compute.manager [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Instance network_info: |[{"id": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "address": "fa:16:3e:e1:89:0d", "network": {"id": "453b9882-4bf3-448c-bd9c-f6befe8aa4a6", "bridge": "br-int", "label": "tempest-TestServerBasicOps-2006130394-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449520c8b1da43f1b24a84c08baac03b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e9e40d8-e1", "ovs_interfaceid": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.902 2 DEBUG oslo_concurrency.lockutils [req-fa33dc50-e242-4275-8a03-62470a6e2997 req-44ab5ec6-207d-4947-b291-f2389054c857 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-05e968d6-dd2c-4863-88ee-98c84f0714a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.902 2 DEBUG nova.network.neutron [req-fa33dc50-e242-4275-8a03-62470a6e2997 req-44ab5ec6-207d-4947-b291-f2389054c857 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Refreshing network info cache for port 4e9e40d8-e1ed-437a-b254-ed4613949c85 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.908 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Start _get_guest_xml network_info=[{"id": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "address": "fa:16:3e:e1:89:0d", "network": {"id": "453b9882-4bf3-448c-bd9c-f6befe8aa4a6", "bridge": "br-int", "label": "tempest-TestServerBasicOps-2006130394-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449520c8b1da43f1b24a84c08baac03b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e9e40d8-e1", "ovs_interfaceid": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.914 2 WARNING nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.919 2 DEBUG nova.virt.libvirt.host [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.920 2 DEBUG nova.virt.libvirt.host [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.923 2 DEBUG nova.virt.libvirt.host [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.924 2 DEBUG nova.virt.libvirt.host [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.926 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.927 2 DEBUG nova.virt.hardware [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T11:59:24Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9ac83da7-f31e-4467-8569-d28002f6aeed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T11:59:26Z,direct_url=<?>,disk_format='qcow2',id=cf60d86d-f1d5-4be4-976e-7488dbdcf0b2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c543175414e2485bb476e4dfce01c394',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T11:59:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.927 2 DEBUG nova.virt.hardware [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.928 2 DEBUG nova.virt.hardware [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.928 2 DEBUG nova.virt.hardware [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.929 2 DEBUG nova.virt.hardware [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.929 2 DEBUG nova.virt.hardware [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.930 2 DEBUG nova.virt.hardware [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.930 2 DEBUG nova.virt.hardware [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.931 2 DEBUG nova.virt.hardware [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.931 2 DEBUG nova.virt.hardware [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.931 2 DEBUG nova.virt.hardware [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.938 2 DEBUG nova.virt.libvirt.vif [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:53:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1534615901',display_name='tempest-TestServerBasicOps-server-1534615901',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1534615901',id=188,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCaD7GQniYdoYwp35Nm6IfKAXjpBPBSCums2KeZ3fzXUrXjawEwxG/MjwuGDcNwXUee8EuobdYpMCam8FfxtemP8uY+pHbYNGzGD4zZl5C5madLQvYGt4xkcnsH6GF70dA==',key_name='tempest-TestServerBasicOps-349026518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='449520c8b1da43f1b24a84c08baac03b',ramdisk_id='',reservation_id='r-e8sb2g07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1990510451',owner_user_name='tempest-TestServerBasicOps-1990510451-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:53:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='25035a2aea694d7abea1da6e2dc97fd9',uuid=05e968d6-dd2c-4863-88ee-98c84f0714a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "address": "fa:16:3e:e1:89:0d", "network": {"id": "453b9882-4bf3-448c-bd9c-f6befe8aa4a6", "bridge": "br-int", "label": "tempest-TestServerBasicOps-2006130394-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449520c8b1da43f1b24a84c08baac03b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e9e40d8-e1", "ovs_interfaceid": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.939 2 DEBUG nova.network.os_vif_util [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Converting VIF {"id": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "address": "fa:16:3e:e1:89:0d", "network": {"id": "453b9882-4bf3-448c-bd9c-f6befe8aa4a6", "bridge": "br-int", "label": "tempest-TestServerBasicOps-2006130394-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449520c8b1da43f1b24a84c08baac03b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e9e40d8-e1", "ovs_interfaceid": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.940 2 DEBUG nova.network.os_vif_util [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:89:0d,bridge_name='br-int',has_traffic_filtering=True,id=4e9e40d8-e1ed-437a-b254-ed4613949c85,network=Network(453b9882-4bf3-448c-bd9c-f6befe8aa4a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e9e40d8-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.942 2 DEBUG nova.objects.instance [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lazy-loading 'pci_devices' on Instance uuid 05e968d6-dd2c-4863-88ee-98c84f0714a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.964 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] End _get_guest_xml xml=<domain type="kvm">
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  <uuid>05e968d6-dd2c-4863-88ee-98c84f0714a1</uuid>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  <name>instance-000000bc</name>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  <memory>131072</memory>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  <vcpu>1</vcpu>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  <metadata>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <nova:name>tempest-TestServerBasicOps-server-1534615901</nova:name>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <nova:creationTime>2025-10-02 12:53:13</nova:creationTime>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <nova:flavor name="m1.nano">
Oct  2 08:53:13 np0005466012 nova_compute[192063]:        <nova:memory>128</nova:memory>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:        <nova:disk>1</nova:disk>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:        <nova:swap>0</nova:swap>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:        <nova:ephemeral>0</nova:ephemeral>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:        <nova:vcpus>1</nova:vcpus>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      </nova:flavor>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <nova:owner>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:        <nova:user uuid="25035a2aea694d7abea1da6e2dc97fd9">tempest-TestServerBasicOps-1990510451-project-member</nova:user>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:        <nova:project uuid="449520c8b1da43f1b24a84c08baac03b">tempest-TestServerBasicOps-1990510451</nova:project>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      </nova:owner>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <nova:root type="image" uuid="cf60d86d-f1d5-4be4-976e-7488dbdcf0b2"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <nova:ports>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:        <nova:port uuid="4e9e40d8-e1ed-437a-b254-ed4613949c85">
Oct  2 08:53:13 np0005466012 nova_compute[192063]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:        </nova:port>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      </nova:ports>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    </nova:instance>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  </metadata>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  <sysinfo type="smbios">
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <system>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <entry name="manufacturer">RDO</entry>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <entry name="product">OpenStack Compute</entry>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <entry name="serial">05e968d6-dd2c-4863-88ee-98c84f0714a1</entry>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <entry name="uuid">05e968d6-dd2c-4863-88ee-98c84f0714a1</entry>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <entry name="family">Virtual Machine</entry>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    </system>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  </sysinfo>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  <os>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <boot dev="hd"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <smbios mode="sysinfo"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  </os>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  <features>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <acpi/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <apic/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <vmcoreinfo/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  </features>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  <clock offset="utc">
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <timer name="pit" tickpolicy="delay"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <timer name="hpet" present="no"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  </clock>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  <cpu mode="custom" match="exact">
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <model>Nehalem</model>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <topology sockets="1" cores="1" threads="1"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  </cpu>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  <devices>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <disk type="file" device="disk">
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <target dev="vda" bus="virtio"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <disk type="file" device="cdrom">
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <driver name="qemu" type="raw" cache="none"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <source file="/var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.config"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <target dev="sda" bus="sata"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    </disk>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <interface type="ethernet">
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <mac address="fa:16:3e:e1:89:0d"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <driver name="vhost" rx_queue_size="512"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <mtu size="1442"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <target dev="tap4e9e40d8-e1"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    </interface>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <serial type="pty">
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <log file="/var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/console.log" append="off"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    </serial>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <video>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <model type="virtio"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    </video>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <input type="tablet" bus="usb"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <rng model="virtio">
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <backend model="random">/dev/urandom</backend>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    </rng>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="pci" model="pcie-root-port"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <controller type="usb" index="0"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    <memballoon model="virtio">
Oct  2 08:53:13 np0005466012 nova_compute[192063]:      <stats period="10"/>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:    </memballoon>
Oct  2 08:53:13 np0005466012 nova_compute[192063]:  </devices>
Oct  2 08:53:13 np0005466012 nova_compute[192063]: </domain>
Oct  2 08:53:13 np0005466012 nova_compute[192063]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.965 2 DEBUG nova.compute.manager [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Preparing to wait for external event network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.965 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Acquiring lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.966 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.966 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.966 2 DEBUG nova.virt.libvirt.vif [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:53:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1534615901',display_name='tempest-TestServerBasicOps-server-1534615901',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1534615901',id=188,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCaD7GQniYdoYwp35Nm6IfKAXjpBPBSCums2KeZ3fzXUrXjawEwxG/MjwuGDcNwXUee8EuobdYpMCam8FfxtemP8uY+pHbYNGzGD4zZl5C5madLQvYGt4xkcnsH6GF70dA==',key_name='tempest-TestServerBasicOps-349026518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='449520c8b1da43f1b24a84c08baac03b',ramdisk_id='',reservation_id='r-e8sb2g07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1990510451',owner_user_name='tempest-TestServerBasicOps-1990510451-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T12:53:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='25035a2aea694d7abea1da6e2dc97fd9',uuid=05e968d6-dd2c-4863-88ee-98c84f0714a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "address": "fa:16:3e:e1:89:0d", "network": {"id": "453b9882-4bf3-448c-bd9c-f6befe8aa4a6", "bridge": "br-int", "label": "tempest-TestServerBasicOps-2006130394-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449520c8b1da43f1b24a84c08baac03b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e9e40d8-e1", "ovs_interfaceid": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.967 2 DEBUG nova.network.os_vif_util [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Converting VIF {"id": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "address": "fa:16:3e:e1:89:0d", "network": {"id": "453b9882-4bf3-448c-bd9c-f6befe8aa4a6", "bridge": "br-int", "label": "tempest-TestServerBasicOps-2006130394-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449520c8b1da43f1b24a84c08baac03b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e9e40d8-e1", "ovs_interfaceid": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.967 2 DEBUG nova.network.os_vif_util [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:89:0d,bridge_name='br-int',has_traffic_filtering=True,id=4e9e40d8-e1ed-437a-b254-ed4613949c85,network=Network(453b9882-4bf3-448c-bd9c-f6befe8aa4a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e9e40d8-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.968 2 DEBUG os_vif [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:89:0d,bridge_name='br-int',has_traffic_filtering=True,id=4e9e40d8-e1ed-437a-b254-ed4613949c85,network=Network(453b9882-4bf3-448c-bd9c-f6befe8aa4a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e9e40d8-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.968 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.969 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.971 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4e9e40d8-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.971 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4e9e40d8-e1, col_values=(('external_ids', {'iface-id': '4e9e40d8-e1ed-437a-b254-ed4613949c85', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e1:89:0d', 'vm-uuid': '05e968d6-dd2c-4863-88ee-98c84f0714a1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:13 np0005466012 NetworkManager[51207]: <info>  [1759409593.9748] manager: (tap4e9e40d8-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/355)
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:13 np0005466012 nova_compute[192063]: 2025-10-02 12:53:13.982 2 INFO os_vif [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:89:0d,bridge_name='br-int',has_traffic_filtering=True,id=4e9e40d8-e1ed-437a-b254-ed4613949c85,network=Network(453b9882-4bf3-448c-bd9c-f6befe8aa4a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e9e40d8-e1')#033[00m
Oct  2 08:53:14 np0005466012 nova_compute[192063]: 2025-10-02 12:53:14.030 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:53:14 np0005466012 nova_compute[192063]: 2025-10-02 12:53:14.031 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  2 08:53:14 np0005466012 nova_compute[192063]: 2025-10-02 12:53:14.031 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] No VIF found with MAC fa:16:3e:e1:89:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  2 08:53:14 np0005466012 nova_compute[192063]: 2025-10-02 12:53:14.031 2 INFO nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Using config drive#033[00m
Oct  2 08:53:14 np0005466012 nova_compute[192063]: 2025-10-02 12:53:14.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:14 np0005466012 nova_compute[192063]: 2025-10-02 12:53:14.696 2 INFO nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Creating config drive at /var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.config#033[00m
Oct  2 08:53:14 np0005466012 nova_compute[192063]: 2025-10-02 12:53:14.702 2 DEBUG oslo_concurrency.processutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpczu2ojui execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:53:14 np0005466012 nova_compute[192063]: 2025-10-02 12:53:14.826 2 DEBUG oslo_concurrency.processutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpczu2ojui" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:53:14 np0005466012 kernel: tap4e9e40d8-e1: entered promiscuous mode
Oct  2 08:53:14 np0005466012 NetworkManager[51207]: <info>  [1759409594.8960] manager: (tap4e9e40d8-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/356)
Oct  2 08:53:14 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:14Z|00755|binding|INFO|Claiming lport 4e9e40d8-e1ed-437a-b254-ed4613949c85 for this chassis.
Oct  2 08:53:14 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:14Z|00756|binding|INFO|4e9e40d8-e1ed-437a-b254-ed4613949c85: Claiming fa:16:3e:e1:89:0d 10.100.0.3
Oct  2 08:53:14 np0005466012 nova_compute[192063]: 2025-10-02 12:53:14.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:14.915 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:89:0d 10.100.0.3'], port_security=['fa:16:3e:e1:89:0d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-453b9882-4bf3-448c-bd9c-f6befe8aa4a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '449520c8b1da43f1b24a84c08baac03b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '723f044d-15a6-4f16-bc64-3542e5b28423 ea59696b-608e-4561-9861-f0c3334aea82', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9dcf2980-98fd-43bf-91c9-1637efee9e0a, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4e9e40d8-e1ed-437a-b254-ed4613949c85) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:53:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:14.916 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4e9e40d8-e1ed-437a-b254-ed4613949c85 in datapath 453b9882-4bf3-448c-bd9c-f6befe8aa4a6 bound to our chassis#033[00m
Oct  2 08:53:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:14.918 103246 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 453b9882-4bf3-448c-bd9c-f6befe8aa4a6#033[00m
Oct  2 08:53:14 np0005466012 systemd-machined[152114]: New machine qemu-83-instance-000000bc.
Oct  2 08:53:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:14.938 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[08d622bd-4512-455f-91cf-bfd179e779d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:14.939 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap453b9882-41 in ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  2 08:53:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:14.943 219792 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap453b9882-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  2 08:53:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:14.944 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[75046f99-a6fd-4944-a826-1c1a27009557]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:14.944 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[ebe63c83-90ce-491e-a4d9-0659d9817e39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:14 np0005466012 systemd-udevd[254916]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:53:14 np0005466012 nova_compute[192063]: 2025-10-02 12:53:14.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:14 np0005466012 systemd[1]: Started Virtual Machine qemu-83-instance-000000bc.
Oct  2 08:53:14 np0005466012 NetworkManager[51207]: <info>  [1759409594.9623] device (tap4e9e40d8-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  2 08:53:14 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:14Z|00757|binding|INFO|Setting lport 4e9e40d8-e1ed-437a-b254-ed4613949c85 ovn-installed in OVS
Oct  2 08:53:14 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:14Z|00758|binding|INFO|Setting lport 4e9e40d8-e1ed-437a-b254-ed4613949c85 up in Southbound
Oct  2 08:53:14 np0005466012 NetworkManager[51207]: <info>  [1759409594.9635] device (tap4e9e40d8-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  2 08:53:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:14.963 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[89e1f7c4-f550-453f-ac5d-2af26911506f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:14 np0005466012 nova_compute[192063]: 2025-10-02 12:53:14.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:14 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:14.980 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[0bf7ff2b-c90c-4f10-b501-a4a3417142b1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.022 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[e4860c4d-93b3-4a1a-a858-cdb0215b2ded]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:15 np0005466012 NetworkManager[51207]: <info>  [1759409595.0286] manager: (tap453b9882-40): new Veth device (/org/freedesktop/NetworkManager/Devices/357)
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.027 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[9ed73dcd-6a0e-4290-952a-8a8a80109da6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:15 np0005466012 systemd-udevd[254919]: Network interface NamePolicy= disabled on kernel command line.
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.062 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[a84c840b-d183-4ca1-ad3c-bb2fd849ad42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.067 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[33bac567-268a-41c9-9840-5a16211ff9d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:15 np0005466012 NetworkManager[51207]: <info>  [1759409595.0927] device (tap453b9882-40): carrier: link connected
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.098 219806 DEBUG oslo.privsep.daemon [-] privsep: reply[3bc6168d-011c-4ff9-aa16-53fcefc7ee52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.117 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[7781e729-2df9-4a31-acb0-3de2dfd9f038]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap453b9882-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c1:33:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 758870, 'reachable_time': 41102, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254948, 'error': None, 'target': 'ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.136 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[463ebf45-818d-4e06-b606-a618687ed683]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec1:33da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 758870, 'tstamp': 758870}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254949, 'error': None, 'target': 'ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.155 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[63827ce3-5eb2-4dc1-bc92-0e1434f750ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap453b9882-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c1:33:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 758870, 'reachable_time': 41102, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 254950, 'error': None, 'target': 'ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.188 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[4dfe4f08-d8ea-4b92-b284-40f5ea878560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.250 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[22f76b40-6d18-4a26-9f07-72cfe289019a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.251 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap453b9882-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.252 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.252 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap453b9882-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:53:15 np0005466012 NetworkManager[51207]: <info>  [1759409595.2555] manager: (tap453b9882-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/358)
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:15 np0005466012 kernel: tap453b9882-40: entered promiscuous mode
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.261 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap453b9882-40, col_values=(('external_ids', {'iface-id': 'e01c977b-4b97-4519-9e0d-7bee60893894'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:53:15 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:15Z|00759|binding|INFO|Releasing lport e01c977b-4b97-4519-9e0d-7bee60893894 from this chassis (sb_readonly=0)
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.276 103246 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/453b9882-4bf3-448c-bd9c-f6befe8aa4a6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/453b9882-4bf3-448c-bd9c-f6befe8aa4a6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.277 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c79b66-ab85-4d6a-a1b6-2c02d1518310]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.278 103246 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: global
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    log         /dev/log local0 debug
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    log-tag     haproxy-metadata-proxy-453b9882-4bf3-448c-bd9c-f6befe8aa4a6
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    user        root
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    group       root
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    maxconn     1024
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    pidfile     /var/lib/neutron/external/pids/453b9882-4bf3-448c-bd9c-f6befe8aa4a6.pid.haproxy
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    daemon
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: defaults
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    log global
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    mode http
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    option httplog
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    option dontlognull
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    option http-server-close
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    option forwardfor
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    retries                 3
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    timeout http-request    30s
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    timeout connect         30s
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    timeout client          32s
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    timeout server          32s
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    timeout http-keep-alive 30s
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: listen listener
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    bind 169.254.169.254:80
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    server metadata /var/lib/neutron/metadata_proxy
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]:    http-request add-header X-OVN-Network-ID 453b9882-4bf3-448c-bd9c-f6befe8aa4a6
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  2 08:53:15 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:15.280 103246 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6', 'env', 'PROCESS_TAG=haproxy-453b9882-4bf3-448c-bd9c-f6befe8aa4a6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/453b9882-4bf3-448c-bd9c-f6befe8aa4a6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.281 2 DEBUG nova.compute.manager [req-2106b0c2-1291-4bba-8b2c-ad33accc997f req-5f417b83-9430-4ff8-91ed-8dc511e3e15b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received event network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.282 2 DEBUG oslo_concurrency.lockutils [req-2106b0c2-1291-4bba-8b2c-ad33accc997f req-5f417b83-9430-4ff8-91ed-8dc511e3e15b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.282 2 DEBUG oslo_concurrency.lockutils [req-2106b0c2-1291-4bba-8b2c-ad33accc997f req-5f417b83-9430-4ff8-91ed-8dc511e3e15b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.282 2 DEBUG oslo_concurrency.lockutils [req-2106b0c2-1291-4bba-8b2c-ad33accc997f req-5f417b83-9430-4ff8-91ed-8dc511e3e15b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.282 2 DEBUG nova.compute.manager [req-2106b0c2-1291-4bba-8b2c-ad33accc997f req-5f417b83-9430-4ff8-91ed-8dc511e3e15b 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Processing event network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  2 08:53:15 np0005466012 podman[254987]: 2025-10-02 12:53:15.662289924 +0000 UTC m=+0.075121049 container create a80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 08:53:15 np0005466012 systemd[1]: Started libpod-conmon-a80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8.scope.
Oct  2 08:53:15 np0005466012 podman[254987]: 2025-10-02 12:53:15.610333917 +0000 UTC m=+0.023165082 image pull df4949fbbe269ec91c503c0c2a01f0407aa671cfac804c078bc791d1efed5574 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  2 08:53:15 np0005466012 systemd[1]: Started libcrun container.
Oct  2 08:53:15 np0005466012 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c70a8ee4355f0a0807f2d8c4e9e98bb07ed538b6de0662290ece3d2b7966db9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  2 08:53:15 np0005466012 podman[254987]: 2025-10-02 12:53:15.733416891 +0000 UTC m=+0.146248036 container init a80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:53:15 np0005466012 podman[254987]: 2025-10-02 12:53:15.738565593 +0000 UTC m=+0.151396718 container start a80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:53:15 np0005466012 neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6[255003]: [NOTICE]   (255007) : New worker (255009) forked
Oct  2 08:53:15 np0005466012 neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6[255003]: [NOTICE]   (255007) : Loading success.
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.807 2 DEBUG nova.network.neutron [req-fa33dc50-e242-4275-8a03-62470a6e2997 req-44ab5ec6-207d-4947-b291-f2389054c857 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Updated VIF entry in instance network info cache for port 4e9e40d8-e1ed-437a-b254-ed4613949c85. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.808 2 DEBUG nova.network.neutron [req-fa33dc50-e242-4275-8a03-62470a6e2997 req-44ab5ec6-207d-4947-b291-f2389054c857 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Updating instance_info_cache with network_info: [{"id": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "address": "fa:16:3e:e1:89:0d", "network": {"id": "453b9882-4bf3-448c-bd9c-f6befe8aa4a6", "bridge": "br-int", "label": "tempest-TestServerBasicOps-2006130394-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449520c8b1da43f1b24a84c08baac03b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e9e40d8-e1", "ovs_interfaceid": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.838 2 DEBUG oslo_concurrency.lockutils [req-fa33dc50-e242-4275-8a03-62470a6e2997 req-44ab5ec6-207d-4947-b291-f2389054c857 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-05e968d6-dd2c-4863-88ee-98c84f0714a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.869 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409595.8692336, 05e968d6-dd2c-4863-88ee-98c84f0714a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.869 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] VM Started (Lifecycle Event)#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.871 2 DEBUG nova.compute.manager [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.874 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.877 2 INFO nova.virt.libvirt.driver [-] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Instance spawned successfully.#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.877 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.894 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.899 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.902 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.902 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.902 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.903 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.903 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.903 2 DEBUG nova.virt.libvirt.driver [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.936 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.936 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409595.8693147, 05e968d6-dd2c-4863-88ee-98c84f0714a1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.936 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] VM Paused (Lifecycle Event)#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.968 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.971 2 DEBUG nova.virt.driver [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] Emitting event <LifecycleEvent: 1759409595.8736181, 05e968d6-dd2c-4863-88ee-98c84f0714a1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.971 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] VM Resumed (Lifecycle Event)#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.996 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.999 2 INFO nova.compute.manager [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Took 6.92 seconds to spawn the instance on the hypervisor.#033[00m
Oct  2 08:53:15 np0005466012 nova_compute[192063]: 2025-10-02 12:53:15.999 2 DEBUG nova.compute.manager [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:53:16 np0005466012 nova_compute[192063]: 2025-10-02 12:53:16.001 2 DEBUG nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  2 08:53:16 np0005466012 nova_compute[192063]: 2025-10-02 12:53:16.032 2 INFO nova.compute.manager [None req-05302524-1ef5-4eab-99a6-7f8158f40a49 - - - - - -] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  2 08:53:16 np0005466012 nova_compute[192063]: 2025-10-02 12:53:16.091 2 INFO nova.compute.manager [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Took 7.70 seconds to build instance.#033[00m
Oct  2 08:53:16 np0005466012 nova_compute[192063]: 2025-10-02 12:53:16.109 2 DEBUG oslo_concurrency.lockutils [None req-b4ddc9fb-6745-4762-a1b8-b93bfb794e12 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.825s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.931 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'name': 'tempest-TestServerBasicOps-server-1534615901', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000bc', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '449520c8b1da43f1b24a84c08baac03b', 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'hostId': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.932 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.947 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.948 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fef28c7-9412-4f2b-a090-edaf373a4745', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-vda', 'timestamp': '2025-10-02T12:53:16.932754', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3e8ff06-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.60912505, 'message_signature': 'b31403a1f67fe9470dc9069cb909ab4d65751dff17b9989125ce9ea70f20adf5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': 
'05e968d6-dd2c-4863-88ee-98c84f0714a1-sda', 'timestamp': '2025-10-02T12:53:16.932754', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3e9118a-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.60912505, 'message_signature': '4b5341249601c3bb69884ed257a61b7b13c2ae4bb7fe5a4e2f86e02ac42a9a52'}]}, 'timestamp': '2025-10-02 12:53:16.949089', '_unique_id': '4486212494354df4a316b0ae01906ceb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.950 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.951 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.965 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.965 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 05e968d6-dd2c-4863-88ee-98c84f0714a1: ceilometer.compute.pollsters.NoVolumeException
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.965 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.967 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 05e968d6-dd2c-4863-88ee-98c84f0714a1 / tap4e9e40d8-e1 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.967 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67edbfb4-0d1c-4f06-8eb7-867d37939b7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': 'instance-000000bc-05e968d6-dd2c-4863-88ee-98c84f0714a1-tap4e9e40d8-e1', 'timestamp': '2025-10-02T12:53:16.965735', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'tap4e9e40d8-e1', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:89:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e9e40d8-e1'}, 'message_id': 'c3ebffee-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.642106832, 'message_signature': '3d9f69c7bd065c69bd40ff901f40c2cc692d751c4b5a4324c1b3d7db38cb8454'}]}, 'timestamp': '2025-10-02 12:53:16.968355', '_unique_id': '28bbd4b3621047408f324aed21f58a66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.969 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.970 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.971 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f9dd368-4cd1-4773-85b3-6d8bcc817f51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': 'instance-000000bc-05e968d6-dd2c-4863-88ee-98c84f0714a1-tap4e9e40d8-e1', 'timestamp': '2025-10-02T12:53:16.970987', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'tap4e9e40d8-e1', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:89:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e9e40d8-e1'}, 'message_id': 'c3ec77a8-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.642106832, 'message_signature': 'ebc5514ba31b136f1ab3b94df7feae47ebab40bdbbed8d3b7802c61bee94ba41'}]}, 'timestamp': '2025-10-02 12:53:16.971362', '_unique_id': 'ebabd8301d584bf19d1012fc56062414'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.972 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.973 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.973 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.read.requests volume: 585 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.973 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbcd3c44-6d59-4a3c-9142-4156b4439ffe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 585, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-vda', 'timestamp': '2025-10-02T12:53:16.973403', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3ecd5cc-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.60912505, 'message_signature': '77c092c822a4aeeb48f372bd5b0e55b50f5c7daee4d7c24287422b1a31571169'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-sda', 'timestamp': '2025-10-02T12:53:16.973403', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3ece378-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.60912505, 'message_signature': '35dbdb28df475f19746130149c20b3f2422f0814874c38a115c5e62a9c61cc9e'}]}, 'timestamp': '2025-10-02 12:53:16.974100', '_unique_id': '1bc376d907cc4cc98206795a6464cc63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.974 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.976 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.976 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff0b6f84-5607-42e0-98fd-98571edcd3c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': 'instance-000000bc-05e968d6-dd2c-4863-88ee-98c84f0714a1-tap4e9e40d8-e1', 'timestamp': '2025-10-02T12:53:16.976193', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'tap4e9e40d8-e1', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:89:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e9e40d8-e1'}, 'message_id': 'c3ed42b4-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.642106832, 'message_signature': '3e8d22658177b978b1a86173e7f868de21f6da5ff1cc5534a44d15ac1d08b689'}]}, 'timestamp': '2025-10-02 12:53:16.976553', '_unique_id': '3219e2fcbc094a6c90caf59aa4bb3486'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.977 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.978 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.978 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '772140e4-8927-47b2-9e52-ff34175194bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': 'instance-000000bc-05e968d6-dd2c-4863-88ee-98c84f0714a1-tap4e9e40d8-e1', 'timestamp': '2025-10-02T12:53:16.978444', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'tap4e9e40d8-e1', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:89:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e9e40d8-e1'}, 'message_id': 'c3ed9ab6-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.642106832, 'message_signature': '3f6a2a13d15375a2b9ec330c312815cd96276717a687b6b0d20369d0a877b948'}]}, 'timestamp': '2025-10-02 12:53:16.978828', '_unique_id': '1b310c28d7a744d3bcba7263e50e3574'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.979 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.980 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.980 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.read.latency volume: 290257498 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.981 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.read.latency volume: 3250790 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f3aa0ba-232b-470d-ba44-c308e612d5ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 290257498, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-vda', 'timestamp': '2025-10-02T12:53:16.980797', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3edf6a0-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.60912505, 'message_signature': '33aa08e70cb7a407fd45d3a8b3505a4e672ff6dd0bab7398efb2def04ee1d6b1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3250790, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-sda', 'timestamp': '2025-10-02T12:53:16.980797', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3ee033e-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.60912505, 'message_signature': '38606e842e1563705eeace41d1fe9d904c5c68cae39cb7d9e442c161ace345ae'}]}, 'timestamp': '2025-10-02 12:53:16.981466', '_unique_id': 'dc4c47ba4f0041a58d4c45ad02f6dc2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.982 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.983 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.983 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.983 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c729ceb-07a3-4e4f-8ec4-bab2dca7e0c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-vda', 'timestamp': '2025-10-02T12:53:16.983475', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3ee60d6-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.60912505, 'message_signature': '3996243b0f6315f71d7633b26599f6f27af03004a16a28959619e27304973bdb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-sda', 'timestamp': '2025-10-02T12:53:16.983475', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3ee6d9c-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.60912505, 'message_signature': '284088a7eb20af8389d953847164fac23acdd82ff53df9c53022870dbee2e9a4'}]}, 'timestamp': '2025-10-02 12:53:16.984190', '_unique_id': 'd6908cc054bd48e28b86f899ea32b4ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.984 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.985 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.986 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ba165ca-c5d5-4aa0-9024-04c6ca99f38e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': 'instance-000000bc-05e968d6-dd2c-4863-88ee-98c84f0714a1-tap4e9e40d8-e1', 'timestamp': '2025-10-02T12:53:16.986123', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'tap4e9e40d8-e1', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:89:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e9e40d8-e1'}, 'message_id': 'c3eec6fc-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.642106832, 'message_signature': '9e944da79bccaff17f72dad29fa1fd12b4f9d3738aacd89db8e98baba2e07c96'}]}, 'timestamp': '2025-10-02 12:53:16.986491', '_unique_id': '3bc2129f8b604138aded46d64b4dfe35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.987 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.988 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.988 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '628f503c-9598-41e2-9726-6495ba6ef2f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': 'instance-000000bc-05e968d6-dd2c-4863-88ee-98c84f0714a1-tap4e9e40d8-e1', 'timestamp': '2025-10-02T12:53:16.988423', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'tap4e9e40d8-e1', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:89:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e9e40d8-e1'}, 'message_id': 'c3ef2048-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.642106832, 'message_signature': '15eeb8e99254978442c7b314bd58e267969faf6d797cb9e4abdb45196a0e41bd'}]}, 'timestamp': '2025-10-02 12:53:16.988797', '_unique_id': 'e180edfca06f42a6bff358cd86cd29c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.989 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:16.990 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.001 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.002 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22f6e516-ceb8-4642-919c-0c3f55bff5a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-vda', 'timestamp': '2025-10-02T12:53:16.990664', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3f1264a-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.667076203, 'message_signature': 'c4bacaae5531cbdea63832b232e487e31e684b6d7a0a4f78564946557d135ad1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-sda', 'timestamp': '2025-10-02T12:53:16.990664', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3f13158-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.667076203, 'message_signature': 'b063b01f6ea6f515b4b60a45b660eb1c37e000c298db7ba72a7c6cf936e5528c'}]}, 'timestamp': '2025-10-02 12:53:17.002277', '_unique_id': '68bb4ae2821d4b01acd0dd78866d18f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.003 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.004 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.004 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59340d46-585e-4113-83fc-b12745950e32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': 'instance-000000bc-05e968d6-dd2c-4863-88ee-98c84f0714a1-tap4e9e40d8-e1', 'timestamp': '2025-10-02T12:53:17.004285', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'tap4e9e40d8-e1', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:89:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e9e40d8-e1'}, 'message_id': 'c3f18bee-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.642106832, 'message_signature': 'bdf893ef6d9b24bf020828549374eaac72842f59dd882bd4cfcdb403ad190ecd'}]}, 'timestamp': '2025-10-02 12:53:17.004605', '_unique_id': '548b74f5478e46dd911a214d8e641c45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.005 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.006 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.006 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.006 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a42bb520-a1bf-48a0-9c5b-112f0df4fcfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-vda', 'timestamp': '2025-10-02T12:53:17.006242', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3f1d676-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.667076203, 'message_signature': '16aa956f36ffd9c026ac462883903f56bb4edbd82380619689b81c5454bbba5a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-sda', 'timestamp': '2025-10-02T12:53:17.006242', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3f1dfd6-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.667076203, 'message_signature': '8b9fa3ef77a54de63047cb12ce52949743d98ffa1e185a0f7d86f2196b37bb6b'}]}, 'timestamp': '2025-10-02 12:53:17.006756', '_unique_id': 'c28245d6c19e4be1a8895f1b7ceec794'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.007 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.008 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.008 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '943baa58-46b0-4da1-90bb-07c382910574', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': 'instance-000000bc-05e968d6-dd2c-4863-88ee-98c84f0714a1-tap4e9e40d8-e1', 'timestamp': '2025-10-02T12:53:17.008268', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'tap4e9e40d8-e1', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:89:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e9e40d8-e1'}, 'message_id': 'c3f225c2-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.642106832, 'message_signature': '584762734d6b3483aa5af7deffa7b8ffe5c6acd04c1dcd34dcc2302be3409d9b'}]}, 'timestamp': '2025-10-02 12:53:17.008569', '_unique_id': 'f5b6b777849e4aa9bbddda1ebbea0241'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.009 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.010 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.010 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1534615901>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1534615901>]
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.010 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.010 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/cpu volume: 1020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aeeed17c-6946-400f-9d90-3bee37e4f832', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1020000000, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'timestamp': '2025-10-02T12:53:17.010580', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'c3f280c6-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.641281519, 'message_signature': '4f0a22ba9c5dea1b960d9e77169471aad8431f62ff141d9fe585f62d86b27653'}]}, 'timestamp': '2025-10-02 12:53:17.010895', '_unique_id': '85187cd884f442a8b04e37acc40f5042'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.011 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.012 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.012 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.012 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f787f7f3-cd6c-4c11-8122-4a57f2a7b4ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-vda', 'timestamp': '2025-10-02T12:53:17.012460', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3f2ca18-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.60912505, 'message_signature': '8f50638e5ced06c64c4ac92eb18a1ae799a04c1286dece26134372da568fd95a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-sda', 'timestamp': '2025-10-02T12:53:17.012460', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3f2d422-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.60912505, 'message_signature': '6c368c5a622eec1630fa2f16801afb7cd4cd90f99a69ef7c972b29d225589cec'}]}, 'timestamp': '2025-10-02 12:53:17.012988', '_unique_id': '51407c7d34604fff9ad9af91487c0568'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.013 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.014 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.014 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.014 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1534615901>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1534615901>]
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.014 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b871166-3572-4ca2-8f39-783ef1d659e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': 'instance-000000bc-05e968d6-dd2c-4863-88ee-98c84f0714a1-tap4e9e40d8-e1', 'timestamp': '2025-10-02T12:53:17.015114', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'tap4e9e40d8-e1', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:89:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e9e40d8-e1'}, 'message_id': 'c3f332b4-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.642106832, 'message_signature': '6d62613c092ecd77a46a001775f612f2e6ed606de1eaaf076122be65cb606add'}]}, 'timestamp': '2025-10-02 12:53:17.015445', '_unique_id': '6f9d691d58174b91a020696bc0018093'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.015 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.017 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.017 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd533520-bef7-424b-9081-bc289f1bab8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-vda', 'timestamp': '2025-10-02T12:53:17.016982', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3f37abc-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.667076203, 'message_signature': '89485d91d2d2819dfa738fb40de1aa46ede1372a676883671ad7dd983c3519e3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-sda', 'timestamp': '2025-10-02T12:53:17.016982', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3f38462-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.667076203, 'message_signature': 'ce79b2b72882893378bf9b92f94b845bccffaacc3cd64cf843e34c20f31d4b6a'}]}, 'timestamp': '2025-10-02 12:53:17.017512', '_unique_id': '60c2098e6f674fe2a7f76aececd7fe56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.018 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.019 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.019 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96473137-6484-40c8-847b-1855f24ff475', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': 'instance-000000bc-05e968d6-dd2c-4863-88ee-98c84f0714a1-tap4e9e40d8-e1', 'timestamp': '2025-10-02T12:53:17.019214', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'tap4e9e40d8-e1', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:89:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e9e40d8-e1'}, 'message_id': 'c3f3d17e-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.642106832, 'message_signature': '59ce2653adfaef7285a7035bd9879f8a750b6f63f0b7d454d1d517adae89c3b0'}]}, 'timestamp': '2025-10-02 12:53:17.019492', '_unique_id': '82c07397db7a4ec8b95b7ca700e2c6a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.020 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.021 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.021 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.read.bytes volume: 18130432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.021 12 DEBUG ceilometer.compute.pollsters [-] 05e968d6-dd2c-4863-88ee-98c84f0714a1/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3371188e-d686-42be-92c1-4cc63e490be2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 18130432, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-vda', 'timestamp': '2025-10-02T12:53:17.021105', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3f41ada-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.60912505, 'message_signature': 'f4eda595e375dcdc3588fd8c280f7737870bad5585337f7faa9b8f54a0d7f299'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '25035a2aea694d7abea1da6e2dc97fd9', 'user_name': None, 'project_id': '449520c8b1da43f1b24a84c08baac03b', 'project_name': None, 'resource_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1-sda', 'timestamp': '2025-10-02T12:53:17.021105', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1534615901', 'name': 'instance-000000bc', 'instance_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'instance_type': 'm1.nano', 'host': '56432c54a1df6c5fbcdd0ddbc1476a81e47f8c92a6a8bb91d3ce76be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9ac83da7-f31e-4467-8569-d28002f6aeed', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2'}, 'image_ref': 'cf60d86d-f1d5-4be4-976e-7488dbdcf0b2', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3f42638-9f8e-11f0-b6ee-fa163e01ba27', 'monotonic_time': 7590.60912505, 'message_signature': '1a05d6db29660fe77a15f1e9286c5ceccf0f7c020fbcb5c3f7d07b5e5bb61af8'}]}, 'timestamp': '2025-10-02 12:53:17.021658', '_unique_id': '2b87a0da00bc42d5a70d923c51f7f098'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     yield
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.022 12 ERROR oslo_messaging.notify.messaging 
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.023 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.023 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.023 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1534615901>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1534615901>]
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.023 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.023 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  2 08:53:17 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:53:17.023 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1534615901>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1534615901>]
Oct  2 08:53:17 np0005466012 NetworkManager[51207]: <info>  [1759409597.3002] manager: (patch-br-int-to-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/359)
Oct  2 08:53:17 np0005466012 NetworkManager[51207]: <info>  [1759409597.3011] manager: (patch-provnet-9e5e5ef3-dd6f-48b8-9d1f-8f15cf85bb3d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/360)
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:17 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:17Z|00760|binding|INFO|Releasing lport e01c977b-4b97-4519-9e0d-7bee60893894 from this chassis (sb_readonly=0)
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.412 2 DEBUG nova.compute.manager [req-6464fca3-3c74-4d0c-8e7b-82f54f11279a req-48b95120-e4e3-4de0-9376-716a76b1faff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received event network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.412 2 DEBUG oslo_concurrency.lockutils [req-6464fca3-3c74-4d0c-8e7b-82f54f11279a req-48b95120-e4e3-4de0-9376-716a76b1faff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.412 2 DEBUG oslo_concurrency.lockutils [req-6464fca3-3c74-4d0c-8e7b-82f54f11279a req-48b95120-e4e3-4de0-9376-716a76b1faff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.413 2 DEBUG oslo_concurrency.lockutils [req-6464fca3-3c74-4d0c-8e7b-82f54f11279a req-48b95120-e4e3-4de0-9376-716a76b1faff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.413 2 DEBUG nova.compute.manager [req-6464fca3-3c74-4d0c-8e7b-82f54f11279a req-48b95120-e4e3-4de0-9376-716a76b1faff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] No waiting events found dispatching network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.413 2 WARNING nova.compute.manager [req-6464fca3-3c74-4d0c-8e7b-82f54f11279a req-48b95120-e4e3-4de0-9376-716a76b1faff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received unexpected event network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 for instance with vm_state active and task_state None.#033[00m
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.596 2 DEBUG nova.compute.manager [req-9fc2acaa-1e6c-4963-94e2-10c7b83a1140 req-1faf5495-30fb-4778-8747-5eaf0a22ed99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received event network-changed-4e9e40d8-e1ed-437a-b254-ed4613949c85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.596 2 DEBUG nova.compute.manager [req-9fc2acaa-1e6c-4963-94e2-10c7b83a1140 req-1faf5495-30fb-4778-8747-5eaf0a22ed99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Refreshing instance network info cache due to event network-changed-4e9e40d8-e1ed-437a-b254-ed4613949c85. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.597 2 DEBUG oslo_concurrency.lockutils [req-9fc2acaa-1e6c-4963-94e2-10c7b83a1140 req-1faf5495-30fb-4778-8747-5eaf0a22ed99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "refresh_cache-05e968d6-dd2c-4863-88ee-98c84f0714a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.597 2 DEBUG oslo_concurrency.lockutils [req-9fc2acaa-1e6c-4963-94e2-10c7b83a1140 req-1faf5495-30fb-4778-8747-5eaf0a22ed99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquired lock "refresh_cache-05e968d6-dd2c-4863-88ee-98c84f0714a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  2 08:53:17 np0005466012 nova_compute[192063]: 2025-10-02 12:53:17.597 2 DEBUG nova.network.neutron [req-9fc2acaa-1e6c-4963-94e2-10c7b83a1140 req-1faf5495-30fb-4778-8747-5eaf0a22ed99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Refreshing network info cache for port 4e9e40d8-e1ed-437a-b254-ed4613949c85 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  2 08:53:18 np0005466012 nova_compute[192063]: 2025-10-02 12:53:18.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:19 np0005466012 podman[255019]: 2025-10-02 12:53:19.145186163 +0000 UTC m=+0.061620205 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251001)
Oct  2 08:53:19 np0005466012 podman[255020]: 2025-10-02 12:53:19.15846561 +0000 UTC m=+0.067530598 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:53:19 np0005466012 podman[255021]: 2025-10-02 12:53:19.158737388 +0000 UTC m=+0.063536109 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:53:19 np0005466012 podman[255027]: 2025-10-02 12:53:19.22610389 +0000 UTC m=+0.125332137 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  2 08:53:19 np0005466012 nova_compute[192063]: 2025-10-02 12:53:19.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:21 np0005466012 nova_compute[192063]: 2025-10-02 12:53:21.811 2 DEBUG nova.network.neutron [req-9fc2acaa-1e6c-4963-94e2-10c7b83a1140 req-1faf5495-30fb-4778-8747-5eaf0a22ed99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Updated VIF entry in instance network info cache for port 4e9e40d8-e1ed-437a-b254-ed4613949c85. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  2 08:53:21 np0005466012 nova_compute[192063]: 2025-10-02 12:53:21.812 2 DEBUG nova.network.neutron [req-9fc2acaa-1e6c-4963-94e2-10c7b83a1140 req-1faf5495-30fb-4778-8747-5eaf0a22ed99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Updating instance_info_cache with network_info: [{"id": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "address": "fa:16:3e:e1:89:0d", "network": {"id": "453b9882-4bf3-448c-bd9c-f6befe8aa4a6", "bridge": "br-int", "label": "tempest-TestServerBasicOps-2006130394-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449520c8b1da43f1b24a84c08baac03b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e9e40d8-e1", "ovs_interfaceid": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:53:21 np0005466012 nova_compute[192063]: 2025-10-02 12:53:21.837 2 DEBUG oslo_concurrency.lockutils [req-9fc2acaa-1e6c-4963-94e2-10c7b83a1140 req-1faf5495-30fb-4778-8747-5eaf0a22ed99 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Releasing lock "refresh_cache-05e968d6-dd2c-4863-88ee-98c84f0714a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  2 08:53:23 np0005466012 nova_compute[192063]: 2025-10-02 12:53:23.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:24 np0005466012 nova_compute[192063]: 2025-10-02 12:53:24.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:28 np0005466012 nova_compute[192063]: 2025-10-02 12:53:28.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:29 np0005466012 nova_compute[192063]: 2025-10-02 12:53:29.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:29Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e1:89:0d 10.100.0.3
Oct  2 08:53:29 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:29Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e1:89:0d 10.100.0.3
Oct  2 08:53:32 np0005466012 podman[255125]: 2025-10-02 12:53:32.15655792 +0000 UTC m=+0.058700975 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41)
Oct  2 08:53:32 np0005466012 podman[255124]: 2025-10-02 12:53:32.158580915 +0000 UTC m=+0.066404997 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:53:33 np0005466012 nova_compute[192063]: 2025-10-02 12:53:33.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:34 np0005466012 nova_compute[192063]: 2025-10-02 12:53:34.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:36 np0005466012 podman[255165]: 2025-10-02 12:53:36.144421503 +0000 UTC m=+0.052232326 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:53:36 np0005466012 podman[255164]: 2025-10-02 12:53:36.179487103 +0000 UTC m=+0.088916580 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:53:38 np0005466012 nova_compute[192063]: 2025-10-02 12:53:38.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:39 np0005466012 nova_compute[192063]: 2025-10-02 12:53:39.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:40 np0005466012 nova_compute[192063]: 2025-10-02 12:53:40.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:43 np0005466012 nova_compute[192063]: 2025-10-02 12:53:43.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:44 np0005466012 nova_compute[192063]: 2025-10-02 12:53:44.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:47 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:47Z|00761|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  2 08:53:48 np0005466012 nova_compute[192063]: 2025-10-02 12:53:48.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:49 np0005466012 nova_compute[192063]: 2025-10-02 12:53:49.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:50 np0005466012 podman[255209]: 2025-10-02 12:53:50.147456124 +0000 UTC m=+0.053632194 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:53:50 np0005466012 podman[255207]: 2025-10-02 12:53:50.154228671 +0000 UTC m=+0.069208635 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Oct  2 08:53:50 np0005466012 podman[255215]: 2025-10-02 12:53:50.178916634 +0000 UTC m=+0.079692226 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 08:53:50 np0005466012 podman[255208]: 2025-10-02 12:53:50.185442774 +0000 UTC m=+0.085193757 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:53:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:50.603 103354 DEBUG eventlet.wsgi.server [-] (103354) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Oct  2 08:53:50 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:50.605 103354 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0#015
Oct  2 08:53:50 np0005466012 ovn_metadata_agent[103241]: Accept: */*#015
Oct  2 08:53:50 np0005466012 ovn_metadata_agent[103241]: Connection: close#015
Oct  2 08:53:50 np0005466012 ovn_metadata_agent[103241]: Content-Type: text/plain#015
Oct  2 08:53:50 np0005466012 ovn_metadata_agent[103241]: Host: 169.254.169.254#015
Oct  2 08:53:50 np0005466012 ovn_metadata_agent[103241]: User-Agent: curl/7.84.0#015
Oct  2 08:53:50 np0005466012 ovn_metadata_agent[103241]: X-Forwarded-For: 10.100.0.3#015
Oct  2 08:53:50 np0005466012 ovn_metadata_agent[103241]: X-Ovn-Network-Id: 453b9882-4bf3-448c-bd9c-f6befe8aa4a6 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Oct  2 08:53:51 np0005466012 nova_compute[192063]: 2025-10-02 12:53:51.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:52.241 103354 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Oct  2 08:53:52 np0005466012 haproxy-metadata-proxy-453b9882-4bf3-448c-bd9c-f6befe8aa4a6[255009]: 10.100.0.3:53076 [02/Oct/2025:12:53:50.602] listener listener/metadata 0/0/0/1640/1640 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:52.242 103354 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 1.6379495#033[00m
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:52.326 103354 DEBUG eventlet.wsgi.server [-] (103354) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:52.327 103354 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0#015
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: Accept: */*#015
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: Connection: close#015
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: Content-Length: 100#015
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: Content-Type: application/x-www-form-urlencoded#015
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: Host: 169.254.169.254#015
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: User-Agent: curl/7.84.0#015
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: X-Forwarded-For: 10.100.0.3#015
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: X-Ovn-Network-Id: 453b9882-4bf3-448c-bd9c-f6befe8aa4a6#015
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: #015
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Oct  2 08:53:52 np0005466012 haproxy-metadata-proxy-453b9882-4bf3-448c-bd9c-f6befe8aa4a6[255009]: 10.100.0.3:53088 [02/Oct/2025:12:53:52.325] listener listener/metadata 0/0/0/488/488 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:52.814 103354 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Oct  2 08:53:52 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:52.814 103354 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.4874306#033[00m
Oct  2 08:53:53 np0005466012 nova_compute[192063]: 2025-10-02 12:53:53.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:53 np0005466012 nova_compute[192063]: 2025-10-02 12:53:53.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:54 np0005466012 nova_compute[192063]: 2025-10-02 12:53:54.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:54 np0005466012 nova_compute[192063]: 2025-10-02 12:53:54.856 2 DEBUG oslo_concurrency.lockutils [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Acquiring lock "05e968d6-dd2c-4863-88ee-98c84f0714a1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:54 np0005466012 nova_compute[192063]: 2025-10-02 12:53:54.857 2 DEBUG oslo_concurrency.lockutils [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:54 np0005466012 nova_compute[192063]: 2025-10-02 12:53:54.857 2 DEBUG oslo_concurrency.lockutils [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Acquiring lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:54 np0005466012 nova_compute[192063]: 2025-10-02 12:53:54.857 2 DEBUG oslo_concurrency.lockutils [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:54 np0005466012 nova_compute[192063]: 2025-10-02 12:53:54.857 2 DEBUG oslo_concurrency.lockutils [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:54 np0005466012 nova_compute[192063]: 2025-10-02 12:53:54.872 2 INFO nova.compute.manager [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Terminating instance#033[00m
Oct  2 08:53:54 np0005466012 nova_compute[192063]: 2025-10-02 12:53:54.883 2 DEBUG nova.compute.manager [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  2 08:53:54 np0005466012 kernel: tap4e9e40d8-e1 (unregistering): left promiscuous mode
Oct  2 08:53:54 np0005466012 NetworkManager[51207]: <info>  [1759409634.9070] device (tap4e9e40d8-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  2 08:53:54 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:54Z|00762|binding|INFO|Releasing lport 4e9e40d8-e1ed-437a-b254-ed4613949c85 from this chassis (sb_readonly=0)
Oct  2 08:53:54 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:54Z|00763|binding|INFO|Setting lport 4e9e40d8-e1ed-437a-b254-ed4613949c85 down in Southbound
Oct  2 08:53:54 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:54Z|00764|binding|INFO|Removing iface tap4e9e40d8-e1 ovn-installed in OVS
Oct  2 08:53:54 np0005466012 nova_compute[192063]: 2025-10-02 12:53:54.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:54 np0005466012 nova_compute[192063]: 2025-10-02 12:53:54.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:54 np0005466012 nova_compute[192063]: 2025-10-02 12:53:54.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:54 np0005466012 nova_compute[192063]: 2025-10-02 12:53:54.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:54.965 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:89:0d 10.100.0.3'], port_security=['fa:16:3e:e1:89:0d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-453b9882-4bf3-448c-bd9c-f6befe8aa4a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '449520c8b1da43f1b24a84c08baac03b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '723f044d-15a6-4f16-bc64-3542e5b28423 ea59696b-608e-4561-9861-f0c3334aea82', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9dcf2980-98fd-43bf-91c9-1637efee9e0a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4e9e40d8-e1ed-437a-b254-ed4613949c85) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:53:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:54.967 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4e9e40d8-e1ed-437a-b254-ed4613949c85 in datapath 453b9882-4bf3-448c-bd9c-f6befe8aa4a6 unbound from our chassis#033[00m
Oct  2 08:53:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:54.967 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 453b9882-4bf3-448c-bd9c-f6befe8aa4a6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:53:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:54.969 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[79cc6085-2d97-4f89-8e25-0dc8a84241b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:54 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:54.969 103246 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6 namespace which is not needed anymore#033[00m
Oct  2 08:53:54 np0005466012 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000bc.scope: Deactivated successfully.
Oct  2 08:53:54 np0005466012 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000bc.scope: Consumed 14.513s CPU time.
Oct  2 08:53:54 np0005466012 systemd-machined[152114]: Machine qemu-83-instance-000000bc terminated.
Oct  2 08:53:55 np0005466012 neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6[255003]: [NOTICE]   (255007) : haproxy version is 2.8.14-c23fe91
Oct  2 08:53:55 np0005466012 neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6[255003]: [NOTICE]   (255007) : path to executable is /usr/sbin/haproxy
Oct  2 08:53:55 np0005466012 neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6[255003]: [WARNING]  (255007) : Exiting Master process...
Oct  2 08:53:55 np0005466012 neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6[255003]: [ALERT]    (255007) : Current worker (255009) exited with code 143 (Terminated)
Oct  2 08:53:55 np0005466012 neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6[255003]: [WARNING]  (255007) : All workers exited. Exiting... (0)
Oct  2 08:53:55 np0005466012 systemd[1]: libpod-a80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8.scope: Deactivated successfully.
Oct  2 08:53:55 np0005466012 podman[255319]: 2025-10-02 12:53:55.089010902 +0000 UTC m=+0.038503946 container died a80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:53:55 np0005466012 kernel: tap4e9e40d8-e1: entered promiscuous mode
Oct  2 08:53:55 np0005466012 NetworkManager[51207]: <info>  [1759409635.1003] manager: (tap4e9e40d8-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/361)
Oct  2 08:53:55 np0005466012 kernel: tap4e9e40d8-e1 (unregistering): left promiscuous mode
Oct  2 08:53:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:55Z|00765|binding|INFO|Claiming lport 4e9e40d8-e1ed-437a-b254-ed4613949c85 for this chassis.
Oct  2 08:53:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:55Z|00766|binding|INFO|4e9e40d8-e1ed-437a-b254-ed4613949c85: Claiming fa:16:3e:e1:89:0d 10.100.0.3
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:55Z|00767|binding|INFO|Setting lport 4e9e40d8-e1ed-437a-b254-ed4613949c85 ovn-installed in OVS
Oct  2 08:53:55 np0005466012 systemd[1]: var-lib-containers-storage-overlay-8c70a8ee4355f0a0807f2d8c4e9e98bb07ed538b6de0662290ece3d2b7966db9-merged.mount: Deactivated successfully.
Oct  2 08:53:55 np0005466012 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8-userdata-shm.mount: Deactivated successfully.
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.122 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:89:0d 10.100.0.3'], port_security=['fa:16:3e:e1:89:0d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-453b9882-4bf3-448c-bd9c-f6befe8aa4a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '449520c8b1da43f1b24a84c08baac03b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '723f044d-15a6-4f16-bc64-3542e5b28423 ea59696b-608e-4561-9861-f0c3334aea82', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9dcf2980-98fd-43bf-91c9-1637efee9e0a, chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4e9e40d8-e1ed-437a-b254-ed4613949c85) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:53:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:55Z|00768|binding|INFO|Setting lport 4e9e40d8-e1ed-437a-b254-ed4613949c85 up in Southbound
Oct  2 08:53:55 np0005466012 podman[255319]: 2025-10-02 12:53:55.129982164 +0000 UTC m=+0.079475208 container cleanup a80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  2 08:53:55 np0005466012 systemd[1]: libpod-conmon-a80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8.scope: Deactivated successfully.
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.154 2 INFO nova.virt.libvirt.driver [-] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Instance destroyed successfully.#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.155 2 DEBUG nova.objects.instance [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lazy-loading 'resources' on Instance uuid 05e968d6-dd2c-4863-88ee-98c84f0714a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  2 08:53:55 np0005466012 podman[255361]: 2025-10-02 12:53:55.193290935 +0000 UTC m=+0.036664765 container remove a80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.193 2 DEBUG nova.virt.libvirt.vif [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T12:53:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1534615901',display_name='tempest-TestServerBasicOps-server-1534615901',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1534615901',id=188,image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCaD7GQniYdoYwp35Nm6IfKAXjpBPBSCums2KeZ3fzXUrXjawEwxG/MjwuGDcNwXUee8EuobdYpMCam8FfxtemP8uY+pHbYNGzGD4zZl5C5madLQvYGt4xkcnsH6GF70dA==',key_name='tempest-TestServerBasicOps-349026518',keypairs=<?>,launch_index=0,launched_at=2025-10-02T12:53:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='449520c8b1da43f1b24a84c08baac03b',ramdisk_id='',reservation_id='r-e8sb2g07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cf60d86d-f1d5-4be4-976e-7488dbdcf0b2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-1990510451',owner_user_name='tempest-TestServerBasicOps-1990510451-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T12:53:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='25035a2aea694d7abea1da6e2dc97fd9',uuid=05e968d6-dd2c-4863-88ee-98c84f0714a1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "address": 
"fa:16:3e:e1:89:0d", "network": {"id": "453b9882-4bf3-448c-bd9c-f6befe8aa4a6", "bridge": "br-int", "label": "tempest-TestServerBasicOps-2006130394-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449520c8b1da43f1b24a84c08baac03b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e9e40d8-e1", "ovs_interfaceid": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.194 2 DEBUG nova.network.os_vif_util [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Converting VIF {"id": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "address": "fa:16:3e:e1:89:0d", "network": {"id": "453b9882-4bf3-448c-bd9c-f6befe8aa4a6", "bridge": "br-int", "label": "tempest-TestServerBasicOps-2006130394-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449520c8b1da43f1b24a84c08baac03b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e9e40d8-e1", "ovs_interfaceid": "4e9e40d8-e1ed-437a-b254-ed4613949c85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.194 2 DEBUG nova.network.os_vif_util [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e1:89:0d,bridge_name='br-int',has_traffic_filtering=True,id=4e9e40d8-e1ed-437a-b254-ed4613949c85,network=Network(453b9882-4bf3-448c-bd9c-f6befe8aa4a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e9e40d8-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.195 2 DEBUG os_vif [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e1:89:0d,bridge_name='br-int',has_traffic_filtering=True,id=4e9e40d8-e1ed-437a-b254-ed4613949c85,network=Network(453b9882-4bf3-448c-bd9c-f6befe8aa4a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e9e40d8-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.196 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4e9e40d8-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.198 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[60624788-9ba2-411d-abf4-e3a9b2f28984]: (4, ('Thu Oct  2 12:53:55 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6 (a80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8)\na80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8\nThu Oct  2 12:53:55 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6 (a80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8)\na80cc8758cafd794844623679807694a29ef8e90749ec8bdc70eaab3bcf212c8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:55Z|00769|binding|INFO|Releasing lport 4e9e40d8-e1ed-437a-b254-ed4613949c85 from this chassis (sb_readonly=0)
Oct  2 08:53:55 np0005466012 ovn_controller[94284]: 2025-10-02T12:53:55Z|00770|binding|INFO|Setting lport 4e9e40d8-e1ed-437a-b254-ed4613949c85 down in Southbound
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.200 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[45ccde57-b536-4f73-96fb-86e71b1b9795]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.201 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap453b9882-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.201 2 INFO os_vif [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e1:89:0d,bridge_name='br-int',has_traffic_filtering=True,id=4e9e40d8-e1ed-437a-b254-ed4613949c85,network=Network(453b9882-4bf3-448c-bd9c-f6befe8aa4a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e9e40d8-e1')#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.202 2 INFO nova.virt.libvirt.driver [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Deleting instance files /var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1_del#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.203 2 INFO nova.virt.libvirt.driver [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Deletion of /var/lib/nova/instances/05e968d6-dd2c-4863-88ee-98c84f0714a1_del complete#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.212 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:89:0d 10.100.0.3'], port_security=['fa:16:3e:e1:89:0d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '05e968d6-dd2c-4863-88ee-98c84f0714a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-453b9882-4bf3-448c-bd9c-f6befe8aa4a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '449520c8b1da43f1b24a84c08baac03b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '723f044d-15a6-4f16-bc64-3542e5b28423 ea59696b-608e-4561-9861-f0c3334aea82', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9dcf2980-98fd-43bf-91c9-1637efee9e0a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>], logical_port=4e9e40d8-e1ed-437a-b254-ed4613949c85) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f241224dd90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:55 np0005466012 kernel: tap453b9882-40: left promiscuous mode
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.226 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[888061e3-73a3-4987-9d04-6000a411f17c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.251 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[c01e7758-65ba-4934-8569-6e4301545388]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.252 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[fd34718a-7cf7-4001-84ac-ad9bdb479f8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.268 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[289b73db-84fb-4c64-bfbe-d153aaf4f2ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 758862, 'reachable_time': 39564, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255379, 'error': None, 'target': 'ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.270 103359 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-453b9882-4bf3-448c-bd9c-f6befe8aa4a6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.271 103359 DEBUG oslo.privsep.daemon [-] privsep: reply[ca46532f-4fa5-4677-a8e8-c221f28da68b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.271 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4e9e40d8-e1ed-437a-b254-ed4613949c85 in datapath 453b9882-4bf3-448c-bd9c-f6befe8aa4a6 unbound from our chassis#033[00m
Oct  2 08:53:55 np0005466012 systemd[1]: run-netns-ovnmeta\x2d453b9882\x2d4bf3\x2d448c\x2dbd9c\x2df6befe8aa4a6.mount: Deactivated successfully.
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.272 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 453b9882-4bf3-448c-bd9c-f6befe8aa4a6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.273 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[a41a4096-fb63-489b-be14-f0878665b2bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.273 103246 INFO neutron.agent.ovn.metadata.agent [-] Port 4e9e40d8-e1ed-437a-b254-ed4613949c85 in datapath 453b9882-4bf3-448c-bd9c-f6befe8aa4a6 unbound from our chassis#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.274 103246 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 453b9882-4bf3-448c-bd9c-f6befe8aa4a6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  2 08:53:55 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:55.274 219792 DEBUG oslo.privsep.daemon [-] privsep: reply[6ab05300-5c15-4245-9f3b-5022fb65a9f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.305 2 INFO nova.compute.manager [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.305 2 DEBUG oslo.service.loopingcall [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.306 2 DEBUG nova.compute.manager [-] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.306 2 DEBUG nova.network.neutron [-] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.983 2 DEBUG nova.compute.manager [req-8f45c19a-59d5-4c45-b27e-8b9ed3459317 req-da9d99b2-b521-499e-ac72-9ed7fdb52e13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received event network-vif-unplugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.984 2 DEBUG oslo_concurrency.lockutils [req-8f45c19a-59d5-4c45-b27e-8b9ed3459317 req-da9d99b2-b521-499e-ac72-9ed7fdb52e13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.984 2 DEBUG oslo_concurrency.lockutils [req-8f45c19a-59d5-4c45-b27e-8b9ed3459317 req-da9d99b2-b521-499e-ac72-9ed7fdb52e13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.984 2 DEBUG oslo_concurrency.lockutils [req-8f45c19a-59d5-4c45-b27e-8b9ed3459317 req-da9d99b2-b521-499e-ac72-9ed7fdb52e13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.985 2 DEBUG nova.compute.manager [req-8f45c19a-59d5-4c45-b27e-8b9ed3459317 req-da9d99b2-b521-499e-ac72-9ed7fdb52e13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] No waiting events found dispatching network-vif-unplugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:53:55 np0005466012 nova_compute[192063]: 2025-10-02 12:53:55.985 2 DEBUG nova.compute.manager [req-8f45c19a-59d5-4c45-b27e-8b9ed3459317 req-da9d99b2-b521-499e-ac72-9ed7fdb52e13 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received event network-vif-unplugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  2 08:53:56 np0005466012 nova_compute[192063]: 2025-10-02 12:53:56.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:53:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:56.074 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 08:53:56 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:53:56.076 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 08:53:57 np0005466012 nova_compute[192063]: 2025-10-02 12:53:57.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.099 2 DEBUG nova.network.neutron [-] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.137 2 DEBUG nova.compute.manager [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received event network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.138 2 DEBUG oslo_concurrency.lockutils [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.138 2 DEBUG oslo_concurrency.lockutils [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.138 2 DEBUG oslo_concurrency.lockutils [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.139 2 DEBUG nova.compute.manager [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] No waiting events found dispatching network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.139 2 WARNING nova.compute.manager [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received unexpected event network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.139 2 DEBUG nova.compute.manager [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received event network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.139 2 DEBUG oslo_concurrency.lockutils [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.139 2 DEBUG oslo_concurrency.lockutils [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.140 2 DEBUG oslo_concurrency.lockutils [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.140 2 DEBUG nova.compute.manager [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] No waiting events found dispatching network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.140 2 WARNING nova.compute.manager [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received unexpected event network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.140 2 DEBUG nova.compute.manager [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received event network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.140 2 DEBUG oslo_concurrency.lockutils [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.141 2 DEBUG oslo_concurrency.lockutils [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.141 2 DEBUG oslo_concurrency.lockutils [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.141 2 DEBUG nova.compute.manager [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] No waiting events found dispatching network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.141 2 WARNING nova.compute.manager [req-3ad98244-2bc7-4d71-8f75-fc8939f2e8ec req-582dc936-5a88-4960-b2d3-d81727ba60ff 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received unexpected event network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 for instance with vm_state active and task_state deleting.#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.142 2 INFO nova.compute.manager [-] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Took 2.84 seconds to deallocate network for instance.#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.220 2 DEBUG nova.compute.manager [req-612ba378-79d9-44da-af1f-8937c157f35e req-fcb10e2d-7536-4909-9e0a-7f08fcdbc90f 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received event network-vif-deleted-4e9e40d8-e1ed-437a-b254-ed4613949c85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.248 2 DEBUG oslo_concurrency.lockutils [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.249 2 DEBUG oslo_concurrency.lockutils [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.343 2 DEBUG nova.compute.provider_tree [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.358 2 DEBUG nova.scheduler.client.report [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.440 2 DEBUG oslo_concurrency.lockutils [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.467 2 INFO nova.scheduler.client.report [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Deleted allocations for instance 05e968d6-dd2c-4863-88ee-98c84f0714a1#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.626 2 DEBUG oslo_concurrency.lockutils [None req-a16eafff-bf20-4893-80db-198cb9a33b7c 25035a2aea694d7abea1da6e2dc97fd9 449520c8b1da43f1b24a84c08baac03b - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.848 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.848 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.848 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.849 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.988 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.989 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5723MB free_disk=73.22919464111328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.989 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:53:58 np0005466012 nova_compute[192063]: 2025-10-02 12:53:58.989 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:53:59 np0005466012 nova_compute[192063]: 2025-10-02 12:53:59.080 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:53:59 np0005466012 nova_compute[192063]: 2025-10-02 12:53:59.080 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:53:59 np0005466012 nova_compute[192063]: 2025-10-02 12:53:59.121 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:53:59 np0005466012 nova_compute[192063]: 2025-10-02 12:53:59.150 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:53:59 np0005466012 nova_compute[192063]: 2025-10-02 12:53:59.183 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:53:59 np0005466012 nova_compute[192063]: 2025-10-02 12:53:59.183 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:53:59 np0005466012 nova_compute[192063]: 2025-10-02 12:53:59.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:00 np0005466012 nova_compute[192063]: 2025-10-02 12:54:00.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:00 np0005466012 nova_compute[192063]: 2025-10-02 12:54:00.316 2 DEBUG nova.compute.manager [req-9923e0f2-df87-4853-b236-2d3f4f96b1a7 req-0c008670-7679-4f0d-bcf1-2eb96682d396 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received event network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  2 08:54:00 np0005466012 nova_compute[192063]: 2025-10-02 12:54:00.317 2 DEBUG oslo_concurrency.lockutils [req-9923e0f2-df87-4853-b236-2d3f4f96b1a7 req-0c008670-7679-4f0d-bcf1-2eb96682d396 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Acquiring lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:00 np0005466012 nova_compute[192063]: 2025-10-02 12:54:00.317 2 DEBUG oslo_concurrency.lockutils [req-9923e0f2-df87-4853-b236-2d3f4f96b1a7 req-0c008670-7679-4f0d-bcf1-2eb96682d396 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:00 np0005466012 nova_compute[192063]: 2025-10-02 12:54:00.317 2 DEBUG oslo_concurrency.lockutils [req-9923e0f2-df87-4853-b236-2d3f4f96b1a7 req-0c008670-7679-4f0d-bcf1-2eb96682d396 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] Lock "05e968d6-dd2c-4863-88ee-98c84f0714a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:00 np0005466012 nova_compute[192063]: 2025-10-02 12:54:00.317 2 DEBUG nova.compute.manager [req-9923e0f2-df87-4853-b236-2d3f4f96b1a7 req-0c008670-7679-4f0d-bcf1-2eb96682d396 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] No waiting events found dispatching network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  2 08:54:00 np0005466012 nova_compute[192063]: 2025-10-02 12:54:00.317 2 WARNING nova.compute.manager [req-9923e0f2-df87-4853-b236-2d3f4f96b1a7 req-0c008670-7679-4f0d-bcf1-2eb96682d396 0d72f6c58d6b4ef89c0d1d75b420b96f bffbc2e8eeb448dcbb34a8b5bc72922e - - default default] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Received unexpected event network-vif-plugged-4e9e40d8-e1ed-437a-b254-ed4613949c85 for instance with vm_state deleted and task_state None.#033[00m
Oct  2 08:54:01 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:54:01.078 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 08:54:01 np0005466012 nova_compute[192063]: 2025-10-02 12:54:01.184 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:01 np0005466012 nova_compute[192063]: 2025-10-02 12:54:01.184 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:54:02.171 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:54:02.172 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:54:02.172 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:02 np0005466012 nova_compute[192063]: 2025-10-02 12:54:02.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:03 np0005466012 podman[255382]: 2025-10-02 12:54:03.132979145 +0000 UTC m=+0.052988816 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, distribution-scope=public, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:54:03 np0005466012 podman[255381]: 2025-10-02 12:54:03.138472717 +0000 UTC m=+0.058459968 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:54:03 np0005466012 nova_compute[192063]: 2025-10-02 12:54:03.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:03 np0005466012 nova_compute[192063]: 2025-10-02 12:54:03.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:54:03 np0005466012 nova_compute[192063]: 2025-10-02 12:54:03.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:54:03 np0005466012 nova_compute[192063]: 2025-10-02 12:54:03.844 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:54:04 np0005466012 nova_compute[192063]: 2025-10-02 12:54:04.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:05 np0005466012 nova_compute[192063]: 2025-10-02 12:54:05.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:06 np0005466012 nova_compute[192063]: 2025-10-02 12:54:06.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:06 np0005466012 nova_compute[192063]: 2025-10-02 12:54:06.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:07 np0005466012 podman[255421]: 2025-10-02 12:54:07.13572981 +0000 UTC m=+0.055787834 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:54:07 np0005466012 podman[255422]: 2025-10-02 12:54:07.136282355 +0000 UTC m=+0.055144736 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:54:07 np0005466012 nova_compute[192063]: 2025-10-02 12:54:07.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:07 np0005466012 nova_compute[192063]: 2025-10-02 12:54:07.821 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:54:09 np0005466012 nova_compute[192063]: 2025-10-02 12:54:09.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:10 np0005466012 nova_compute[192063]: 2025-10-02 12:54:10.152 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759409635.1511276, 05e968d6-dd2c-4863-88ee-98c84f0714a1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  2 08:54:10 np0005466012 nova_compute[192063]: 2025-10-02 12:54:10.152 2 INFO nova.compute.manager [-] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] VM Stopped (Lifecycle Event)#033[00m
Oct  2 08:54:10 np0005466012 nova_compute[192063]: 2025-10-02 12:54:10.189 2 DEBUG nova.compute.manager [None req-31663f67-30f2-423d-a18f-4054b60de9f5 - - - - - -] [instance: 05e968d6-dd2c-4863-88ee-98c84f0714a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  2 08:54:10 np0005466012 nova_compute[192063]: 2025-10-02 12:54:10.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:14 np0005466012 nova_compute[192063]: 2025-10-02 12:54:14.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:15 np0005466012 nova_compute[192063]: 2025-10-02 12:54:15.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:19 np0005466012 nova_compute[192063]: 2025-10-02 12:54:19.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:20 np0005466012 nova_compute[192063]: 2025-10-02 12:54:20.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:21 np0005466012 podman[255464]: 2025-10-02 12:54:21.146843604 +0000 UTC m=+0.053499500 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:54:21 np0005466012 podman[255460]: 2025-10-02 12:54:21.148595293 +0000 UTC m=+0.065767589 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:54:21 np0005466012 podman[255461]: 2025-10-02 12:54:21.173613195 +0000 UTC m=+0.085622289 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  2 08:54:21 np0005466012 podman[255470]: 2025-10-02 12:54:21.204012026 +0000 UTC m=+0.104436279 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 08:54:24 np0005466012 nova_compute[192063]: 2025-10-02 12:54:24.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:25 np0005466012 nova_compute[192063]: 2025-10-02 12:54:25.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:29 np0005466012 nova_compute[192063]: 2025-10-02 12:54:29.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:30 np0005466012 nova_compute[192063]: 2025-10-02 12:54:30.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:34 np0005466012 podman[255552]: 2025-10-02 12:54:34.166736297 +0000 UTC m=+0.066247603 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  2 08:54:34 np0005466012 podman[255551]: 2025-10-02 12:54:34.182481643 +0000 UTC m=+0.088000045 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 08:54:34 np0005466012 nova_compute[192063]: 2025-10-02 12:54:34.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:35 np0005466012 nova_compute[192063]: 2025-10-02 12:54:35.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:38 np0005466012 podman[255591]: 2025-10-02 12:54:38.138865035 +0000 UTC m=+0.056860783 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:54:38 np0005466012 podman[255590]: 2025-10-02 12:54:38.139613945 +0000 UTC m=+0.060284117 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  2 08:54:39 np0005466012 nova_compute[192063]: 2025-10-02 12:54:39.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:40 np0005466012 nova_compute[192063]: 2025-10-02 12:54:40.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:40 np0005466012 nova_compute[192063]: 2025-10-02 12:54:40.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:44 np0005466012 ovn_controller[94284]: 2025-10-02T12:54:44Z|00771|memory_trim|INFO|Detected inactivity (last active 30023 ms ago): trimming memory
Oct  2 08:54:44 np0005466012 nova_compute[192063]: 2025-10-02 12:54:44.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:45 np0005466012 nova_compute[192063]: 2025-10-02 12:54:45.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:49 np0005466012 nova_compute[192063]: 2025-10-02 12:54:49.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:50 np0005466012 nova_compute[192063]: 2025-10-02 12:54:50.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:52 np0005466012 podman[255636]: 2025-10-02 12:54:52.144368165 +0000 UTC m=+0.054685613 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:54:52 np0005466012 podman[255634]: 2025-10-02 12:54:52.15252246 +0000 UTC m=+0.067325403 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute)
Oct  2 08:54:52 np0005466012 podman[255637]: 2025-10-02 12:54:52.175142576 +0000 UTC m=+0.080509137 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 08:54:52 np0005466012 podman[255635]: 2025-10-02 12:54:52.175232768 +0000 UTC m=+0.087026718 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Oct  2 08:54:52 np0005466012 nova_compute[192063]: 2025-10-02 12:54:52.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:54 np0005466012 nova_compute[192063]: 2025-10-02 12:54:54.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:55 np0005466012 nova_compute[192063]: 2025-10-02 12:54:55.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:55 np0005466012 nova_compute[192063]: 2025-10-02 12:54:55.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:57 np0005466012 nova_compute[192063]: 2025-10-02 12:54:57.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:57 np0005466012 nova_compute[192063]: 2025-10-02 12:54:57.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:57 np0005466012 nova_compute[192063]: 2025-10-02 12:54:57.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 08:54:58 np0005466012 nova_compute[192063]: 2025-10-02 12:54:58.907 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:54:58 np0005466012 nova_compute[192063]: 2025-10-02 12:54:58.994 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:58 np0005466012 nova_compute[192063]: 2025-10-02 12:54:58.994 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:58 np0005466012 nova_compute[192063]: 2025-10-02 12:54:58.994 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:54:58 np0005466012 nova_compute[192063]: 2025-10-02 12:54:58.994 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:54:59 np0005466012 nova_compute[192063]: 2025-10-02 12:54:59.142 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:54:59 np0005466012 nova_compute[192063]: 2025-10-02 12:54:59.143 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5727MB free_disk=73.229248046875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:54:59 np0005466012 nova_compute[192063]: 2025-10-02 12:54:59.144 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:54:59 np0005466012 nova_compute[192063]: 2025-10-02 12:54:59.144 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:54:59 np0005466012 nova_compute[192063]: 2025-10-02 12:54:59.455 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:54:59 np0005466012 nova_compute[192063]: 2025-10-02 12:54:59.456 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:54:59 np0005466012 nova_compute[192063]: 2025-10-02 12:54:59.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:54:59 np0005466012 nova_compute[192063]: 2025-10-02 12:54:59.497 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:54:59 np0005466012 nova_compute[192063]: 2025-10-02 12:54:59.513 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:54:59 np0005466012 nova_compute[192063]: 2025-10-02 12:54:59.514 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:54:59 np0005466012 nova_compute[192063]: 2025-10-02 12:54:59.515 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.371s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:00 np0005466012 nova_compute[192063]: 2025-10-02 12:55:00.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:01 np0005466012 nova_compute[192063]: 2025-10-02 12:55:01.429 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:55:02.172 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:55:02.173 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:55:02.173 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:02 np0005466012 nova_compute[192063]: 2025-10-02 12:55:02.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:04 np0005466012 nova_compute[192063]: 2025-10-02 12:55:04.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:04 np0005466012 nova_compute[192063]: 2025-10-02 12:55:04.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:04 np0005466012 nova_compute[192063]: 2025-10-02 12:55:04.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:55:04 np0005466012 nova_compute[192063]: 2025-10-02 12:55:04.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:55:04 np0005466012 nova_compute[192063]: 2025-10-02 12:55:04.861 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:55:05 np0005466012 podman[255713]: 2025-10-02 12:55:05.130943715 +0000 UTC m=+0.049112608 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:55:05 np0005466012 podman[255714]: 2025-10-02 12:55:05.136585791 +0000 UTC m=+0.051351780 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter)
Oct  2 08:55:05 np0005466012 nova_compute[192063]: 2025-10-02 12:55:05.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:08 np0005466012 nova_compute[192063]: 2025-10-02 12:55:08.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:08 np0005466012 nova_compute[192063]: 2025-10-02 12:55:08.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:55:09 np0005466012 podman[255753]: 2025-10-02 12:55:09.151427489 +0000 UTC m=+0.059131907 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:55:09 np0005466012 podman[255752]: 2025-10-02 12:55:09.171154124 +0000 UTC m=+0.085671640 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  2 08:55:09 np0005466012 nova_compute[192063]: 2025-10-02 12:55:09.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:10 np0005466012 nova_compute[192063]: 2025-10-02 12:55:10.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:14 np0005466012 nova_compute[192063]: 2025-10-02 12:55:14.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:15 np0005466012 nova_compute[192063]: 2025-10-02 12:55:15.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:16 np0005466012 nova_compute[192063]: 2025-10-02 12:55:16.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.935 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:55:16.935 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:55:19 np0005466012 nova_compute[192063]: 2025-10-02 12:55:19.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:19 np0005466012 systemd-logind[827]: New session 46 of user zuul.
Oct  2 08:55:19 np0005466012 systemd[1]: Started Session 46 of User zuul.
Oct  2 08:55:20 np0005466012 nova_compute[192063]: 2025-10-02 12:55:20.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:22 np0005466012 nova_compute[192063]: 2025-10-02 12:55:22.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:22 np0005466012 nova_compute[192063]: 2025-10-02 12:55:22.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 08:55:22 np0005466012 nova_compute[192063]: 2025-10-02 12:55:22.863 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 08:55:23 np0005466012 podman[255940]: 2025-10-02 12:55:23.153554378 +0000 UTC m=+0.057568893 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 08:55:23 np0005466012 podman[255939]: 2025-10-02 12:55:23.160026287 +0000 UTC m=+0.068923177 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:55:23 np0005466012 podman[255942]: 2025-10-02 12:55:23.187817426 +0000 UTC m=+0.084479747 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:55:23 np0005466012 podman[255941]: 2025-10-02 12:55:23.18832847 +0000 UTC m=+0.085601338 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:55:24 np0005466012 ovs-vsctl[256052]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct  2 08:55:24 np0005466012 nova_compute[192063]: 2025-10-02 12:55:24.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:24 np0005466012 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 255822 (sos)
Oct  2 08:55:24 np0005466012 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct  2 08:55:24 np0005466012 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct  2 08:55:25 np0005466012 virtqemud[191783]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct  2 08:55:25 np0005466012 virtqemud[191783]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct  2 08:55:25 np0005466012 nova_compute[192063]: 2025-10-02 12:55:25.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:25 np0005466012 virtqemud[191783]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  2 08:55:26 np0005466012 kernel: block vda: the capability attribute has been deprecated.
Oct  2 08:55:28 np0005466012 systemd[1]: Starting Hostname Service...
Oct  2 08:55:28 np0005466012 systemd[1]: Started Hostname Service.
Oct  2 08:55:29 np0005466012 nova_compute[192063]: 2025-10-02 12:55:29.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:30 np0005466012 nova_compute[192063]: 2025-10-02 12:55:30.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:32 np0005466012 nova_compute[192063]: 2025-10-02 12:55:32.409 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:34 np0005466012 nova_compute[192063]: 2025-10-02 12:55:34.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:34 np0005466012 ovs-appctl[257498]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  2 08:55:34 np0005466012 ovs-appctl[257502]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  2 08:55:34 np0005466012 ovs-appctl[257506]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  2 08:55:35 np0005466012 nova_compute[192063]: 2025-10-02 12:55:35.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:36 np0005466012 podman[257747]: 2025-10-02 12:55:36.148990715 +0000 UTC m=+0.064889347 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:55:36 np0005466012 podman[257751]: 2025-10-02 12:55:36.153608432 +0000 UTC m=+0.069823222 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Oct  2 08:55:37 np0005466012 nova_compute[192063]: 2025-10-02 12:55:37.390 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:39 np0005466012 podman[206348]: time="2025-10-02T12:55:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  2 08:55:39 np0005466012 podman[206348]: @ - - [02/Oct/2025:12:55:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 25331 "" "Go-http-client/1.1"
Oct  2 08:55:39 np0005466012 nova_compute[192063]: 2025-10-02 12:55:39.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:40 np0005466012 podman[258759]: 2025-10-02 12:55:40.145660801 +0000 UTC m=+0.058249242 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:55:40 np0005466012 podman[258758]: 2025-10-02 12:55:40.14959736 +0000 UTC m=+0.064789692 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 08:55:40 np0005466012 nova_compute[192063]: 2025-10-02 12:55:40.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:41 np0005466012 nova_compute[192063]: 2025-10-02 12:55:41.825 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:42 np0005466012 virtqemud[191783]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  2 08:55:44 np0005466012 nova_compute[192063]: 2025-10-02 12:55:44.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:44 np0005466012 systemd[1]: Starting Time & Date Service...
Oct  2 08:55:44 np0005466012 systemd[1]: Started Time & Date Service.
Oct  2 08:55:45 np0005466012 nova_compute[192063]: 2025-10-02 12:55:45.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:49 np0005466012 nova_compute[192063]: 2025-10-02 12:55:49.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:50 np0005466012 nova_compute[192063]: 2025-10-02 12:55:50.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:52 np0005466012 nova_compute[192063]: 2025-10-02 12:55:52.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:53 np0005466012 podman[259295]: 2025-10-02 12:55:53.41311776 +0000 UTC m=+0.053439729 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct  2 08:55:53 np0005466012 podman[259296]: 2025-10-02 12:55:53.417476661 +0000 UTC m=+0.055366033 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:55:53 np0005466012 podman[259294]: 2025-10-02 12:55:53.421374928 +0000 UTC m=+0.062195981 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct  2 08:55:53 np0005466012 podman[259297]: 2025-10-02 12:55:53.441925937 +0000 UTC m=+0.076698393 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  2 08:55:53 np0005466012 nova_compute[192063]: 2025-10-02 12:55:53.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:54 np0005466012 nova_compute[192063]: 2025-10-02 12:55:54.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:55 np0005466012 nova_compute[192063]: 2025-10-02 12:55:55.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:55:55 np0005466012 nova_compute[192063]: 2025-10-02 12:55:55.841 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:58 np0005466012 nova_compute[192063]: 2025-10-02 12:55:58.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:58 np0005466012 nova_compute[192063]: 2025-10-02 12:55:58.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:55:58 np0005466012 nova_compute[192063]: 2025-10-02 12:55:58.921 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:55:58 np0005466012 nova_compute[192063]: 2025-10-02 12:55:58.921 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:55:58 np0005466012 nova_compute[192063]: 2025-10-02 12:55:58.921 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:55:58 np0005466012 nova_compute[192063]: 2025-10-02 12:55:58.922 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:55:59 np0005466012 nova_compute[192063]: 2025-10-02 12:55:59.074 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:55:59 np0005466012 nova_compute[192063]: 2025-10-02 12:55:59.075 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5494MB free_disk=72.86749649047852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:55:59 np0005466012 nova_compute[192063]: 2025-10-02 12:55:59.075 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:55:59 np0005466012 nova_compute[192063]: 2025-10-02 12:55:59.076 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:55:59 np0005466012 nova_compute[192063]: 2025-10-02 12:55:59.165 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 08:55:59 np0005466012 nova_compute[192063]: 2025-10-02 12:55:59.165 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:55:59 np0005466012 nova_compute[192063]: 2025-10-02 12:55:59.201 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:55:59 np0005466012 nova_compute[192063]: 2025-10-02 12:55:59.218 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:55:59 np0005466012 nova_compute[192063]: 2025-10-02 12:55:59.243 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:55:59 np0005466012 nova_compute[192063]: 2025-10-02 12:55:59.244 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:55:59 np0005466012 nova_compute[192063]: 2025-10-02 12:55:59.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:00 np0005466012 nova_compute[192063]: 2025-10-02 12:56:00.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:56:02.174 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 08:56:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:56:02.175 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 08:56:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:56:02.175 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:56:03 np0005466012 nova_compute[192063]: 2025-10-02 12:56:03.245 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:56:03 np0005466012 nova_compute[192063]: 2025-10-02 12:56:03.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:56:04 np0005466012 nova_compute[192063]: 2025-10-02 12:56:04.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:04 np0005466012 nova_compute[192063]: 2025-10-02 12:56:04.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:56:04 np0005466012 nova_compute[192063]: 2025-10-02 12:56:04.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:56:04 np0005466012 nova_compute[192063]: 2025-10-02 12:56:04.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 08:56:05 np0005466012 nova_compute[192063]: 2025-10-02 12:56:05.178 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  2 08:56:05 np0005466012 nova_compute[192063]: 2025-10-02 12:56:05.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:07 np0005466012 podman[259375]: 2025-10-02 12:56:07.13915319 +0000 UTC m=+0.052212895 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_id=edpm, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Oct  2 08:56:07 np0005466012 podman[259374]: 2025-10-02 12:56:07.169271514 +0000 UTC m=+0.085144267 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  2 08:56:09 np0005466012 nova_compute[192063]: 2025-10-02 12:56:09.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:09 np0005466012 systemd[1]: session-46.scope: Deactivated successfully.
Oct  2 08:56:09 np0005466012 systemd[1]: session-46.scope: Consumed 1min 20.203s CPU time, 657.1M memory peak, read 165.8M from disk, written 24.9M to disk.
Oct  2 08:56:09 np0005466012 systemd-logind[827]: Session 46 logged out. Waiting for processes to exit.
Oct  2 08:56:09 np0005466012 systemd-logind[827]: Removed session 46.
Oct  2 08:56:09 np0005466012 systemd-logind[827]: New session 47 of user zuul.
Oct  2 08:56:09 np0005466012 systemd[1]: Started Session 47 of User zuul.
Oct  2 08:56:09 np0005466012 nova_compute[192063]: 2025-10-02 12:56:09.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:56:09 np0005466012 nova_compute[192063]: 2025-10-02 12:56:09.821 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 08:56:10 np0005466012 systemd[1]: session-47.scope: Deactivated successfully.
Oct  2 08:56:10 np0005466012 systemd-logind[827]: Session 47 logged out. Waiting for processes to exit.
Oct  2 08:56:10 np0005466012 systemd-logind[827]: Removed session 47.
Oct  2 08:56:10 np0005466012 systemd-logind[827]: New session 48 of user zuul.
Oct  2 08:56:10 np0005466012 systemd[1]: Started Session 48 of User zuul.
Oct  2 08:56:10 np0005466012 systemd[1]: session-48.scope: Deactivated successfully.
Oct  2 08:56:10 np0005466012 systemd-logind[827]: Session 48 logged out. Waiting for processes to exit.
Oct  2 08:56:10 np0005466012 systemd-logind[827]: Removed session 48.
Oct  2 08:56:10 np0005466012 podman[259471]: 2025-10-02 12:56:10.320925762 +0000 UTC m=+0.052547884 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:56:10 np0005466012 podman[259470]: 2025-10-02 12:56:10.321306802 +0000 UTC m=+0.054102046 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:56:10 np0005466012 nova_compute[192063]: 2025-10-02 12:56:10.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:14 np0005466012 nova_compute[192063]: 2025-10-02 12:56:14.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:14 np0005466012 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  2 08:56:14 np0005466012 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  2 08:56:15 np0005466012 nova_compute[192063]: 2025-10-02 12:56:15.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:19 np0005466012 nova_compute[192063]: 2025-10-02 12:56:19.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:20 np0005466012 nova_compute[192063]: 2025-10-02 12:56:20.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:24 np0005466012 podman[259520]: 2025-10-02 12:56:24.135033358 +0000 UTC m=+0.047598847 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  2 08:56:24 np0005466012 podman[259518]: 2025-10-02 12:56:24.150403903 +0000 UTC m=+0.067399085 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Oct  2 08:56:24 np0005466012 podman[259519]: 2025-10-02 12:56:24.160528503 +0000 UTC m=+0.065983375 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  2 08:56:24 np0005466012 podman[259521]: 2025-10-02 12:56:24.190518683 +0000 UTC m=+0.094377452 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller)
Oct  2 08:56:24 np0005466012 nova_compute[192063]: 2025-10-02 12:56:24.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:25 np0005466012 nova_compute[192063]: 2025-10-02 12:56:25.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:29 np0005466012 nova_compute[192063]: 2025-10-02 12:56:29.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:30 np0005466012 nova_compute[192063]: 2025-10-02 12:56:30.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:34 np0005466012 nova_compute[192063]: 2025-10-02 12:56:34.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:35 np0005466012 nova_compute[192063]: 2025-10-02 12:56:35.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:38 np0005466012 podman[259605]: 2025-10-02 12:56:38.139896648 +0000 UTC m=+0.057876111 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Oct  2 08:56:38 np0005466012 podman[259604]: 2025-10-02 12:56:38.149302059 +0000 UTC m=+0.070313605 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 08:56:39 np0005466012 nova_compute[192063]: 2025-10-02 12:56:39.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:40 np0005466012 nova_compute[192063]: 2025-10-02 12:56:40.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:56:41 np0005466012 podman[259642]: 2025-10-02 12:56:41.134114784 +0000 UTC m=+0.052792801 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:56:41 np0005466012 podman[259641]: 2025-10-02 12:56:41.140523601 +0000 UTC m=+0.059247529 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 08:56:41 np0005466012 nova_compute[192063]: 2025-10-02 12:56:41.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:44 np0005466012 nova_compute[192063]: 2025-10-02 12:56:44.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:45 np0005466012 nova_compute[192063]: 2025-10-02 12:56:45.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:49 np0005466012 nova_compute[192063]: 2025-10-02 12:56:49.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:50 np0005466012 nova_compute[192063]: 2025-10-02 12:56:50.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:54 np0005466012 nova_compute[192063]: 2025-10-02 12:56:54.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:54 np0005466012 nova_compute[192063]: 2025-10-02 12:56:54.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:55 np0005466012 podman[259690]: 2025-10-02 12:56:55.143568991 +0000 UTC m=+0.048679876 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:56:55 np0005466012 podman[259688]: 2025-10-02 12:56:55.143559771 +0000 UTC m=+0.055602998 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  2 08:56:55 np0005466012 podman[259689]: 2025-10-02 12:56:55.164319774 +0000 UTC m=+0.073164242 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  2 08:56:55 np0005466012 podman[259697]: 2025-10-02 12:56:55.196520684 +0000 UTC m=+0.088199038 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:56:55 np0005466012 nova_compute[192063]: 2025-10-02 12:56:55.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:55 np0005466012 nova_compute[192063]: 2025-10-02 12:56:55.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:56:59 np0005466012 nova_compute[192063]: 2025-10-02 12:56:59.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:56:59 np0005466012 nova_compute[192063]: 2025-10-02 12:56:59.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:00 np0005466012 nova_compute[192063]: 2025-10-02 12:57:00.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:00 np0005466012 nova_compute[192063]: 2025-10-02 12:57:00.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:00 np0005466012 nova_compute[192063]: 2025-10-02 12:57:00.889 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:00 np0005466012 nova_compute[192063]: 2025-10-02 12:57:00.889 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:00 np0005466012 nova_compute[192063]: 2025-10-02 12:57:00.890 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:00 np0005466012 nova_compute[192063]: 2025-10-02 12:57:00.890 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.025 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.025 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5694MB free_disk=73.22900009155273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.026 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.026 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.120 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.120 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.158 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.207 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.207 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.242 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.285 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.347 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.361 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.385 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:57:01 np0005466012 nova_compute[192063]: 2025-10-02 12:57:01.385 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:57:02.175 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:57:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:57:02.175 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:57:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:57:02.175 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:57:04 np0005466012 nova_compute[192063]: 2025-10-02 12:57:04.388 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:04 np0005466012 nova_compute[192063]: 2025-10-02 12:57:04.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:04 np0005466012 nova_compute[192063]: 2025-10-02 12:57:04.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:05 np0005466012 nova_compute[192063]: 2025-10-02 12:57:05.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:06 np0005466012 nova_compute[192063]: 2025-10-02 12:57:06.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:06 np0005466012 nova_compute[192063]: 2025-10-02 12:57:06.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:57:06 np0005466012 nova_compute[192063]: 2025-10-02 12:57:06.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:57:06 np0005466012 nova_compute[192063]: 2025-10-02 12:57:06.856 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:57:09 np0005466012 podman[259772]: 2025-10-02 12:57:09.13894928 +0000 UTC m=+0.060269677 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:57:09 np0005466012 podman[259773]: 2025-10-02 12:57:09.148707439 +0000 UTC m=+0.049893189 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, release=1755695350)
Oct  2 08:57:09 np0005466012 nova_compute[192063]: 2025-10-02 12:57:09.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:09 np0005466012 nova_compute[192063]: 2025-10-02 12:57:09.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:09 np0005466012 nova_compute[192063]: 2025-10-02 12:57:09.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:57:10 np0005466012 nova_compute[192063]: 2025-10-02 12:57:10.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:12 np0005466012 podman[259813]: 2025-10-02 12:57:12.133371862 +0000 UTC m=+0.049332234 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 08:57:12 np0005466012 podman[259814]: 2025-10-02 12:57:12.135348866 +0000 UTC m=+0.047432771 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 08:57:14 np0005466012 nova_compute[192063]: 2025-10-02 12:57:14.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:15 np0005466012 nova_compute[192063]: 2025-10-02 12:57:15.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:57:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:57:19 np0005466012 nova_compute[192063]: 2025-10-02 12:57:19.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:19 np0005466012 nova_compute[192063]: 2025-10-02 12:57:19.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:20 np0005466012 nova_compute[192063]: 2025-10-02 12:57:20.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:24 np0005466012 nova_compute[192063]: 2025-10-02 12:57:24.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:25 np0005466012 nova_compute[192063]: 2025-10-02 12:57:25.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:26 np0005466012 podman[259859]: 2025-10-02 12:57:26.150426567 +0000 UTC m=+0.057308874 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Oct  2 08:57:26 np0005466012 podman[259860]: 2025-10-02 12:57:26.157384079 +0000 UTC m=+0.056000588 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:57:26 np0005466012 podman[259861]: 2025-10-02 12:57:26.184567701 +0000 UTC m=+0.078759478 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller)
Oct  2 08:57:26 np0005466012 podman[259858]: 2025-10-02 12:57:26.18562013 +0000 UTC m=+0.092291982 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Oct  2 08:57:29 np0005466012 nova_compute[192063]: 2025-10-02 12:57:29.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:30 np0005466012 nova_compute[192063]: 2025-10-02 12:57:30.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:34 np0005466012 nova_compute[192063]: 2025-10-02 12:57:34.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:35 np0005466012 nova_compute[192063]: 2025-10-02 12:57:35.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:39 np0005466012 nova_compute[192063]: 2025-10-02 12:57:39.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:40 np0005466012 podman[259941]: 2025-10-02 12:57:40.154804273 +0000 UTC m=+0.059682569 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible)
Oct  2 08:57:40 np0005466012 podman[259940]: 2025-10-02 12:57:40.158402773 +0000 UTC m=+0.070370545 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 08:57:40 np0005466012 nova_compute[192063]: 2025-10-02 12:57:40.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:42 np0005466012 nova_compute[192063]: 2025-10-02 12:57:42.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:43 np0005466012 podman[259982]: 2025-10-02 12:57:43.140556697 +0000 UTC m=+0.053392187 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 08:57:43 np0005466012 podman[259981]: 2025-10-02 12:57:43.157483544 +0000 UTC m=+0.077717218 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2)
Oct  2 08:57:44 np0005466012 nova_compute[192063]: 2025-10-02 12:57:44.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:45 np0005466012 nova_compute[192063]: 2025-10-02 12:57:45.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:49 np0005466012 nova_compute[192063]: 2025-10-02 12:57:49.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:50 np0005466012 nova_compute[192063]: 2025-10-02 12:57:50.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:54 np0005466012 nova_compute[192063]: 2025-10-02 12:57:54.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:54 np0005466012 nova_compute[192063]: 2025-10-02 12:57:54.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:55 np0005466012 nova_compute[192063]: 2025-10-02 12:57:55.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:57:57 np0005466012 podman[260028]: 2025-10-02 12:57:57.135539294 +0000 UTC m=+0.051471013 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 08:57:57 np0005466012 podman[260027]: 2025-10-02 12:57:57.140417409 +0000 UTC m=+0.056298366 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:57:57 np0005466012 podman[260029]: 2025-10-02 12:57:57.161207493 +0000 UTC m=+0.072924716 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 08:57:57 np0005466012 podman[260030]: 2025-10-02 12:57:57.166587402 +0000 UTC m=+0.074801148 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  2 08:57:57 np0005466012 nova_compute[192063]: 2025-10-02 12:57:57.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:57:59 np0005466012 nova_compute[192063]: 2025-10-02 12:57:59.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:00 np0005466012 nova_compute[192063]: 2025-10-02 12:58:00.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:00 np0005466012 nova_compute[192063]: 2025-10-02 12:58:00.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:01 np0005466012 nova_compute[192063]: 2025-10-02 12:58:01.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:01 np0005466012 nova_compute[192063]: 2025-10-02 12:58:01.868 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:58:01 np0005466012 nova_compute[192063]: 2025-10-02 12:58:01.868 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:58:01 np0005466012 nova_compute[192063]: 2025-10-02 12:58:01.868 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:58:01 np0005466012 nova_compute[192063]: 2025-10-02 12:58:01.868 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:58:02 np0005466012 nova_compute[192063]: 2025-10-02 12:58:02.017 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:58:02 np0005466012 nova_compute[192063]: 2025-10-02 12:58:02.018 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5696MB free_disk=73.22954940795898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:58:02 np0005466012 nova_compute[192063]: 2025-10-02 12:58:02.018 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:58:02 np0005466012 nova_compute[192063]: 2025-10-02 12:58:02.018 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:58:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:58:02.176 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:58:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:58:02.176 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:58:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:58:02.176 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:58:02 np0005466012 nova_compute[192063]: 2025-10-02 12:58:02.287 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 08:58:02 np0005466012 nova_compute[192063]: 2025-10-02 12:58:02.288 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 08:58:02 np0005466012 nova_compute[192063]: 2025-10-02 12:58:02.464 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 08:58:02 np0005466012 nova_compute[192063]: 2025-10-02 12:58:02.483 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 08:58:02 np0005466012 nova_compute[192063]: 2025-10-02 12:58:02.485 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 08:58:02 np0005466012 nova_compute[192063]: 2025-10-02 12:58:02.485 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.467s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:58:04 np0005466012 nova_compute[192063]: 2025-10-02 12:58:04.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:05 np0005466012 nova_compute[192063]: 2025-10-02 12:58:05.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:06 np0005466012 nova_compute[192063]: 2025-10-02 12:58:06.487 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:06 np0005466012 nova_compute[192063]: 2025-10-02 12:58:06.487 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:07 np0005466012 nova_compute[192063]: 2025-10-02 12:58:07.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:07 np0005466012 nova_compute[192063]: 2025-10-02 12:58:07.824 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 08:58:07 np0005466012 nova_compute[192063]: 2025-10-02 12:58:07.824 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 08:58:07 np0005466012 nova_compute[192063]: 2025-10-02 12:58:07.841 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 08:58:09 np0005466012 nova_compute[192063]: 2025-10-02 12:58:09.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:10 np0005466012 nova_compute[192063]: 2025-10-02 12:58:10.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:10 np0005466012 nova_compute[192063]: 2025-10-02 12:58:10.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 08:58:10 np0005466012 nova_compute[192063]: 2025-10-02 12:58:10.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:11 np0005466012 podman[260106]: 2025-10-02 12:58:11.133207124 +0000 UTC m=+0.050734023 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  2 08:58:11 np0005466012 podman[260107]: 2025-10-02 12:58:11.136632259 +0000 UTC m=+0.049683135 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 08:58:14 np0005466012 podman[260147]: 2025-10-02 12:58:14.14661397 +0000 UTC m=+0.058534697 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 08:58:14 np0005466012 podman[260146]: 2025-10-02 12:58:14.170936463 +0000 UTC m=+0.087741135 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid)
Oct  2 08:58:14 np0005466012 nova_compute[192063]: 2025-10-02 12:58:14.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:15 np0005466012 nova_compute[192063]: 2025-10-02 12:58:15.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:19 np0005466012 nova_compute[192063]: 2025-10-02 12:58:19.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:20 np0005466012 nova_compute[192063]: 2025-10-02 12:58:20.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:24 np0005466012 nova_compute[192063]: 2025-10-02 12:58:24.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:25 np0005466012 nova_compute[192063]: 2025-10-02 12:58:25.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:28 np0005466012 podman[260192]: 2025-10-02 12:58:28.138575303 +0000 UTC m=+0.051808482 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 08:58:28 np0005466012 podman[260191]: 2025-10-02 12:58:28.152590241 +0000 UTC m=+0.063640970 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:58:28 np0005466012 podman[260190]: 2025-10-02 12:58:28.164620033 +0000 UTC m=+0.072502605 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  2 08:58:28 np0005466012 podman[260193]: 2025-10-02 12:58:28.18332843 +0000 UTC m=+0.095462079 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 08:58:29 np0005466012 nova_compute[192063]: 2025-10-02 12:58:29.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:30 np0005466012 nova_compute[192063]: 2025-10-02 12:58:30.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:34 np0005466012 nova_compute[192063]: 2025-10-02 12:58:34.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:35 np0005466012 nova_compute[192063]: 2025-10-02 12:58:35.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:39 np0005466012 nova_compute[192063]: 2025-10-02 12:58:39.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:40 np0005466012 nova_compute[192063]: 2025-10-02 12:58:40.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:42 np0005466012 podman[260277]: 2025-10-02 12:58:42.150485446 +0000 UTC m=+0.063889967 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  2 08:58:42 np0005466012 podman[260278]: 2025-10-02 12:58:42.154695362 +0000 UTC m=+0.061019536 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public)
Oct  2 08:58:43 np0005466012 nova_compute[192063]: 2025-10-02 12:58:43.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:44 np0005466012 nova_compute[192063]: 2025-10-02 12:58:44.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:45 np0005466012 podman[260318]: 2025-10-02 12:58:45.130776778 +0000 UTC m=+0.046312100 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:58:45 np0005466012 podman[260317]: 2025-10-02 12:58:45.133062842 +0000 UTC m=+0.049808848 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 08:58:45 np0005466012 nova_compute[192063]: 2025-10-02 12:58:45.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:49 np0005466012 nova_compute[192063]: 2025-10-02 12:58:49.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:50 np0005466012 nova_compute[192063]: 2025-10-02 12:58:50.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:54 np0005466012 nova_compute[192063]: 2025-10-02 12:58:54.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:55 np0005466012 nova_compute[192063]: 2025-10-02 12:58:55.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:58:56 np0005466012 nova_compute[192063]: 2025-10-02 12:58:56.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:57 np0005466012 nova_compute[192063]: 2025-10-02 12:58:57.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:58:59 np0005466012 podman[260370]: 2025-10-02 12:58:59.135458151 +0000 UTC m=+0.046634790 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  2 08:58:59 np0005466012 podman[260371]: 2025-10-02 12:58:59.138261519 +0000 UTC m=+0.048955664 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 08:58:59 np0005466012 podman[260369]: 2025-10-02 12:58:59.138599748 +0000 UTC m=+0.054835537 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 08:58:59 np0005466012 podman[260372]: 2025-10-02 12:58:59.171184839 +0000 UTC m=+0.078119600 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 08:58:59 np0005466012 nova_compute[192063]: 2025-10-02 12:58:59.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:00 np0005466012 nova_compute[192063]: 2025-10-02 12:59:00.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:59:02.177 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:59:02.177 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 12:59:02.177 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:02 np0005466012 nova_compute[192063]: 2025-10-02 12:59:02.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:02 np0005466012 nova_compute[192063]: 2025-10-02 12:59:02.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:02 np0005466012 nova_compute[192063]: 2025-10-02 12:59:02.854 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:02 np0005466012 nova_compute[192063]: 2025-10-02 12:59:02.855 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:02 np0005466012 nova_compute[192063]: 2025-10-02 12:59:02.855 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 08:59:02 np0005466012 nova_compute[192063]: 2025-10-02 12:59:02.855 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 08:59:02 np0005466012 nova_compute[192063]: 2025-10-02 12:59:02.997 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 08:59:02 np0005466012 nova_compute[192063]: 2025-10-02 12:59:02.997 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5704MB free_disk=73.22999572753906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 08:59:02 np0005466012 nova_compute[192063]: 2025-10-02 12:59:02.998 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 08:59:02 np0005466012 nova_compute[192063]: 2025-10-02 12:59:02.998 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 08:59:03 np0005466012 nova_compute[192063]: 2025-10-02 12:59:03.081 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 08:59:03 np0005466012 nova_compute[192063]: 2025-10-02 12:59:03.082 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 08:59:03 np0005466012 nova_compute[192063]: 2025-10-02 12:59:03.101 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 08:59:03 np0005466012 nova_compute[192063]: 2025-10-02 12:59:03.116 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 08:59:03 np0005466012 nova_compute[192063]: 2025-10-02 12:59:03.118 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  2 08:59:03 np0005466012 nova_compute[192063]: 2025-10-02 12:59:03.118 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 08:59:04 np0005466012 nova_compute[192063]: 2025-10-02 12:59:04.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:05 np0005466012 nova_compute[192063]: 2025-10-02 12:59:05.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:07 np0005466012 nova_compute[192063]: 2025-10-02 12:59:07.118 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:59:07 np0005466012 nova_compute[192063]: 2025-10-02 12:59:07.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:59:09 np0005466012 nova_compute[192063]: 2025-10-02 12:59:09.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:09 np0005466012 nova_compute[192063]: 2025-10-02 12:59:09.826 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:59:09 np0005466012 nova_compute[192063]: 2025-10-02 12:59:09.827 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  2 08:59:09 np0005466012 nova_compute[192063]: 2025-10-02 12:59:09.827 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  2 08:59:09 np0005466012 nova_compute[192063]: 2025-10-02 12:59:09.854 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  2 08:59:10 np0005466012 nova_compute[192063]: 2025-10-02 12:59:10.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:12 np0005466012 nova_compute[192063]: 2025-10-02 12:59:12.825 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:59:12 np0005466012 nova_compute[192063]: 2025-10-02 12:59:12.826 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  2 08:59:13 np0005466012 podman[260457]: 2025-10-02 12:59:13.132521175 +0000 UTC m=+0.050096105 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 
'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Oct  2 08:59:13 np0005466012 podman[260456]: 2025-10-02 12:59:13.132742781 +0000 UTC m=+0.052295745 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Oct  2 08:59:14 np0005466012 nova_compute[192063]: 2025-10-02 12:59:14.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:15 np0005466012 nova_compute[192063]: 2025-10-02 12:59:15.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:16 np0005466012 podman[260497]: 2025-10-02 12:59:16.136513353 +0000 UTC m=+0.049043267 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:59:16 np0005466012 podman[260496]: 2025-10-02 12:59:16.137113339 +0000 UTC m=+0.053002485 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0)
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 12:59:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 08:59:19 np0005466012 nova_compute[192063]: 2025-10-02 12:59:19.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:20 np0005466012 nova_compute[192063]: 2025-10-02 12:59:20.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:23 np0005466012 nova_compute[192063]: 2025-10-02 12:59:23.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 08:59:24 np0005466012 nova_compute[192063]: 2025-10-02 12:59:24.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:25 np0005466012 nova_compute[192063]: 2025-10-02 12:59:25.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:29 np0005466012 nova_compute[192063]: 2025-10-02 12:59:29.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:30 np0005466012 podman[260539]: 2025-10-02 12:59:30.13882768 +0000 UTC m=+0.056017699 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 08:59:30 np0005466012 podman[260541]: 2025-10-02 12:59:30.141525395 +0000 UTC m=+0.049484199 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 08:59:30 np0005466012 podman[260540]: 2025-10-02 12:59:30.14312089 +0000 UTC m=+0.055192977 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct  2 08:59:30 np0005466012 podman[260547]: 2025-10-02 12:59:30.21373615 +0000 UTC m=+0.115009348 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 08:59:30 np0005466012 nova_compute[192063]: 2025-10-02 12:59:30.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:34 np0005466012 nova_compute[192063]: 2025-10-02 12:59:34.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:35 np0005466012 nova_compute[192063]: 2025-10-02 12:59:35.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:39 np0005466012 nova_compute[192063]: 2025-10-02 12:59:39.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:40 np0005466012 nova_compute[192063]: 2025-10-02 12:59:40.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 08:59:44 np0005466012 podman[260626]: 2025-10-02 12:59:44.126827055 +0000 UTC m=+0.048595843 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 08:59:44 np0005466012 podman[260627]: 2025-10-02 12:59:44.133393177 +0000 UTC m=+0.051610577 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container)
Oct  2 08:59:44 np0005466012 nova_compute[192063]: 2025-10-02 12:59:44.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:45 np0005466012 nova_compute[192063]: 2025-10-02 12:59:45.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:45 np0005466012 nova_compute[192063]: 2025-10-02 12:59:45.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:47 np0005466012 podman[260667]: 2025-10-02 12:59:47.124497668 +0000 UTC m=+0.045042146 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid)
Oct  2 08:59:47 np0005466012 podman[260668]: 2025-10-02 12:59:47.135866702 +0000 UTC m=+0.049282132 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 08:59:49 np0005466012 nova_compute[192063]: 2025-10-02 12:59:49.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:50 np0005466012 nova_compute[192063]: 2025-10-02 12:59:50.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:54 np0005466012 nova_compute[192063]: 2025-10-02 12:59:54.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:55 np0005466012 nova_compute[192063]: 2025-10-02 12:59:55.116 2 DEBUG oslo_concurrency.processutils [None req-659d4632-c79e-4633-9620-923cb0652eb1 6f66e2b43c7641758f7c71dec37ebcb6 c543175414e2485bb476e4dfce01c394 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  2 08:59:55 np0005466012 nova_compute[192063]: 2025-10-02 12:59:55.138 2 DEBUG oslo_concurrency.processutils [None req-659d4632-c79e-4633-9620-923cb0652eb1 6f66e2b43c7641758f7c71dec37ebcb6 c543175414e2485bb476e4dfce01c394 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  2 08:59:55 np0005466012 nova_compute[192063]: 2025-10-02 12:59:55.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 08:59:57 np0005466012 nova_compute[192063]: 2025-10-02 12:59:57.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:57 np0005466012 nova_compute[192063]: 2025-10-02 12:59:57.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 08:59:59 np0005466012 nova_compute[192063]: 2025-10-02 12:59:59.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:00 np0005466012 nova_compute[192063]: 2025-10-02 13:00:00.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:01 np0005466012 podman[260715]: 2025-10-02 13:00:01.136293698 +0000 UTC m=+0.051337069 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:00:01 np0005466012 podman[260716]: 2025-10-02 13:00:01.136334939 +0000 UTC m=+0.047723050 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct  2 09:00:01 np0005466012 podman[260717]: 2025-10-02 13:00:01.165470034 +0000 UTC m=+0.075064705 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 09:00:01 np0005466012 podman[260718]: 2025-10-02 13:00:01.171364447 +0000 UTC m=+0.074805598 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller)
Oct  2 09:00:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:00:02.178 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:00:02.179 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:00:02.179 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:00:02.187 103246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:1e:eb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '76:55:7f:40:de:c3'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  2 09:00:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:00:02.188 103246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  2 09:00:02 np0005466012 nova_compute[192063]: 2025-10-02 13:00:02.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:02 np0005466012 nova_compute[192063]: 2025-10-02 13:00:02.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:02 np0005466012 nova_compute[192063]: 2025-10-02 13:00:02.873 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:02 np0005466012 nova_compute[192063]: 2025-10-02 13:00:02.874 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:02 np0005466012 nova_compute[192063]: 2025-10-02 13:00:02.874 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:02 np0005466012 nova_compute[192063]: 2025-10-02 13:00:02.874 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:00:03 np0005466012 nova_compute[192063]: 2025-10-02 13:00:03.058 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:00:03 np0005466012 nova_compute[192063]: 2025-10-02 13:00:03.059 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5697MB free_disk=73.2299919128418GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:00:03 np0005466012 nova_compute[192063]: 2025-10-02 13:00:03.060 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:00:03 np0005466012 nova_compute[192063]: 2025-10-02 13:00:03.060 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:00:03 np0005466012 nova_compute[192063]: 2025-10-02 13:00:03.228 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:00:03 np0005466012 nova_compute[192063]: 2025-10-02 13:00:03.229 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:00:03 np0005466012 nova_compute[192063]: 2025-10-02 13:00:03.285 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:00:03 np0005466012 nova_compute[192063]: 2025-10-02 13:00:03.313 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:00:03 np0005466012 nova_compute[192063]: 2025-10-02 13:00:03.316 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:00:03 np0005466012 nova_compute[192063]: 2025-10-02 13:00:03.316 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:00:04 np0005466012 nova_compute[192063]: 2025-10-02 13:00:04.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:04 np0005466012 nova_compute[192063]: 2025-10-02 13:00:04.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:04 np0005466012 nova_compute[192063]: 2025-10-02 13:00:04.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:04 np0005466012 nova_compute[192063]: 2025-10-02 13:00:04.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 09:00:05 np0005466012 nova_compute[192063]: 2025-10-02 13:00:05.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:07 np0005466012 nova_compute[192063]: 2025-10-02 13:00:07.318 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:08 np0005466012 nova_compute[192063]: 2025-10-02 13:00:08.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:09 np0005466012 nova_compute[192063]: 2025-10-02 13:00:09.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:10 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:00:10.190 103246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ef6a8be5-dcfe-4652-b22e-0ba81a5a76ec, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  2 09:00:10 np0005466012 nova_compute[192063]: 2025-10-02 13:00:10.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:11 np0005466012 nova_compute[192063]: 2025-10-02 13:00:11.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:11 np0005466012 nova_compute[192063]: 2025-10-02 13:00:11.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:00:11 np0005466012 nova_compute[192063]: 2025-10-02 13:00:11.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:00:13 np0005466012 nova_compute[192063]: 2025-10-02 13:00:13.190 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:00:13 np0005466012 nova_compute[192063]: 2025-10-02 13:00:13.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:13 np0005466012 nova_compute[192063]: 2025-10-02 13:00:13.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:00:14 np0005466012 nova_compute[192063]: 2025-10-02 13:00:14.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:14 np0005466012 podman[260797]: 2025-10-02 13:00:14.791335501 +0000 UTC m=+0.063650999 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Oct  2 09:00:14 np0005466012 podman[260798]: 2025-10-02 13:00:14.812071814 +0000 UTC m=+0.079750065 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Oct  2 09:00:15 np0005466012 nova_compute[192063]: 2025-10-02 13:00:15.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:18 np0005466012 podman[260836]: 2025-10-02 13:00:18.146639484 +0000 UTC m=+0.063017011 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 09:00:18 np0005466012 podman[260835]: 2025-10-02 13:00:18.153206076 +0000 UTC m=+0.070080127 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:00:19 np0005466012 nova_compute[192063]: 2025-10-02 13:00:19.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:20 np0005466012 nova_compute[192063]: 2025-10-02 13:00:20.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:23 np0005466012 nova_compute[192063]: 2025-10-02 13:00:23.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:23 np0005466012 nova_compute[192063]: 2025-10-02 13:00:23.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 09:00:23 np0005466012 nova_compute[192063]: 2025-10-02 13:00:23.856 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 09:00:24 np0005466012 nova_compute[192063]: 2025-10-02 13:00:24.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:25 np0005466012 nova_compute[192063]: 2025-10-02 13:00:25.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:29 np0005466012 nova_compute[192063]: 2025-10-02 13:00:29.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:30 np0005466012 nova_compute[192063]: 2025-10-02 13:00:30.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:32 np0005466012 podman[260881]: 2025-10-02 13:00:32.159980337 +0000 UTC m=+0.064087401 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:00:32 np0005466012 podman[260882]: 2025-10-02 13:00:32.166665932 +0000 UTC m=+0.060796571 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 09:00:32 np0005466012 podman[260880]: 2025-10-02 13:00:32.167133045 +0000 UTC m=+0.075962900 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:00:32 np0005466012 podman[260893]: 2025-10-02 13:00:32.212595291 +0000 UTC m=+0.100016324 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  2 09:00:34 np0005466012 nova_compute[192063]: 2025-10-02 13:00:34.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:35 np0005466012 nova_compute[192063]: 2025-10-02 13:00:35.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:39 np0005466012 nova_compute[192063]: 2025-10-02 13:00:39.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:40 np0005466012 nova_compute[192063]: 2025-10-02 13:00:40.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:44 np0005466012 nova_compute[192063]: 2025-10-02 13:00:44.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:45 np0005466012 podman[260965]: 2025-10-02 13:00:45.138178982 +0000 UTC m=+0.052233264 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350)
Oct  2 09:00:45 np0005466012 podman[260964]: 2025-10-02 13:00:45.142627685 +0000 UTC m=+0.060853773 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  2 09:00:45 np0005466012 nova_compute[192063]: 2025-10-02 13:00:45.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:46 np0005466012 nova_compute[192063]: 2025-10-02 13:00:46.856 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:49 np0005466012 podman[261001]: 2025-10-02 13:00:49.129855828 +0000 UTC m=+0.048473720 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 09:00:49 np0005466012 podman[261000]: 2025-10-02 13:00:49.160644789 +0000 UTC m=+0.082367297 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:00:49 np0005466012 nova_compute[192063]: 2025-10-02 13:00:49.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:51 np0005466012 nova_compute[192063]: 2025-10-02 13:00:51.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:54 np0005466012 nova_compute[192063]: 2025-10-02 13:00:54.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:56 np0005466012 nova_compute[192063]: 2025-10-02 13:00:56.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:00:57 np0005466012 nova_compute[192063]: 2025-10-02 13:00:57.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:58 np0005466012 nova_compute[192063]: 2025-10-02 13:00:58.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:00:59 np0005466012 nova_compute[192063]: 2025-10-02 13:00:59.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:01 np0005466012 nova_compute[192063]: 2025-10-02 13:01:01.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:01:02.181 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:01:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:01:02.182 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:01:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:01:02.182 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:01:02 np0005466012 nova_compute[192063]: 2025-10-02 13:01:02.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:02 np0005466012 nova_compute[192063]: 2025-10-02 13:01:02.855 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:01:02 np0005466012 nova_compute[192063]: 2025-10-02 13:01:02.856 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:01:02 np0005466012 nova_compute[192063]: 2025-10-02 13:01:02.856 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:01:02 np0005466012 nova_compute[192063]: 2025-10-02 13:01:02.856 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:01:03 np0005466012 nova_compute[192063]: 2025-10-02 13:01:03.018 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:01:03 np0005466012 nova_compute[192063]: 2025-10-02 13:01:03.019 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5702MB free_disk=73.24386596679688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:01:03 np0005466012 nova_compute[192063]: 2025-10-02 13:01:03.019 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:01:03 np0005466012 nova_compute[192063]: 2025-10-02 13:01:03.020 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:01:03 np0005466012 nova_compute[192063]: 2025-10-02 13:01:03.086 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:01:03 np0005466012 nova_compute[192063]: 2025-10-02 13:01:03.086 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:01:03 np0005466012 nova_compute[192063]: 2025-10-02 13:01:03.122 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:01:03 np0005466012 podman[261054]: 2025-10-02 13:01:03.132411685 +0000 UTC m=+0.052234654 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 09:01:03 np0005466012 nova_compute[192063]: 2025-10-02 13:01:03.136 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:01:03 np0005466012 nova_compute[192063]: 2025-10-02 13:01:03.137 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:01:03 np0005466012 nova_compute[192063]: 2025-10-02 13:01:03.138 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:01:03 np0005466012 podman[261055]: 2025-10-02 13:01:03.157817197 +0000 UTC m=+0.077639936 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:01:03 np0005466012 podman[261056]: 2025-10-02 13:01:03.218269337 +0000 UTC m=+0.117304072 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 09:01:03 np0005466012 podman[261057]: 2025-10-02 13:01:03.224942782 +0000 UTC m=+0.134380314 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  2 09:01:04 np0005466012 nova_compute[192063]: 2025-10-02 13:01:04.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:06 np0005466012 nova_compute[192063]: 2025-10-02 13:01:06.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:06 np0005466012 nova_compute[192063]: 2025-10-02 13:01:06.132 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:07 np0005466012 nova_compute[192063]: 2025-10-02 13:01:07.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:07 np0005466012 nova_compute[192063]: 2025-10-02 13:01:07.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:09 np0005466012 nova_compute[192063]: 2025-10-02 13:01:09.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:09 np0005466012 nova_compute[192063]: 2025-10-02 13:01:09.844 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:11 np0005466012 nova_compute[192063]: 2025-10-02 13:01:11.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:12 np0005466012 nova_compute[192063]: 2025-10-02 13:01:12.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:12 np0005466012 nova_compute[192063]: 2025-10-02 13:01:12.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:01:12 np0005466012 nova_compute[192063]: 2025-10-02 13:01:12.824 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:01:12 np0005466012 nova_compute[192063]: 2025-10-02 13:01:12.884 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:01:13 np0005466012 nova_compute[192063]: 2025-10-02 13:01:13.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:13 np0005466012 nova_compute[192063]: 2025-10-02 13:01:13.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:01:14 np0005466012 nova_compute[192063]: 2025-10-02 13:01:14.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:16 np0005466012 nova_compute[192063]: 2025-10-02 13:01:16.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:16 np0005466012 podman[261138]: 2025-10-02 13:01:16.188722976 +0000 UTC m=+0.057238332 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  2 09:01:16 np0005466012 podman[261139]: 2025-10-02 13:01:16.19176752 +0000 UTC m=+0.057824738 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm, name=ubi9-minimal)
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:01:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:01:19 np0005466012 nova_compute[192063]: 2025-10-02 13:01:19.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:20 np0005466012 podman[261175]: 2025-10-02 13:01:20.151470983 +0000 UTC m=+0.066673124 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  2 09:01:20 np0005466012 podman[261176]: 2025-10-02 13:01:20.159083923 +0000 UTC m=+0.066036426 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 09:01:21 np0005466012 nova_compute[192063]: 2025-10-02 13:01:21.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:24 np0005466012 nova_compute[192063]: 2025-10-02 13:01:24.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:26 np0005466012 nova_compute[192063]: 2025-10-02 13:01:26.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:27 np0005466012 nova_compute[192063]: 2025-10-02 13:01:27.818 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:29 np0005466012 nova_compute[192063]: 2025-10-02 13:01:29.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:31 np0005466012 nova_compute[192063]: 2025-10-02 13:01:31.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:34 np0005466012 podman[261222]: 2025-10-02 13:01:34.161510356 +0000 UTC m=+0.066437567 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 09:01:34 np0005466012 podman[261223]: 2025-10-02 13:01:34.162248686 +0000 UTC m=+0.062772055 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 09:01:34 np0005466012 podman[261221]: 2025-10-02 13:01:34.163196873 +0000 UTC m=+0.067704302 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible)
Oct  2 09:01:34 np0005466012 podman[261224]: 2025-10-02 13:01:34.24161438 +0000 UTC m=+0.128867892 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:01:34 np0005466012 nova_compute[192063]: 2025-10-02 13:01:34.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:36 np0005466012 nova_compute[192063]: 2025-10-02 13:01:36.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:39 np0005466012 nova_compute[192063]: 2025-10-02 13:01:39.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:41 np0005466012 nova_compute[192063]: 2025-10-02 13:01:41.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:44 np0005466012 nova_compute[192063]: 2025-10-02 13:01:44.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:46 np0005466012 nova_compute[192063]: 2025-10-02 13:01:46.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:47 np0005466012 podman[261306]: 2025-10-02 13:01:47.140510021 +0000 UTC m=+0.055955257 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  2 09:01:47 np0005466012 podman[261305]: 2025-10-02 13:01:47.140546592 +0000 UTC m=+0.055991888 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 09:01:47 np0005466012 nova_compute[192063]: 2025-10-02 13:01:47.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:49 np0005466012 nova_compute[192063]: 2025-10-02 13:01:49.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:51 np0005466012 nova_compute[192063]: 2025-10-02 13:01:51.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:51 np0005466012 podman[261343]: 2025-10-02 13:01:51.14390568 +0000 UTC m=+0.063905197 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3)
Oct  2 09:01:51 np0005466012 podman[261344]: 2025-10-02 13:01:51.156691344 +0000 UTC m=+0.070085398 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 09:01:54 np0005466012 nova_compute[192063]: 2025-10-02 13:01:54.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:56 np0005466012 nova_compute[192063]: 2025-10-02 13:01:56.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:01:58 np0005466012 nova_compute[192063]: 2025-10-02 13:01:58.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:01:59 np0005466012 nova_compute[192063]: 2025-10-02 13:01:59.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:00 np0005466012 nova_compute[192063]: 2025-10-02 13:02:00.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:01 np0005466012 nova_compute[192063]: 2025-10-02 13:02:01.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:02:02.183 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:02:02.183 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:02:02.183 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:02 np0005466012 nova_compute[192063]: 2025-10-02 13:02:02.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:02 np0005466012 nova_compute[192063]: 2025-10-02 13:02:02.892 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:02 np0005466012 nova_compute[192063]: 2025-10-02 13:02:02.893 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:02 np0005466012 nova_compute[192063]: 2025-10-02 13:02:02.893 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:02 np0005466012 nova_compute[192063]: 2025-10-02 13:02:02.894 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.098 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.099 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5701MB free_disk=73.24386596679688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.099 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.099 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.168 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.168 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.182 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.196 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.196 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.213 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.228 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.255 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.278 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.281 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:02:03 np0005466012 nova_compute[192063]: 2025-10-02 13:02:03.282 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:02:04 np0005466012 nova_compute[192063]: 2025-10-02 13:02:04.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:05 np0005466012 podman[261391]: 2025-10-02 13:02:05.14447313 +0000 UTC m=+0.049840298 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 09:02:05 np0005466012 podman[261390]: 2025-10-02 13:02:05.144807259 +0000 UTC m=+0.053180620 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  2 09:02:05 np0005466012 podman[261389]: 2025-10-02 13:02:05.167659171 +0000 UTC m=+0.070380746 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:02:05 np0005466012 podman[261392]: 2025-10-02 13:02:05.212581872 +0000 UTC m=+0.112622693 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  2 09:02:06 np0005466012 nova_compute[192063]: 2025-10-02 13:02:06.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:07 np0005466012 nova_compute[192063]: 2025-10-02 13:02:07.280 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:09 np0005466012 nova_compute[192063]: 2025-10-02 13:02:09.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:10 np0005466012 nova_compute[192063]: 2025-10-02 13:02:10.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:10 np0005466012 nova_compute[192063]: 2025-10-02 13:02:10.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:11 np0005466012 nova_compute[192063]: 2025-10-02 13:02:11.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:12 np0005466012 nova_compute[192063]: 2025-10-02 13:02:12.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:12 np0005466012 nova_compute[192063]: 2025-10-02 13:02:12.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:02:12 np0005466012 nova_compute[192063]: 2025-10-02 13:02:12.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:02:12 np0005466012 nova_compute[192063]: 2025-10-02 13:02:12.838 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:02:15 np0005466012 nova_compute[192063]: 2025-10-02 13:02:15.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:15 np0005466012 nova_compute[192063]: 2025-10-02 13:02:15.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:15 np0005466012 nova_compute[192063]: 2025-10-02 13:02:15.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:02:16 np0005466012 nova_compute[192063]: 2025-10-02 13:02:16.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:18 np0005466012 podman[261476]: 2025-10-02 13:02:18.173815008 +0000 UTC m=+0.082780319 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd)
Oct  2 09:02:18 np0005466012 podman[261477]: 2025-10-02 13:02:18.200220317 +0000 UTC m=+0.103067479 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  2 09:02:20 np0005466012 nova_compute[192063]: 2025-10-02 13:02:20.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:21 np0005466012 nova_compute[192063]: 2025-10-02 13:02:21.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:22 np0005466012 podman[261515]: 2025-10-02 13:02:22.149658916 +0000 UTC m=+0.063480055 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  2 09:02:22 np0005466012 podman[261516]: 2025-10-02 13:02:22.164140076 +0000 UTC m=+0.064726929 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  2 09:02:25 np0005466012 nova_compute[192063]: 2025-10-02 13:02:25.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:26 np0005466012 nova_compute[192063]: 2025-10-02 13:02:26.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:30 np0005466012 nova_compute[192063]: 2025-10-02 13:02:30.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:31 np0005466012 nova_compute[192063]: 2025-10-02 13:02:31.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:35 np0005466012 nova_compute[192063]: 2025-10-02 13:02:35.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:36 np0005466012 podman[261559]: 2025-10-02 13:02:36.138960855 +0000 UTC m=+0.058908769 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:02:36 np0005466012 podman[261560]: 2025-10-02 13:02:36.145158606 +0000 UTC m=+0.056049840 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:02:36 np0005466012 podman[261567]: 2025-10-02 13:02:36.177767387 +0000 UTC m=+0.084317080 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 09:02:36 np0005466012 podman[261561]: 2025-10-02 13:02:36.17860333 +0000 UTC m=+0.089830123 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 09:02:36 np0005466012 nova_compute[192063]: 2025-10-02 13:02:36.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:40 np0005466012 nova_compute[192063]: 2025-10-02 13:02:40.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:41 np0005466012 nova_compute[192063]: 2025-10-02 13:02:41.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:45 np0005466012 nova_compute[192063]: 2025-10-02 13:02:45.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:46 np0005466012 nova_compute[192063]: 2025-10-02 13:02:46.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:47 np0005466012 nova_compute[192063]: 2025-10-02 13:02:47.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:02:49 np0005466012 podman[261645]: 2025-10-02 13:02:49.147398602 +0000 UTC m=+0.065994934 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  2 09:02:49 np0005466012 podman[261646]: 2025-10-02 13:02:49.153781169 +0000 UTC m=+0.067293141 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, release=1755695350, io.openshift.expose-services=)
Oct  2 09:02:50 np0005466012 nova_compute[192063]: 2025-10-02 13:02:50.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:51 np0005466012 nova_compute[192063]: 2025-10-02 13:02:51.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:53 np0005466012 podman[261689]: 2025-10-02 13:02:53.155642315 +0000 UTC m=+0.066715124 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 09:02:53 np0005466012 podman[261688]: 2025-10-02 13:02:53.167664507 +0000 UTC m=+0.089642018 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  2 09:02:55 np0005466012 nova_compute[192063]: 2025-10-02 13:02:55.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:56 np0005466012 nova_compute[192063]: 2025-10-02 13:02:56.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:02:58 np0005466012 nova_compute[192063]: 2025-10-02 13:02:58.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:00 np0005466012 nova_compute[192063]: 2025-10-02 13:03:00.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:00 np0005466012 nova_compute[192063]: 2025-10-02 13:03:00.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:01 np0005466012 nova_compute[192063]: 2025-10-02 13:03:01.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:03:02.183 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:03:02.184 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:03:02.184 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:02 np0005466012 nova_compute[192063]: 2025-10-02 13:03:02.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:02 np0005466012 nova_compute[192063]: 2025-10-02 13:03:02.857 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:02 np0005466012 nova_compute[192063]: 2025-10-02 13:03:02.858 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:02 np0005466012 nova_compute[192063]: 2025-10-02 13:03:02.858 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:02 np0005466012 nova_compute[192063]: 2025-10-02 13:03:02.859 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:03:03 np0005466012 nova_compute[192063]: 2025-10-02 13:03:03.025 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:03:03 np0005466012 nova_compute[192063]: 2025-10-02 13:03:03.026 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5714MB free_disk=73.2438850402832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:03:03 np0005466012 nova_compute[192063]: 2025-10-02 13:03:03.027 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:03:03 np0005466012 nova_compute[192063]: 2025-10-02 13:03:03.027 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:03:03 np0005466012 nova_compute[192063]: 2025-10-02 13:03:03.165 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:03:03 np0005466012 nova_compute[192063]: 2025-10-02 13:03:03.165 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:03:03 np0005466012 nova_compute[192063]: 2025-10-02 13:03:03.247 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:03:03 np0005466012 nova_compute[192063]: 2025-10-02 13:03:03.264 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:03:03 np0005466012 nova_compute[192063]: 2025-10-02 13:03:03.265 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:03:03 np0005466012 nova_compute[192063]: 2025-10-02 13:03:03.266 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:03:05 np0005466012 nova_compute[192063]: 2025-10-02 13:03:05.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:06 np0005466012 nova_compute[192063]: 2025-10-02 13:03:06.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:07 np0005466012 podman[261732]: 2025-10-02 13:03:07.151511056 +0000 UTC m=+0.060020469 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 09:03:07 np0005466012 podman[261733]: 2025-10-02 13:03:07.168684261 +0000 UTC m=+0.063088274 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 09:03:07 np0005466012 podman[261734]: 2025-10-02 13:03:07.178952505 +0000 UTC m=+0.078502600 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct  2 09:03:07 np0005466012 podman[261731]: 2025-10-02 13:03:07.179318025 +0000 UTC m=+0.088683252 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:03:09 np0005466012 nova_compute[192063]: 2025-10-02 13:03:09.261 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:10 np0005466012 nova_compute[192063]: 2025-10-02 13:03:10.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:11 np0005466012 nova_compute[192063]: 2025-10-02 13:03:11.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:11 np0005466012 nova_compute[192063]: 2025-10-02 13:03:11.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:11 np0005466012 nova_compute[192063]: 2025-10-02 13:03:11.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:13 np0005466012 nova_compute[192063]: 2025-10-02 13:03:13.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:13 np0005466012 nova_compute[192063]: 2025-10-02 13:03:13.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:03:13 np0005466012 nova_compute[192063]: 2025-10-02 13:03:13.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:03:13 np0005466012 nova_compute[192063]: 2025-10-02 13:03:13.839 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:03:15 np0005466012 nova_compute[192063]: 2025-10-02 13:03:15.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:15 np0005466012 nova_compute[192063]: 2025-10-02 13:03:15.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:03:15 np0005466012 nova_compute[192063]: 2025-10-02 13:03:15.821 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:03:16 np0005466012 nova_compute[192063]: 2025-10-02 13:03:16.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.931 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:03:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:03:20 np0005466012 nova_compute[192063]: 2025-10-02 13:03:20.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:03:20 np0005466012 podman[261815]: 2025-10-02 13:03:20.133497103 +0000 UTC m=+0.053952591 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Oct  2 09:03:20 np0005466012 podman[261816]: 2025-10-02 13:03:20.146177154 +0000 UTC m=+0.060275407 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Oct  2 09:03:21 np0005466012 nova_compute[192063]: 2025-10-02 13:03:21.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:24 np0005466012 podman[261856]: 2025-10-02 13:03:24.133833327 +0000 UTC m=+0.048460170 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 09:03:24 np0005466012 podman[261857]: 2025-10-02 13:03:24.146840457 +0000 UTC m=+0.057387097 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  2 09:03:25 np0005466012 nova_compute[192063]: 2025-10-02 13:03:25.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:26 np0005466012 nova_compute[192063]: 2025-10-02 13:03:26.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:30 np0005466012 nova_compute[192063]: 2025-10-02 13:03:30.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:31 np0005466012 nova_compute[192063]: 2025-10-02 13:03:31.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:32 np0005466012 nova_compute[192063]: 2025-10-02 13:03:32.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:03:35 np0005466012 nova_compute[192063]: 2025-10-02 13:03:35.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:36 np0005466012 nova_compute[192063]: 2025-10-02 13:03:36.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:38 np0005466012 podman[261902]: 2025-10-02 13:03:38.137651497 +0000 UTC m=+0.044864181 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  2 09:03:38 np0005466012 podman[261901]: 2025-10-02 13:03:38.155397008 +0000 UTC m=+0.058402775 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:03:38 np0005466012 podman[261900]: 2025-10-02 13:03:38.161613369 +0000 UTC m=+0.066877859 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  2 09:03:38 np0005466012 podman[261903]: 2025-10-02 13:03:38.175502683 +0000 UTC m=+0.082192982 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:03:40 np0005466012 nova_compute[192063]: 2025-10-02 13:03:40.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:41 np0005466012 nova_compute[192063]: 2025-10-02 13:03:41.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:45 np0005466012 nova_compute[192063]: 2025-10-02 13:03:45.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:46 np0005466012 nova_compute[192063]: 2025-10-02 13:03:46.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:47 np0005466012 nova_compute[192063]: 2025-10-02 13:03:47.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:03:50 np0005466012 nova_compute[192063]: 2025-10-02 13:03:50.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:51 np0005466012 podman[261986]: 2025-10-02 13:03:51.149625503 +0000 UTC m=+0.059127985 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  2 09:03:51 np0005466012 podman[261987]: 2025-10-02 13:03:51.161500701 +0000 UTC m=+0.061594263 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Oct  2 09:03:51 np0005466012 nova_compute[192063]: 2025-10-02 13:03:51.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:55 np0005466012 nova_compute[192063]: 2025-10-02 13:03:55.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:03:55 np0005466012 podman[262030]: 2025-10-02 13:03:55.134367516 +0000 UTC m=+0.052061739 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 09:03:55 np0005466012 podman[262029]: 2025-10-02 13:03:55.135521988 +0000 UTC m=+0.056413159 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:03:56 np0005466012 nova_compute[192063]: 2025-10-02 13:03:56.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:04:00 np0005466012 nova_compute[192063]: 2025-10-02 13:04:00.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:04:00 np0005466012 nova_compute[192063]: 2025-10-02 13:04:00.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:04:01 np0005466012 nova_compute[192063]: 2025-10-02 13:04:01.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  2 09:04:01 np0005466012 nova_compute[192063]: 2025-10-02 13:04:01.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:04:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:04:02.185 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:04:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:04:02.186 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:04:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:04:02.186 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:04:02 np0005466012 nova_compute[192063]: 2025-10-02 13:04:02.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  2 09:04:02 np0005466012 nova_compute[192063]: 2025-10-02 13:04:02.853 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:04:02 np0005466012 nova_compute[192063]: 2025-10-02 13:04:02.854 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:04:02 np0005466012 nova_compute[192063]: 2025-10-02 13:04:02.854 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  2 09:04:02 np0005466012 nova_compute[192063]: 2025-10-02 13:04:02.854 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  2 09:04:02 np0005466012 nova_compute[192063]: 2025-10-02 13:04:02.994 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  2 09:04:02 np0005466012 nova_compute[192063]: 2025-10-02 13:04:02.995 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5704MB free_disk=73.24386596679688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  2 09:04:02 np0005466012 nova_compute[192063]: 2025-10-02 13:04:02.995 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  2 09:04:02 np0005466012 nova_compute[192063]: 2025-10-02 13:04:02.995 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  2 09:04:03 np0005466012 nova_compute[192063]: 2025-10-02 13:04:03.060 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  2 09:04:03 np0005466012 nova_compute[192063]: 2025-10-02 13:04:03.060 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  2 09:04:03 np0005466012 nova_compute[192063]: 2025-10-02 13:04:03.079 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  2 09:04:03 np0005466012 nova_compute[192063]: 2025-10-02 13:04:03.092 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  2 09:04:03 np0005466012 nova_compute[192063]: 2025-10-02 13:04:03.093 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:04:03 np0005466012 nova_compute[192063]: 2025-10-02 13:04:03.093 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:04:05 np0005466012 nova_compute[192063]: 2025-10-02 13:04:05.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:06 np0005466012 nova_compute[192063]: 2025-10-02 13:04:06.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:09 np0005466012 podman[262071]: 2025-10-02 13:04:09.13505452 +0000 UTC m=+0.054782935 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:04:09 np0005466012 podman[262072]: 2025-10-02 13:04:09.135051029 +0000 UTC m=+0.050725372 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  2 09:04:09 np0005466012 podman[262073]: 2025-10-02 13:04:09.16078955 +0000 UTC m=+0.073916072 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 09:04:09 np0005466012 podman[262074]: 2025-10-02 13:04:09.170797687 +0000 UTC m=+0.079711763 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  2 09:04:10 np0005466012 nova_compute[192063]: 2025-10-02 13:04:10.088 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:10 np0005466012 nova_compute[192063]: 2025-10-02 13:04:10.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:11 np0005466012 nova_compute[192063]: 2025-10-02 13:04:11.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:12 np0005466012 nova_compute[192063]: 2025-10-02 13:04:12.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:13 np0005466012 nova_compute[192063]: 2025-10-02 13:04:13.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:13 np0005466012 nova_compute[192063]: 2025-10-02 13:04:13.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:04:13 np0005466012 nova_compute[192063]: 2025-10-02 13:04:13.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:04:13 np0005466012 nova_compute[192063]: 2025-10-02 13:04:13.841 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:04:13 np0005466012 nova_compute[192063]: 2025-10-02 13:04:13.841 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:15 np0005466012 nova_compute[192063]: 2025-10-02 13:04:15.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:16 np0005466012 nova_compute[192063]: 2025-10-02 13:04:16.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:16 np0005466012 nova_compute[192063]: 2025-10-02 13:04:16.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:16 np0005466012 nova_compute[192063]: 2025-10-02 13:04:16.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:04:20 np0005466012 nova_compute[192063]: 2025-10-02 13:04:20.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:21 np0005466012 nova_compute[192063]: 2025-10-02 13:04:21.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:22 np0005466012 podman[262158]: 2025-10-02 13:04:22.144667492 +0000 UTC m=+0.061204162 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 09:04:22 np0005466012 podman[262159]: 2025-10-02 13:04:22.153772884 +0000 UTC m=+0.070533340 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc.)
Oct  2 09:04:25 np0005466012 nova_compute[192063]: 2025-10-02 13:04:25.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:26 np0005466012 podman[262201]: 2025-10-02 13:04:26.166922623 +0000 UTC m=+0.066909810 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 09:04:26 np0005466012 podman[262200]: 2025-10-02 13:04:26.166991725 +0000 UTC m=+0.074407686 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  2 09:04:26 np0005466012 nova_compute[192063]: 2025-10-02 13:04:26.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:30 np0005466012 nova_compute[192063]: 2025-10-02 13:04:30.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:31 np0005466012 nova_compute[192063]: 2025-10-02 13:04:31.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:35 np0005466012 nova_compute[192063]: 2025-10-02 13:04:35.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:36 np0005466012 nova_compute[192063]: 2025-10-02 13:04:36.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:40 np0005466012 podman[262243]: 2025-10-02 13:04:40.132494015 +0000 UTC m=+0.047295568 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:04:40 np0005466012 podman[262242]: 2025-10-02 13:04:40.135333053 +0000 UTC m=+0.053153600 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, 
config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  2 09:04:40 np0005466012 podman[262244]: 2025-10-02 13:04:40.149421932 +0000 UTC m=+0.057756126 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 09:04:40 np0005466012 nova_compute[192063]: 2025-10-02 13:04:40.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:40 np0005466012 podman[262245]: 2025-10-02 13:04:40.190904489 +0000 UTC m=+0.095505940 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:04:41 np0005466012 nova_compute[192063]: 2025-10-02 13:04:41.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:45 np0005466012 nova_compute[192063]: 2025-10-02 13:04:45.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:46 np0005466012 nova_compute[192063]: 2025-10-02 13:04:46.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:49 np0005466012 nova_compute[192063]: 2025-10-02 13:04:49.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:04:50 np0005466012 nova_compute[192063]: 2025-10-02 13:04:50.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:51 np0005466012 nova_compute[192063]: 2025-10-02 13:04:51.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:53 np0005466012 podman[262325]: 2025-10-02 13:04:53.15488641 +0000 UTC m=+0.069965455 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  2 09:04:53 np0005466012 podman[262326]: 2025-10-02 13:04:53.162080978 +0000 UTC m=+0.063215937 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Oct  2 09:04:55 np0005466012 nova_compute[192063]: 2025-10-02 13:04:55.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:56 np0005466012 nova_compute[192063]: 2025-10-02 13:04:56.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:04:57 np0005466012 podman[262369]: 2025-10-02 13:04:57.132545288 +0000 UTC m=+0.046742433 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 09:04:57 np0005466012 podman[262368]: 2025-10-02 13:04:57.161513758 +0000 UTC m=+0.078117109 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid)
Oct  2 09:05:00 np0005466012 nova_compute[192063]: 2025-10-02 13:05:00.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:01 np0005466012 nova_compute[192063]: 2025-10-02 13:05:01.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:01 np0005466012 nova_compute[192063]: 2025-10-02 13:05:01.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:05:02.187 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:05:02.188 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:05:02.188 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:02 np0005466012 nova_compute[192063]: 2025-10-02 13:05:02.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:04 np0005466012 nova_compute[192063]: 2025-10-02 13:05:04.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:04 np0005466012 nova_compute[192063]: 2025-10-02 13:05:04.967 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:04 np0005466012 nova_compute[192063]: 2025-10-02 13:05:04.967 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:04 np0005466012 nova_compute[192063]: 2025-10-02 13:05:04.967 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:04 np0005466012 nova_compute[192063]: 2025-10-02 13:05:04.968 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:05:05 np0005466012 nova_compute[192063]: 2025-10-02 13:05:05.143 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:05:05 np0005466012 nova_compute[192063]: 2025-10-02 13:05:05.144 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5710MB free_disk=73.2438850402832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:05:05 np0005466012 nova_compute[192063]: 2025-10-02 13:05:05.144 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:05:05 np0005466012 nova_compute[192063]: 2025-10-02 13:05:05.144 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:05:05 np0005466012 nova_compute[192063]: 2025-10-02 13:05:05.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:05 np0005466012 nova_compute[192063]: 2025-10-02 13:05:05.487 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:05:05 np0005466012 nova_compute[192063]: 2025-10-02 13:05:05.487 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:05:05 np0005466012 nova_compute[192063]: 2025-10-02 13:05:05.504 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:05:05 np0005466012 nova_compute[192063]: 2025-10-02 13:05:05.651 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:05:05 np0005466012 nova_compute[192063]: 2025-10-02 13:05:05.653 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:05:05 np0005466012 nova_compute[192063]: 2025-10-02 13:05:05.654 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.509s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:05:06 np0005466012 nova_compute[192063]: 2025-10-02 13:05:06.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:10 np0005466012 nova_compute[192063]: 2025-10-02 13:05:10.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:10 np0005466012 nova_compute[192063]: 2025-10-02 13:05:10.649 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:11 np0005466012 podman[262419]: 2025-10-02 13:05:11.163318031 +0000 UTC m=+0.065282584 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  2 09:05:11 np0005466012 podman[262420]: 2025-10-02 13:05:11.169675387 +0000 UTC m=+0.067100684 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 09:05:11 np0005466012 podman[262418]: 2025-10-02 13:05:11.201092605 +0000 UTC m=+0.103848640 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true)
Oct  2 09:05:11 np0005466012 podman[262421]: 2025-10-02 13:05:11.208600772 +0000 UTC m=+0.102370489 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  2 09:05:11 np0005466012 nova_compute[192063]: 2025-10-02 13:05:11.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:11 np0005466012 nova_compute[192063]: 2025-10-02 13:05:11.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:11 np0005466012 nova_compute[192063]: 2025-10-02 13:05:11.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  2 09:05:13 np0005466012 nova_compute[192063]: 2025-10-02 13:05:13.898 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:13 np0005466012 nova_compute[192063]: 2025-10-02 13:05:13.898 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:14 np0005466012 nova_compute[192063]: 2025-10-02 13:05:14.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:14 np0005466012 nova_compute[192063]: 2025-10-02 13:05:14.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:05:14 np0005466012 nova_compute[192063]: 2025-10-02 13:05:14.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:05:14 np0005466012 nova_compute[192063]: 2025-10-02 13:05:14.850 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:05:15 np0005466012 nova_compute[192063]: 2025-10-02 13:05:15.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:16 np0005466012 nova_compute[192063]: 2025-10-02 13:05:16.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.935 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.935 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.935 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.935 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:05:16.935 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:05:17 np0005466012 nova_compute[192063]: 2025-10-02 13:05:17.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:17 np0005466012 nova_compute[192063]: 2025-10-02 13:05:17.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:05:20 np0005466012 nova_compute[192063]: 2025-10-02 13:05:20.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:21 np0005466012 nova_compute[192063]: 2025-10-02 13:05:21.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:24 np0005466012 podman[262507]: 2025-10-02 13:05:24.138385317 +0000 UTC m=+0.054881907 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  2 09:05:24 np0005466012 podman[262508]: 2025-10-02 13:05:24.140820945 +0000 UTC m=+0.057698086 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Oct  2 09:05:24 np0005466012 nova_compute[192063]: 2025-10-02 13:05:24.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:24 np0005466012 nova_compute[192063]: 2025-10-02 13:05:24.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  2 09:05:24 np0005466012 nova_compute[192063]: 2025-10-02 13:05:24.841 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  2 09:05:25 np0005466012 nova_compute[192063]: 2025-10-02 13:05:25.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:26 np0005466012 nova_compute[192063]: 2025-10-02 13:05:26.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:28 np0005466012 podman[262549]: 2025-10-02 13:05:28.203726415 +0000 UTC m=+0.090091167 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=iscsid)
Oct  2 09:05:28 np0005466012 podman[262550]: 2025-10-02 13:05:28.211113533 +0000 UTC m=+0.088257168 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 09:05:30 np0005466012 nova_compute[192063]: 2025-10-02 13:05:30.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:31 np0005466012 nova_compute[192063]: 2025-10-02 13:05:31.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:34 np0005466012 nova_compute[192063]: 2025-10-02 13:05:34.389 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:34 np0005466012 nova_compute[192063]: 2025-10-02 13:05:34.833 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:35 np0005466012 nova_compute[192063]: 2025-10-02 13:05:35.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:36 np0005466012 nova_compute[192063]: 2025-10-02 13:05:36.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:40 np0005466012 nova_compute[192063]: 2025-10-02 13:05:40.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:41 np0005466012 nova_compute[192063]: 2025-10-02 13:05:41.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:42 np0005466012 podman[262595]: 2025-10-02 13:05:42.147885083 +0000 UTC m=+0.063090183 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  2 09:05:42 np0005466012 podman[262597]: 2025-10-02 13:05:42.154158621 +0000 UTC m=+0.063320089 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 09:05:42 np0005466012 podman[262596]: 2025-10-02 13:05:42.165486365 +0000 UTC m=+0.076548624 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  2 09:05:42 np0005466012 podman[262598]: 2025-10-02 13:05:42.191977465 +0000 UTC m=+0.088304669 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:05:45 np0005466012 nova_compute[192063]: 2025-10-02 13:05:45.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:46 np0005466012 nova_compute[192063]: 2025-10-02 13:05:46.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:49 np0005466012 nova_compute[192063]: 2025-10-02 13:05:49.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:05:50 np0005466012 nova_compute[192063]: 2025-10-02 13:05:50.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:51 np0005466012 nova_compute[192063]: 2025-10-02 13:05:51.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:55 np0005466012 podman[262685]: 2025-10-02 13:05:55.147027185 +0000 UTC m=+0.059921198 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Oct  2 09:05:55 np0005466012 podman[262684]: 2025-10-02 13:05:55.16584934 +0000 UTC m=+0.083493880 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  2 09:05:55 np0005466012 nova_compute[192063]: 2025-10-02 13:05:55.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:56 np0005466012 nova_compute[192063]: 2025-10-02 13:05:56.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:05:59 np0005466012 podman[262727]: 2025-10-02 13:05:59.147465615 +0000 UTC m=+0.054117752 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 09:05:59 np0005466012 podman[262726]: 2025-10-02 13:05:59.152156252 +0000 UTC m=+0.060724680 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 09:06:00 np0005466012 nova_compute[192063]: 2025-10-02 13:06:00.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:01 np0005466012 nova_compute[192063]: 2025-10-02 13:06:01.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:06:02.189 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:06:02.190 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:06:02.190 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:03 np0005466012 nova_compute[192063]: 2025-10-02 13:06:03.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:04 np0005466012 nova_compute[192063]: 2025-10-02 13:06:04.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:04 np0005466012 nova_compute[192063]: 2025-10-02 13:06:04.823 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:04 np0005466012 nova_compute[192063]: 2025-10-02 13:06:04.857 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:04 np0005466012 nova_compute[192063]: 2025-10-02 13:06:04.859 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:04 np0005466012 nova_compute[192063]: 2025-10-02 13:06:04.859 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:04 np0005466012 nova_compute[192063]: 2025-10-02 13:06:04.860 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:06:05 np0005466012 nova_compute[192063]: 2025-10-02 13:06:05.007 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:06:05 np0005466012 nova_compute[192063]: 2025-10-02 13:06:05.008 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5711MB free_disk=73.24388122558594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:06:05 np0005466012 nova_compute[192063]: 2025-10-02 13:06:05.008 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:06:05 np0005466012 nova_compute[192063]: 2025-10-02 13:06:05.008 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:06:05 np0005466012 nova_compute[192063]: 2025-10-02 13:06:05.224 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:06:05 np0005466012 nova_compute[192063]: 2025-10-02 13:06:05.224 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:06:05 np0005466012 nova_compute[192063]: 2025-10-02 13:06:05.244 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:06:05 np0005466012 nova_compute[192063]: 2025-10-02 13:06:05.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:05 np0005466012 nova_compute[192063]: 2025-10-02 13:06:05.277 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:06:05 np0005466012 nova_compute[192063]: 2025-10-02 13:06:05.278 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:06:05 np0005466012 nova_compute[192063]: 2025-10-02 13:06:05.278 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:06:06 np0005466012 nova_compute[192063]: 2025-10-02 13:06:06.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:10 np0005466012 nova_compute[192063]: 2025-10-02 13:06:10.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:11 np0005466012 nova_compute[192063]: 2025-10-02 13:06:11.272 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:11 np0005466012 nova_compute[192063]: 2025-10-02 13:06:11.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:13 np0005466012 podman[262770]: 2025-10-02 13:06:13.147899673 +0000 UTC m=+0.052159030 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  2 09:06:13 np0005466012 podman[262769]: 2025-10-02 13:06:13.164520029 +0000 UTC m=+0.077255373 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:06:13 np0005466012 podman[262771]: 2025-10-02 13:06:13.166871862 +0000 UTC m=+0.072876556 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct  2 09:06:13 np0005466012 podman[262768]: 2025-10-02 13:06:13.170657594 +0000 UTC m=+0.082841464 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:06:13 np0005466012 nova_compute[192063]: 2025-10-02 13:06:13.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:13 np0005466012 nova_compute[192063]: 2025-10-02 13:06:13.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:14 np0005466012 nova_compute[192063]: 2025-10-02 13:06:14.840 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:14 np0005466012 nova_compute[192063]: 2025-10-02 13:06:14.840 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:06:14 np0005466012 nova_compute[192063]: 2025-10-02 13:06:14.840 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:06:14 np0005466012 nova_compute[192063]: 2025-10-02 13:06:14.864 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:06:14 np0005466012 nova_compute[192063]: 2025-10-02 13:06:14.865 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:15 np0005466012 nova_compute[192063]: 2025-10-02 13:06:15.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:16 np0005466012 nova_compute[192063]: 2025-10-02 13:06:16.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:19 np0005466012 nova_compute[192063]: 2025-10-02 13:06:19.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:19 np0005466012 nova_compute[192063]: 2025-10-02 13:06:19.823 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:06:20 np0005466012 nova_compute[192063]: 2025-10-02 13:06:20.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:21 np0005466012 nova_compute[192063]: 2025-10-02 13:06:21.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:25 np0005466012 nova_compute[192063]: 2025-10-02 13:06:25.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:25 np0005466012 podman[262853]: 2025-10-02 13:06:25.390849106 +0000 UTC m=+0.060399831 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd)
Oct  2 09:06:25 np0005466012 podman[262854]: 2025-10-02 13:06:25.393076646 +0000 UTC m=+0.058980233 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41)
Oct  2 09:06:26 np0005466012 nova_compute[192063]: 2025-10-02 13:06:26.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:30 np0005466012 podman[262895]: 2025-10-02 13:06:30.154866636 +0000 UTC m=+0.076344140 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  2 09:06:30 np0005466012 podman[262896]: 2025-10-02 13:06:30.155360499 +0000 UTC m=+0.072973118 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  2 09:06:30 np0005466012 nova_compute[192063]: 2025-10-02 13:06:30.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:31 np0005466012 nova_compute[192063]: 2025-10-02 13:06:31.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:35 np0005466012 nova_compute[192063]: 2025-10-02 13:06:35.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:36 np0005466012 nova_compute[192063]: 2025-10-02 13:06:36.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:40 np0005466012 nova_compute[192063]: 2025-10-02 13:06:40.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:41 np0005466012 nova_compute[192063]: 2025-10-02 13:06:41.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:44 np0005466012 podman[262942]: 2025-10-02 13:06:44.165593318 +0000 UTC m=+0.074199152 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  2 09:06:44 np0005466012 podman[262943]: 2025-10-02 13:06:44.186649912 +0000 UTC m=+0.093338414 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 09:06:44 np0005466012 podman[262941]: 2025-10-02 13:06:44.197679788 +0000 UTC m=+0.110158035 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ceilometer_agent_compute)
Oct  2 09:06:44 np0005466012 podman[262944]: 2025-10-02 13:06:44.210759739 +0000 UTC m=+0.104450502 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  2 09:06:45 np0005466012 nova_compute[192063]: 2025-10-02 13:06:45.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:46 np0005466012 nova_compute[192063]: 2025-10-02 13:06:46.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:50 np0005466012 nova_compute[192063]: 2025-10-02 13:06:50.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:51 np0005466012 nova_compute[192063]: 2025-10-02 13:06:51.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:51 np0005466012 nova_compute[192063]: 2025-10-02 13:06:51.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:06:55 np0005466012 nova_compute[192063]: 2025-10-02 13:06:55.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:06:56 np0005466012 podman[263027]: 2025-10-02 13:06:56.1423139 +0000 UTC m=+0.060245556 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  2 09:06:56 np0005466012 podman[263028]: 2025-10-02 13:06:56.16653253 +0000 UTC m=+0.083231034 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, version=9.6, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  2 09:06:56 np0005466012 nova_compute[192063]: 2025-10-02 13:06:56.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:00 np0005466012 nova_compute[192063]: 2025-10-02 13:07:00.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:01 np0005466012 podman[263068]: 2025-10-02 13:07:01.131755144 +0000 UTC m=+0.047631008 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  2 09:07:01 np0005466012 podman[263069]: 2025-10-02 13:07:01.137363624 +0000 UTC m=+0.050175796 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  2 09:07:01 np0005466012 nova_compute[192063]: 2025-10-02 13:07:01.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:07:02.191 103246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:07:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:07:02.191 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:07:02 np0005466012 ovn_metadata_agent[103241]: 2025-10-02 13:07:02.191 103246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:07:05 np0005466012 nova_compute[192063]: 2025-10-02 13:07:05.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:05 np0005466012 nova_compute[192063]: 2025-10-02 13:07:05.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:05 np0005466012 nova_compute[192063]: 2025-10-02 13:07:05.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:06 np0005466012 nova_compute[192063]: 2025-10-02 13:07:06.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:06 np0005466012 nova_compute[192063]: 2025-10-02 13:07:06.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:06 np0005466012 nova_compute[192063]: 2025-10-02 13:07:06.860 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:07:06 np0005466012 nova_compute[192063]: 2025-10-02 13:07:06.861 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:07:06 np0005466012 nova_compute[192063]: 2025-10-02 13:07:06.861 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:07:06 np0005466012 nova_compute[192063]: 2025-10-02 13:07:06.861 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.010 2 WARNING nova.virt.libvirt.driver [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.011 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5714MB free_disk=73.24388122558594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.011 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.012 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.080 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.080 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.110 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing inventories for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.154 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating ProviderTree inventory for provider ddb6f967-9a8a-4554-9b44-b99536054f9c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.155 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Updating inventory in ProviderTree for provider ddb6f967-9a8a-4554-9b44-b99536054f9c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.178 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing aggregate associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.200 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Refreshing trait associations for resource provider ddb6f967-9a8a-4554-9b44-b99536054f9c, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.221 2 DEBUG nova.compute.provider_tree [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed in ProviderTree for provider: ddb6f967-9a8a-4554-9b44-b99536054f9c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.238 2 DEBUG nova.scheduler.client.report [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Inventory has not changed for provider ddb6f967-9a8a-4554-9b44-b99536054f9c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.240 2 DEBUG nova.compute.resource_tracker [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  2 09:07:07 np0005466012 nova_compute[192063]: 2025-10-02 13:07:07.240 2 DEBUG oslo_concurrency.lockutils [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  2 09:07:10 np0005466012 nova_compute[192063]: 2025-10-02 13:07:10.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:11 np0005466012 nova_compute[192063]: 2025-10-02 13:07:11.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:13 np0005466012 nova_compute[192063]: 2025-10-02 13:07:13.237 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:15 np0005466012 podman[263110]: 2025-10-02 13:07:15.136115448 +0000 UTC m=+0.047965457 container health_status 9e3df53334372e0c8e1eb9a9818b1724584396d7d89f4bede1e9a20131ceffdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  2 09:07:15 np0005466012 podman[263109]: 2025-10-02 13:07:15.162503046 +0000 UTC m=+0.059424594 container health_status 6cc2444446a976e5df1956dec11703ac971a43cc9ba68087c32f334c838485f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm)
Oct  2 09:07:15 np0005466012 podman[263112]: 2025-10-02 13:07:15.175114204 +0000 UTC m=+0.078031174 container health_status fd3f3965e94f50bb2457198d47e0a834caf657c5d332b7b75fb662754e26e502 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  2 09:07:15 np0005466012 podman[263111]: 2025-10-02 13:07:15.181803814 +0000 UTC m=+0.087890579 container health_status cc928c647a650bd0579456841073ceb27c2db81b0a328b9d08069613b6930b36 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  2 09:07:15 np0005466012 nova_compute[192063]: 2025-10-02 13:07:15.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:15 np0005466012 nova_compute[192063]: 2025-10-02 13:07:15.821 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:16 np0005466012 nova_compute[192063]: 2025-10-02 13:07:16.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:16 np0005466012 nova_compute[192063]: 2025-10-02 13:07:16.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:16 np0005466012 nova_compute[192063]: 2025-10-02 13:07:16.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  2 09:07:16 np0005466012 nova_compute[192063]: 2025-10-02 13:07:16.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  2 09:07:16 np0005466012 nova_compute[192063]: 2025-10-02 13:07:16.842 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  2 09:07:16 np0005466012 nova_compute[192063]: 2025-10-02 13:07:16.843 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.932 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:16 np0005466012 ceilometer_agent_compute[202878]: 2025-10-02 13:07:16.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  2 09:07:20 np0005466012 nova_compute[192063]: 2025-10-02 13:07:20.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:20 np0005466012 nova_compute[192063]: 2025-10-02 13:07:20.822 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:20 np0005466012 nova_compute[192063]: 2025-10-02 13:07:20.822 2 DEBUG nova.compute.manager [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  2 09:07:21 np0005466012 nova_compute[192063]: 2025-10-02 13:07:21.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:25 np0005466012 nova_compute[192063]: 2025-10-02 13:07:25.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:26 np0005466012 nova_compute[192063]: 2025-10-02 13:07:26.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:27 np0005466012 podman[263197]: 2025-10-02 13:07:27.160721923 +0000 UTC m=+0.059508887 container health_status 6d378ae9a7d3e1186631a1e7a1d67bb985113d099b718ac9932787ea38a739c4 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  2 09:07:27 np0005466012 podman[263198]: 2025-10-02 13:07:27.1784953 +0000 UTC m=+0.065259281 container health_status a023e414464048861e5c2b8e8242ae418648ef4e70b282405aff864e05602410 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  2 09:07:30 np0005466012 nova_compute[192063]: 2025-10-02 13:07:30.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:31 np0005466012 nova_compute[192063]: 2025-10-02 13:07:31.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:32 np0005466012 podman[263239]: 2025-10-02 13:07:32.155061649 +0000 UTC m=+0.059051245 container health_status a44f182be0a139ba3380e6b9763e2c2fcd443212fce3638abc034b35bed29798 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  2 09:07:32 np0005466012 podman[263238]: 2025-10-02 13:07:32.17485763 +0000 UTC m=+0.075277330 container health_status 600a2ce327e74d5d80d60439677f4a15076851164a6708c290524e3fccfdd7cf (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  2 09:07:34 np0005466012 systemd-logind[827]: New session 49 of user zuul.
Oct  2 09:07:34 np0005466012 systemd[1]: Started Session 49 of User zuul.
Oct  2 09:07:35 np0005466012 nova_compute[192063]: 2025-10-02 13:07:35.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:35 np0005466012 nova_compute[192063]: 2025-10-02 13:07:35.817 2 DEBUG oslo_service.periodic_task [None req-9bc115d1-935d-4031-80f3-f27437b4b93b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  2 09:07:36 np0005466012 nova_compute[192063]: 2025-10-02 13:07:36.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:38 np0005466012 ovs-vsctl[263455]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct  2 09:07:39 np0005466012 virtqemud[191783]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct  2 09:07:39 np0005466012 virtqemud[191783]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct  2 09:07:39 np0005466012 virtqemud[191783]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  2 09:07:40 np0005466012 nova_compute[192063]: 2025-10-02 13:07:40.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:41 np0005466012 nova_compute[192063]: 2025-10-02 13:07:41.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  2 09:07:42 np0005466012 systemd[1]: Starting Hostname Service...
Oct  2 09:07:42 np0005466012 systemd[1]: Started Hostname Service.
